WorldWideScience

Sample records for model selection treatment

  1. Variable selection for propensity score models when estimating treatment effects on multiple outcomes: a simulation study.

    Science.gov (United States)

    Wyss, Richard; Girman, Cynthia J; LoCasale, Robert J; Brookhart, Alan M; Stürmer, Til

    2013-01-01

    It is often preferable to simplify the estimation of treatment effects on multiple outcomes by using a single propensity score (PS) model. Variable selection in PS models impacts the efficiency and validity of treatment effects. However, the impact of different variable selection strategies on the estimated treatment effects in settings involving multiple outcomes is not well understood. The authors use simulations to evaluate the impact of different variable selection strategies on the bias and precision of effect estimates to provide insight into the performance of various PS models in settings with multiple outcomes. Simulated studies consisted of dichotomous treatment, two Poisson outcomes, and eight standard-normal covariates. Covariates were selected for the PS models based on their effects on treatment, a specific outcome, or both outcomes. The PSs were implemented using stratification, matching, and weighting (inverse probability treatment weighting). PS models including only covariates affecting a specific outcome (outcome-specific models) resulted in the most efficient effect estimates. The PS model that only included covariates affecting either outcome (generic-outcome model) performed best among the models that simultaneously controlled measured confounding for both outcomes. Similar patterns were observed over the range of parameter values assessed and all PS implementation methods. A single, generic-outcome model performed well compared with separate outcome-specific models in most scenarios considered. The results emphasize the benefit of using prior knowledge to identify covariates that affect the outcome when constructing PS models and support the potential to use a single, generic-outcome PS model when multiple outcomes are being examined. Copyright © 2012 John Wiley & Sons, Ltd.
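
    As a rough illustration of the simulation design described above (dichotomous treatment, Poisson outcomes, standard-normal covariates, inverse probability of treatment weighting), the sketch below contrasts an outcome-specific propensity score model with a treatment-based one. It is a hypothetical, much-simplified rendering with a single outcome, not the authors' code; all coefficient values and variable names are assumptions.

```python
# Minimal sketch (not the authors' code): compare IPTW estimates of a treatment
# effect on a Poisson outcome when the propensity score (PS) model includes
# (a) only covariates affecting the outcome vs. (b) covariates affecting treatment.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
X = rng.standard_normal((n, 8))           # eight standard-normal covariates
# Assumed structure: covariates 0-3 affect treatment; covariates 2-5 affect the outcome.
ps_true = 1 / (1 + np.exp(-0.4 * X[:, 0:4].sum(axis=1)))
T = rng.binomial(1, ps_true)
Y = rng.poisson(np.exp(0.3 * T + 0.3 * X[:, 2:6].sum(axis=1)))

def iptw_log_rate_ratio(covariate_idx):
    """Fit a PS model on the given covariates and return the IPTW log rate ratio."""
    ps_model = sm.Logit(T, sm.add_constant(X[:, covariate_idx])).fit(disp=0)
    ps = ps_model.predict()
    w = T / ps + (1 - T) / (1 - ps)       # inverse probability of treatment weights
    out = sm.GLM(Y, sm.add_constant(T), family=sm.families.Poisson(),
                 freq_weights=w).fit()
    return out.params[1]                  # log rate ratio for treatment (true value 0.3)

print("outcome-specific PS model:", iptw_log_rate_ratio([2, 3, 4, 5]))
print("treatment-based PS model: ", iptw_log_rate_ratio([0, 1, 2, 3]))
```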

  2. Selective Laser Treatment on Cold-Sprayed Titanium Coatings: Numerical Modeling and Experimental Analysis

    Science.gov (United States)

    Carlone, Pierpaolo; Astarita, Antonello; Rubino, Felice; Pasquino, Nicola; Aprea, Paolo

    2016-12-01

In this paper, a selective laser post-deposition treatment of pure grade II titanium coatings, cold-sprayed on AA2024-T3 sheets, was experimentally and numerically investigated. Morphological features, microstructure, and chemical composition of the treated zone were assessed by means of optical microscopy, scanning electron microscopy, and energy dispersive X-ray spectrometry. Microhardness measurements were also carried out to evaluate the mechanical properties of the coating. A numerical model of the laser treatment was implemented and solved to simulate the process and discuss the experimental outcomes. The obtained results highlighted the key role played by heat input and dimensional features in the effectiveness of the treatment.

  3. Aggressive adolescents in residential care : A selective review of treatment requirements and models

    NARCIS (Netherlands)

    Knorth, Erik J.; Klomp, Martin; Van den Bergh, Peter M.; Noom, Marc J.

    2007-01-01

    This article presents a selective inventory of treatment methods of aggressive behavior. Special attention is paid to types of intervention that, according to research, are frequently used in Dutch residential youth care. These methods are based on (1) principles of (cognitive) behavior management a

  4. Using Peer Modeling and Differential Reinforcement in the Treatment of Food Selectivity

    Science.gov (United States)

    Sira, Bipon K.; Fryling, Mitch J.

    2012-01-01

    Behavior analysts have evaluated a wide range of assessment and treatment strategies in the area of feeding disorders. However, little is known about the effects of interventions employing peer modeling. This study extends upon the existing research on peer modeling and differential reinforcement with a 9-year-old boy diagnosed with autism who…

  6. A Review and Treatment Selection Model for Individuals with Developmental Disabilities Who Engage in Inappropriate Sexual Behavior.

    Science.gov (United States)

    Davis, Tonya N; Machalicek, Wendy; Scalzo, Rachel; Kobylecky, Alicia; Campbell, Vincent; Pinkelman, Sarah; Chan, Jeffrey Michael; Sigafoos, Jeff

    2016-12-01

    Some individuals with developmental disabilities develop inappropriate sexual behaviors such as public masturbation, disrobing, and touching others in an unwanted sexual manner. Such acts are problematic given the taboo nature of the behaviors and the potential for significant negative consequences, such as restricted community access, injury, and legal ramifications. Therefore, it is necessary to equip caregivers and practitioners with effective treatment options. The purpose of this paper is to review studies that have evaluated behavioral treatments to reduce inappropriate sexual behavior in persons with developmental disabilities. The strengths and weaknesses of each treatment are reviewed, and a model for treatment selection is provided.

  7. Selection of resistant Streptococcus pneumoniae during penicillin treatment in vitro and in three animal models

    DEFF Research Database (Denmark)

    Knudsen, Jenny Dahl; Odenholt, Inga; Erlendsdottir, Helga

    2003-01-01

    Pharmacokinetic (PK) and pharmacodynamic (PD) properties for the selection of resistant pneumococci were studied by using three strains of the same serotype (6B) for mixed-culture infection in time-kill experiments in vitro and in three different animal models, the mouse peritonitis, the mouse th...

  8. Stormwater biofilter treatment model (MPiRe) for selected micro-pollutants.

    Science.gov (United States)

    Randelovic, Anja; Zhang, Kefeng; Jacimovic, Nenad; McCarthy, David; Deletic, Ana

    2016-02-01

    Biofiltration systems, also known as bioretentions or rain-gardens, are widely used for treatment of stormwater. In order to design them well, it is important to improve models that can predict their performance. This paper presents a rare model that can simulate removal of a wide range of micro-pollutants from stormwater by biofilters. The model is based on (1) a bucket approach for water flow simulation, and (2) advection/dispersion transport equations for pollutant transport and fate. The latter includes chemical non-equilibrium two-site model of sorption, first-order decay, and volatilization, thus is a compromise between the limited availability of data (on stormwater micro-pollutants) and the required complexity to accurately describe the nature of the phenomenon. The model was calibrated and independently validated on two field data series collected for different organic micro-pollutants at two biofilters of different design. This included data on triazines (atrazine, prometryn, and simazine), glyphosate, and chloroform during six simulated stormwater events. The data included variable and challenging biofilter operational conditions; e.g. variable inflow volumes, dry and wet period dynamics, and inflow pollutant concentrations. The model was able to simulate water flow well, with slight discrepancies being observed only during long dry periods when, presumably, soil cracking occurred. In general, the agreement between simulated and measured pollutographs was good. As with flows, the long dry periods posed a problem for water quality simulation (e.g. simazine and prometryn were difficult to model in low inflow events that followed prolonged dry periods). However, it was encouraging that pollutant transport and fate parameters estimated by the model calibration were in agreement with available literature data. This suggests that the model could probably be adopted for assessment of biofilter performance of other stormwater micro-pollutants (PAHs, phenols
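
    The abstract outlines the model structure: a bucket approach for water flow plus advection/dispersion transport for the pollutants. The sketch below illustrates only the water-flow "bucket" idea with made-up parameter values; it is not the MPiRe code and omits the pollutant transport and sorption modules.

```python
# Minimal sketch (illustrative bucket water balance only, with assumed parameters).
import numpy as np

dt = 60.0                      # time step [s]
area = 2.0                     # filter surface area [m^2]
max_storage = 0.4              # ponding + media storage depth [m]
k_sat = 1e-4                   # saturated infiltration rate [m/s]

t = np.arange(0, 4 * 3600, dt)
inflow = np.where(t < 1800, 0.002, 0.0)        # inflow [m^3/s] for the first 30 min

storage = 0.0
outflow = np.zeros_like(t)
overflow = np.zeros_like(t)
for i, q_in in enumerate(inflow):
    storage += q_in * dt / area                # runoff fills the bucket (depth)
    q_out = min(k_sat, storage / dt)           # infiltration limited by stored water
    storage -= q_out * dt
    if storage > max_storage:                  # excess leaves as surface overflow
        overflow[i] = (storage - max_storage) * area / dt
        storage = max_storage
    outflow[i] = q_out * area

print("total inflow  [m^3]:", inflow.sum() * dt)
print("total treated [m^3]:", outflow.sum() * dt)
print("total bypass  [m^3]:", overflow.sum() * dt)
```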

  9. Model Selection for Geostatistical Models

    Energy Technology Data Exchange (ETDEWEB)

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
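
    To make the idea of AIC-based covariate selection under spatial correlation concrete, the following sketch compares the AIC of candidate covariate sets under an exponential spatial covariance with the AIC obtained when spatial correlation is ignored. It is a hypothetical illustration on simulated data, not the authors' implementation; the correlation range is fixed here for simplicity, whereas in practice it would be estimated by maximum likelihood.

```python
# Minimal sketch: AIC for a Gaussian spatial model with exponential covariance,
# profiled over the regression coefficients and the variance.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
Sigma = np.exp(-d / 2.0)                       # exponential correlation, range = 2
X = rng.standard_normal((n, 3))
y = 1.0 + 2.0 * X[:, 0] + np.linalg.cholesky(Sigma) @ rng.standard_normal(n)

def aic(Xsub, cov):
    """Gaussian log-likelihood (profiled over beta and sigma^2) and AIC."""
    Z = np.column_stack([np.ones(n), Xsub])
    c, low = cho_factor(cov)
    Zt = cho_solve((c, low), Z)                # cov^{-1} Z
    beta = np.linalg.solve(Z.T @ Zt, Z.T @ cho_solve((c, low), y))
    r = y - Z @ beta
    sigma2 = r @ cho_solve((c, low), r) / n
    logdet = 2 * np.sum(np.log(np.diag(c)))
    ll = -0.5 * (n * np.log(2 * np.pi * sigma2) + logdet + n)
    k = Z.shape[1] + 1                         # regression coefficients + sigma^2
    return 2 * k - 2 * ll

for idx in [(0,), (1,), (0, 1), (0, 1, 2)]:
    print(idx, "AIC spatial %.1f   AIC independence %.1f"
          % (aic(X[:, list(idx)], Sigma), aic(X[:, list(idx)], np.eye(n))))
```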

  10. Explaining and Selecting Treatments for Autism: Parental Explanatory Models in Taiwan

    Science.gov (United States)

    Shyu, Yea-Ing Lotus; Tsai, Jia-Ling; Tsai, Wen-Che

    2010-01-01

    Parental explanatory models about autism influence the type of therapy a child receives, the child's well-being, and the parents' own psychological adaptation. This qualitative study explored explanatory models used by parents of children with autism. In-depth interviews were conducted with 13 parents of children with autism from a medical center…

  11. Recruiter Selection Model

    Science.gov (United States)

    2006-05-01

interests include feature selection, statistical learning, multivariate statistics, market research, and classification. He may be contacted at... current youth market, and reducing barriers to Army enlistment. Part of the Army Recruiting Initiatives was the creation of a recruiter selection... Selection Model developed by the Operations Research Center of Excellence, Systems Engineering Department, United States Military Academy, West Point.

  12. Socio-economic determinants in selecting childhood diarrhoea treatment options in Sub-Saharan Africa: A multilevel model

    Directory of Open Access Journals (Sweden)

    Lawoko Stephen

    2011-03-01

Background: Diarrhoeal disease, which has been attributed to poverty, constitutes a major cause of morbidity and mortality in children aged five and below in most low- and middle-income countries. This study sought to examine the contribution of individual and neighbourhood socio-economic characteristics to caregivers' treatment choices for managing childhood diarrhoea at the household level in sub-Saharan Africa. Methods: Multilevel multinomial logistic regression analysis was applied to Demographic and Health Survey data collected in 11 countries in sub-Saharan Africa. The units of analysis were the 12,988 caregivers of children who were reported to have had diarrhoea in the two weeks prior to the survey period. Results: There was variability in selecting treatment options based on several socio-economic characteristics. Multilevel multinomial regression analysis indicated that a higher level of education of both the caregiver and the partner, as well as the caregiver's occupation, was associated with selection of a medical centre, pharmacies, and home care as compared to no treatment. In contrast, the caregiver's partner's occupation was negatively associated with selection of a medical centre and home care for managing diarrhoea. In addition, a low level of neighbourhood socio-economic disadvantage was significantly associated with selection of both medical centres and pharmacy stores and medicine vendors. Conclusion: In the light of the findings from this study, interventions aimed at improving care seeking for managing diarrhoea episodes and other childhood infectious diseases should jointly consider the influence of both individual SEP and the level of economic development of the communities in which the caregivers of these children reside.
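
    The study fits a multilevel multinomial logistic regression of treatment choice on individual and neighbourhood characteristics. As a simple, single-level stand-in (the multilevel random effects would need specialised software), the sketch below fits a multinomial logit of a four-category treatment choice on two hypothetical caregiver covariates using simulated data; all names and coefficients are assumptions.

```python
# Minimal sketch: multinomial logit of treatment choice (single-level stand-in).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 3000
education = rng.integers(0, 3, n)              # 0 none, 1 primary, 2 secondary+
wealth = rng.standard_normal(n)
# Hypothetical choice set: 0 no treatment, 1 home care, 2 pharmacy, 3 medical centre
util = np.column_stack([np.zeros(n),
                        0.3 * education + 0.2 * wealth,
                        0.4 * education + 0.4 * wealth,
                        0.6 * education + 0.6 * wealth])
probs = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
choice = np.array([rng.choice(4, p=p) for p in probs])

X = sm.add_constant(np.column_stack([education, wealth]))
fit = sm.MNLogit(choice, X).fit(disp=0)
print(fit.params)   # one column of coefficients per non-reference choice category
```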

  13. Bayesian Treatment Effects Models with Variable Selection for Panel Outcomes with an Application to Earnings Effects of Maternity Leave

    OpenAIRE

    Jacobi, Liana; Wagner, Helga; Frühwirth-Schnatter, Sylvia

    2014-01-01

    Child birth leads to a break in a woman's employment history and is considered one reason for the relatively poor labor market outcomes observed for women compared to men. However, the time spent at home after child birth varies significantly across mothers and is likely driven by observed and, more importantly, unobserved factors that also affect labor market outcomes directly. In this paper we propose two alternative Bayesian treatment modeling and inferential frameworks for panel outcomes ...

  14. Complexity regularized hydrological model selection

    NARCIS (Netherlands)

    Pande, S.; Arkesteijn, L.; Bastidas, L.A.

    2014-01-01

    This paper uses a recently proposed measure of hydrological model complexity in a model selection exercise. It demonstrates that a robust hydrological model is selected by penalizing model complexity while maximizing a model performance measure. This especially holds when limited data is available.

  16. Individual Influence on Model Selection

    Science.gov (United States)

    Sterba, Sonya K.; Pek, Jolynn

    2012-01-01

    Researchers in psychology are increasingly using model selection strategies to decide among competing models, rather than evaluating the fit of a given model in isolation. However, such interest in model selection outpaces an awareness that one or a few cases can have disproportionate impact on the model ranking. Though case influence on the fit…

  17. Model-Based Dose Selection for Intravaginal Ring Formulations Releasing Anastrozole and Levonorgestrel Intended for the Treatment of Endometriosis Symptoms.

    Science.gov (United States)

    Reinecke, Isabel; Schultze-Mosgau, Marcus-Hillert; Nave, Rüdiger; Schmitz, Heinz; Ploeger, Bart A

    2017-05-01

    Pharmacokinetics (PK) of anastrozole (ATZ) and levonorgestrel (LNG) released from an intravaginal ring (IVR) intended to treat endometriosis symptoms were characterized, and the exposure-response relationship focusing on the development of large ovarian follicle-like structures was investigated by modeling and simulation to support dose selection for further studies. A population PK analysis and simulations were performed for ATZ and LNG based on clinical phase 1 study data from 66 healthy women. A PK/PD model was developed to predict the probability of a maximum follicle size ≥30 mm and the potential contribution of ATZ beside the known LNG effects. Population PK models for ATZ and LNG were established where the interaction of LNG with sex hormone-binding globulin (SHBG) as well as a stimulating effect of estradiol on SHBG were considered. Furthermore, simulations showed that doses of 40 μg/d LNG combined with 300, 600, or 1050 μg/d ATZ reached anticipated exposure levels for both drugs, facilitating selection of ATZ and LNG doses in the phase 2 dose-finding study. The main driver for the effect on maximum follicle size appears to be unbound LNG exposure. A 50% probability of maximum follicle size ≥30 mm was estimated for 40 μg/d LNG based on the exposure-response analysis. ATZ in the dose range investigated does not increase the risk for ovarian cysts as occurs with LNG at a dose that does not inhibit ovulation. © 2016, The American College of Clinical Pharmacology.

  18. Automatic irradiation control by an optical feedback technique for selective retina treatment (SRT) in a rabbit model

    Science.gov (United States)

    Seifert, Eric; Roh, Young-Jung; Fritz, Andreas; Park, Young Gun; Kang, Seungbum; Theisen-Kunde, Dirk; Brinkmann, Ralf

    2013-06-01

Selective Retina Therapy (SRT) targets the retinal pigment epithelium (RPE) without affecting neighboring layers such as the photoreceptors or the choroid. SRT-related RPE defects are ophthalmoscopically invisible. Owing to this invisibility and to the variation of the threshold radiant exposure for RPE damage, the treating physician does not know whether the treatment was successful. Measurement techniques enabling correct dosing are therefore a demanded element of SRT devices. The acquired signal can be used for monitoring or for automatic irradiation control. Existing monitoring techniques are based on the detection of micro-bubbles; these bubbles are the origin of RPE cell damage for pulse durations in the ns and μs time regime (up to 5 μs). The detection can be performed by optical or acoustical approaches. Monitoring based on an acoustical approach has already been used to study the beneficial effects of SRT on diabetic macular edema and central serous retinopathy. We have developed a first real-time feedback technique able to detect micro-bubble-induced characteristics in the backscattered laser light fast enough to cease the laser irradiation within a burst. To this end, the laser energy within a burst of at most 30 pulses is increased linearly with every pulse, and the irradiation is ceased as soon as micro-bubbles are detected. With this automatic approach it was possible to observe invisible lesions, an intact photoreceptor layer, and a reconstruction of the RPE within one week.

  19. Selective hydrolysis of wastewater sludge. Part 1. Model calculations and cost benefit analysis for Esbjerg West waste water treatment plant, Denmark

    Energy Technology Data Exchange (ETDEWEB)

    OEstergaard, N. (Eurotec West A/S (DK)); Thomsen, Anne Belinda; Thygesen, Anders; Bangsoe Nielsen, H. (Risoe National Laboratory, DTU (DK)); Rasmussen, Soeren (SamRas (DK))

    2007-09-15

The project 'Selective hydrolysis of wastewater sludge' investigates the possibilities of utilizing selective hydrolysis of sludge at wastewater treatment plants to increase the production of biogas-based power and heat, and at the same time reduce power consumption for handling and treatment of nitrogen and sludge as well as for disposal of the sludge. The selective hydrolysis system is based on the fact that anaerobic digestion before a hydrolysis treatment increases the hydrolysis efficiency, as volatile organic components, which might inhibit the hydrolysis, are not produced to the same extent as may be the case for hydrolysis of un-digested material. Furthermore, it is possible to separate ammonia from the sludge without using chemicals; it has, however, proven difficult to treat wastewater sludge in the laboratory using simple equipment. Esbjerg Wastewater Treatment Plant West, Denmark, is used as the model plant for calculating the benefits of selective hydrolysis of sludge as if it were established at the existing sludge digester system. The plant is a traditionally built plant based on the activated sludge concept combined with traditional digester technology. The plant treats combined household and industrial wastewater, with a considerable share of the wastewater received from industry. During the project period Esbjerg Treatment Plant West went through considerable process changes, thus the results presented in this report are based on historical plant characteristics and may be viewed as conservative relative to what may actually be obtainable. (BA)

  20. Selective Serotonin Reuptake Inhibitors for Treatment of Selective Mutism

    Directory of Open Access Journals (Sweden)

    Mazlum Çöpür

    2012-03-01

Some authors suggest that selective mutism should be considered a variant of social phobia or a disorder in the obsessive-compulsive spectrum. Recent studies indicate that pharmacological treatments may be effective in the treatment of selective mutism. In this article, four cases treated with citalopram and escitalopram are presented. The results indicate that the drugs were well tolerated, and the level of social and verbal interactions improved significantly. These findings show that citalopram and escitalopram can be considered in the medication of selective mutism; nevertheless, research with larger case series than previous studies is needed to confirm these findings.

  1. Selected System Models

    Science.gov (United States)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

Apart from the general issue of modeling the channel, the PHY, and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them: IEEE 802.11 (as an example of wireless local area networks), IEEE 802.16 (as an example of wireless metropolitan networks), and IEEE 802.15 (as an example of body area networks). Each section on these three systems also discusses, at the end, a set of model implementations that are available today.

  2. Launch vehicle selection model

    Science.gov (United States)

    Montoya, Alex J.

    1990-01-01

Over the next 50 years, humans will be heading for the Moon and Mars to build scientific bases to gain further knowledge about the universe and to develop rewarding space activities. These large-scale projects will last many years and will require large amounts of mass to be delivered to Low Earth Orbit (LEO). It will take a great deal of planning to complete these missions in an efficient manner. The planning of a future Heavy Lift Launch Vehicle (HLLV) will significantly impact the overall multi-year launching cost for the vehicle fleet depending upon when the HLLV will be ready for use. It is desirable to develop a model in which many trade studies can be performed. In one sample multi-year space program analysis, the total launch vehicle cost of implementing the program was reduced from 50 percent to 25 percent. This indicates how critical it is to reduce space logistics costs. A linear programming model has been developed to answer such questions. The model is now in its second phase of development, and this paper will address the capabilities of the model and its intended uses. The main emphasis over the past year was to make the model user friendly and to incorporate additional realistic constraints that are difficult to represent mathematically. We have developed a methodology in which the user has to be knowledgeable about the mission model and the requirements of the payloads. We have found a representation that will cut down the solution space of the problem by inserting some preliminary tests to eliminate some infeasible vehicle solutions. The paper will address the handling of these additional constraints and the methodology for incorporating new costing information utilizing learning curve theory. The paper will review several test cases that will explore the preferred vehicle characteristics and the preferred period of construction, i.e., within the next decade, or in the first decade of the next century. Finally, the paper will explore the interaction

  3. Model Selection Principles in Misspecified Models

    CERN Document Server

    Lv, Jinchi

    2010-01-01

    Model selection is of fundamental importance to high dimensional modeling featured in many contemporary applications. Classical principles of model selection include the Kullback-Leibler divergence principle and the Bayesian principle, which lead to the Akaike information criterion and Bayesian information criterion when models are correctly specified. Yet model misspecification is unavoidable when we have no knowledge of the true model or when we have the correct family of distributions but miss some true predictor. In this paper, we propose a family of semi-Bayesian principles for model selection in misspecified models, which combine the strengths of the two well-known principles. We derive asymptotic expansions of the semi-Bayesian principles in misspecified generalized linear models, which give the new semi-Bayesian information criteria (SIC). A specific form of SIC admits a natural decomposition into the negative maximum quasi-log-likelihood, a penalty on model dimensionality, and a penalty on model miss...

  4. Bayesian Model Selection and Statistical Modeling

    CERN Document Server

    Ando, Tomohiro

    2010-01-01

    Bayesian model selection is a fundamental part of the Bayesian statistical modeling process. The quality of these solutions usually depends on the goodness of the constructed Bayesian model. Realizing how crucial this issue is, many researchers and practitioners have been extensively investigating the Bayesian model selection problem. This book provides comprehensive explanations of the concepts and derivations of the Bayesian approach for model selection and related criteria, including the Bayes factor, the Bayesian information criterion (BIC), the generalized BIC, and the pseudo marginal lik

5. A Heckman Selection-t Model

    KAUST Repository

    Marchenko, Yulia V.

    2012-03-01

Sample selection arises often in practice as a result of the partial observability of the outcome of interest in a study. In the presence of sample selection, the observed data do not represent a random sample from the population, even after controlling for explanatory variables. That is, data are missing not at random. Thus, standard analysis using only complete cases will lead to biased results. Heckman introduced a sample selection model to analyze such data and proposed a full maximum likelihood estimation method under the assumption of normality. The method was criticized in the literature because of its sensitivity to the normality assumption. In practice, data, such as income or expenditure data, often violate the normality assumption because of heavier tails. We first establish a new link between sample selection models and recently studied families of extended skew-elliptical distributions. Then, this allows us to introduce a selection-t (SLt) model, which models the error distribution using a Student's t distribution. We study its properties and investigate the finite-sample performance of the maximum likelihood estimators for this model. We compare the performance of the SLt model to the conventional Heckman selection-normal (SLN) model and apply it to analyze ambulatory expenditures. Unlike the SLN model, our analysis using the SLt model provides statistical evidence for the existence of sample selection bias in these data. We also investigate the performance of the test for sample selection bias based on the SLt model and compare it with the performances of several tests used with the SLN model. Our findings indicate that the latter tests can be misleading in the presence of heavy-tailed data. © 2012 American Statistical Association.
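
    To illustrate the structure of a sample-selection problem, the sketch below runs the classical two-step Heckman selection-normal estimator on simulated data. It is only an illustrative assumption-laden example: the paper's SLt model instead uses Student's t errors and full maximum likelihood, which this sketch does not implement.

```python
# Minimal sketch: two-step Heckman estimator (probit selection + inverse Mills ratio).
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 4000
z = rng.standard_normal(n)                      # instrument entering only the selection equation
x = rng.standard_normal(n)
u, e = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n).T
select = (0.5 + 1.0 * z + u) > 0                # selection equation
y = np.where(select, 1.0 + 2.0 * x + e, np.nan) # outcome observed only if selected

# Step 1: probit for selection, then the inverse Mills ratio.
Zc = sm.add_constant(z)
probit = sm.Probit(select.astype(float), Zc).fit(disp=0)
xb = Zc @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected sample with the inverse Mills ratio as extra regressor.
X2 = sm.add_constant(np.column_stack([x[select], imr[select]]))
ols = sm.OLS(y[select], X2).fit()
print(ols.params)   # intercept, slope on x, coefficient on the selection (IMR) term
```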

  6. Socio-economic determinants in selecting childhood diarrhoea treatment options in Sub-Saharan Africa: A multilevel model

    OpenAIRE

    Lawoko Stephen; Aremu Olatunde; Moradi Tahereh; Dalal Koustuv

    2011-01-01

Background: Diarrhoeal disease, which has been attributed to poverty, constitutes a major cause of morbidity and mortality in children aged five and below in most low- and middle-income countries. This study sought to examine the contribution of individual and neighbourhood socio-economic characteristics to caregivers' treatment choices for managing childhood diarrhoea at the household level in sub-Saharan Africa. Methods: Multilevel multinomial logistic r...

  7. Introduction. Modelling natural action selection.

    Science.gov (United States)

    Prescott, Tony J; Bryson, Joanna J; Seth, Anil K

    2007-09-29

Action selection is the task of resolving conflicts between competing behavioural alternatives. This theme issue is dedicated to advancing our understanding of the behavioural patterns and neural substrates supporting action selection in animals, including humans. The scope of problems investigated includes: (i) whether biological action selection is optimal (and, if so, what is optimized), (ii) the neural substrates for action selection in the vertebrate brain, (iii) the role of perceptual selection in decision-making, and (iv) the interaction of group and individual action selection. A second aim of this issue is to advance methodological practice with respect to modelling natural action selection. A wide variety of computational modelling techniques are therefore employed ranging from formal mathematical approaches through to computational neuroscience, connectionism and agent-based modelling. The research described has broad implications for both natural and artificial sciences. One example, highlighted here, is its application to medical science where models of the neural substrates for action selection are contributing to the understanding of brain disorders such as Parkinson's disease, schizophrenia and attention deficit/hyperactivity disorder.

  8. Systematic Treatment Selection (STS): A Review and Future Directions

    Science.gov (United States)

    Nguyen, Tam T.; Bertoni, Matteo; Charvat, Mylea; Gheytanchi, Anahita; Beutler, Larry E.

    2007-01-01

Systematic Treatment Selection (STS) is a form of technical eclecticism that develops and plans treatments using empirically founded principles of psychotherapy. It is a model that provides systematic guidelines for the utilization of different psychotherapeutic strategies based on patient qualities and problem characteristics. Historically, it…

  9. Wastewater treatment models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2011-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...

  10. Wastewater Treatment Models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2008-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...

  11. Treatment selection for tonsillar squamous cell carcinoma

    Directory of Open Access Journals (Sweden)

    Yao-Yuan Kuo

    2013-04-01

    Conclusion: Both primary surgery and RT/CRT organ preservation are effective treatments for tonsillar SCC. Single modality treatment, either surgery or RT/CRT, can typically be provided for stage I–II diseases. Although RT/CRT organ preservation is used more frequently for stage III–IV tonsillar SCC in recent years, primary surgery combined with adjuvant therapy still achieves equivalent outcomes. Multidisciplinary pretreatment counseling and the facilities and personnel available are therefore important for decision-making. In addition, if RT/CRT organ preservation is selected as the primary treatment, tumor tonsillectomy is not indicated.

  12. Wastewater treatment models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2011-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise....... Efficient and good modeling practice therefore requires the use of a proper set of guidelines, thus grounding the modeling studies on a general and systematic framework. Last but not least, general limitations of WWTP models – more specifically activated sludge models – are introduced since these define...

  13. Wastewater Treatment Models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2008-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise....... Efficient and good modeling practice therefore requires the use of a proper set of guidelines, thus grounding the modeling studies on a general and systematic framework. Last but not least, general limitations of WWTP models – more specifically, activated sludge models – are introduced since these define...

  14. Bayesian Evidence and Model Selection

    CERN Document Server

    Knuth, Kevin H; Malakar, Nabin K; Mubeen, Asim M; Placek, Ben

    2014-01-01

    In this paper we review the concept of the Bayesian evidence and its application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques. Application to several practical examples within the context of signal processing are discussed.
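
    As a compact worked example of the Bayesian evidence (marginal likelihood) and a Bayes factor, the sketch below compares two models of a binomial data set by direct numerical integration. The data values are hypothetical and not taken from the review.

```python
# Minimal sketch: evidence for "fair coin" vs. "unknown bias with uniform prior".
import numpy as np
from scipy.stats import binom
from scipy.integrate import quad

k, n = 62, 100                                   # observed successes / trials (assumed)

# Model 0: fair coin (no free parameter) -> evidence is just the likelihood at p = 0.5.
evidence_m0 = binom.pmf(k, n, 0.5)

# Model 1: unknown bias with a uniform prior on [0, 1] -> integrate the likelihood.
evidence_m1, _ = quad(lambda p: binom.pmf(k, n, p), 0.0, 1.0)   # analytically 1/(n+1)

print("log evidence M0:", np.log(evidence_m0))
print("log evidence M1:", np.log(evidence_m1))
print("Bayes factor M1/M0:", evidence_m1 / evidence_m0)
```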

  15. Endovascular Treatment of Diabetic Foot in a Selected Population of Patients with Below-the-Knee Disease: Is the Angiosome Model Effective?

    Energy Technology Data Exchange (ETDEWEB)

Fossaceca, Rita, E-mail: rfossaceca@hotmail.com; Guzzardi, Giuseppe, E-mail: guz@libero.it; Cerini, Paolo, E-mail: cerini84@hotmail.it ['Maggiore della Carita' Hospital, University of Eastern Piedmont 'Amedeo Avogadro', Department of Diagnostic and Interventional Radiology (Italy)]; Cusaro, Claudio, E-mail: claudio.cusaro@libero.it ['Maggiore della Carita' Hospital, Department of Diabetic Complications (Italy)]; Stecco, Alessandro, E-mail: a.stecco@libero.it; Parziale, Giuseppe, E-mail: giuseppeparziale@gmail.com; Perchinunno, Marco, E-mail: marcoperchinunno@gmail.com; Bonis, Marco De, E-mail: marco_deb@hotmail.it; Carriero, Alessandro, E-mail: profcarriero@virgilio.it ['Maggiore della Carita' Hospital, University of Eastern Piedmont 'Amedeo Avogadro', Department of Diagnostic and Interventional Radiology (Italy)]

    2013-06-15

Purpose. To evaluate the efficacy of percutaneous transluminal angioplasty (PTA) in a selected population of diabetic patients with below-the-knee (BTK) disease and to analyze the reliability of the angiosome model. Methods. We made a retrospective analysis of the results of PTA performed in 201 diabetic patients with BTK-only disease treated at our institute from January 2005 to December 2011. We evaluated the postoperative technical success, and at 1, 6, and 12 months' follow-up, we assessed the rates and values of partial and complete ulcer healing, restenosis, major and minor amputation, limb salvage, and percutaneous oximetry (TcPO2) (Student's t test). We used the angiosome model to compare the clinicolaboratory outcomes of patients treated by direct revascularization (DR) with those of patients treated with the indirect revascularization (IR) technique, by Student's t test and the χ² test. Results. At a mean ± standard deviation follow-up of 17.5 ± 12 months, we observed a mortality rate of 3.5 %, a major amputation rate of 9.4 %, and a limb salvage rate of 87 %, with a statistically significant increase of TcPO2 values at follow-up compared to baseline (p < 0.05). In 34 patients, treatment was performed with the IR technique and in 167 by DR; in both groups, there was a statistically significant increase of TcPO2 values at follow-up compared to baseline (p < 0.05), without statistically significant differences in therapeutic efficacy. Conclusion. PTA of BTK-only disease is a safe and effective option. The DR technique is the first treatment option; we believe, however, that IR is similarly effective, with good results over time.

  16. Model Selection for Pion Photoproduction

    CERN Document Server

    Landay, J; Fernández-Ramírez, C; Hu, B; Molina, R

    2016-01-01

    Partial-wave analysis of meson and photon-induced reactions is needed to enable the comparison of many theoretical approaches to data. In both energy-dependent and independent parametrizations of partial waves, the selection of the model amplitude is crucial. Principles of the $S$-matrix are implemented to different degree in different approaches, but a many times overlooked aspect concerns the selection of undetermined coefficients and functional forms for fitting, leading to a minimal yet sufficient parametrization. We present an analysis of low-energy neutral pion photoproduction using the Least Absolute Shrinkage and Selection Operator (LASSO) in combination with criteria from information theory and $K$-fold cross validation. These methods are not yet widely known in the analysis of excited hadrons but will become relevant in the era of precision spectroscopy. The principle is first illustrated with synthetic data, then, its feasibility for real data is demonstrated by analyzing the latest available measu...

  17. Selection of technologies for municipal wastewater treatment

    Directory of Open Access Journals (Sweden)

    Juan Pablo Rodríguez Miranda

    2015-11-01

Water environmental planning in watersheds should include aspects of decontaminating the receiving water body. Therefore, the selection of municipal wastewater treatment plants in developing countries should consider the typical composition of raw wastewater, pollutant removal efficiency by technology, performance indicators for each technology, environmental aspects of localization, and the spatial localization strategy. This methodology is built on technical, economic, and environmental attributes, as a tool for decision making on future investments in municipal wastewater treatment plants with multidisciplinary elements.

  18. Entropic criterion for model selection

    Science.gov (United States)

    Tseng, Chih-Yuan

    2006-10-01

Model or variable selection is usually achieved by ranking models according to increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as a selection criterion. Yet that raises two questions: why use this criterion, and are there any other criteria? Besides, conventional approaches require a reference prior, which is usually difficult to obtain. Following the logic of inductive inference proposed by Caticha [Relative entropy and inductive inference, in: G. Erickson, Y. Zhai (Eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, AIP Conference Proceedings, vol. 707, 2004 (available from arXiv.org/abs/physics/0311093)], we show relative entropy to be a unique criterion, which requires no prior information and can be applied to different fields. We examine this criterion by considering a physical problem, simple fluids, and the results are promising.
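
    As a generic illustration of using relative entropy as a ranking criterion (not the paper's simple-fluids example), the sketch below ranks candidate parametric models by the Kullback-Leibler distance between an empirical histogram and each fitted density; the data and candidate families are assumptions.

```python
# Minimal sketch: rank fitted distributions by KL(empirical || model).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.gamma(shape=3.0, scale=2.0, size=2000)

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "expon": stats.expon}

# Empirical density on a common grid.
counts, edges = np.histogram(data, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]

for name, dist in candidates.items():
    params = dist.fit(data)                     # maximum likelihood fit
    q = dist.pdf(centers, *params)
    mask = (counts > 0) & (q > 0)
    kl = np.sum(counts[mask] * np.log(counts[mask] / q[mask])) * width
    print(f"{name:8s} KL(empirical || model) = {kl:.4f}")
```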

  19. A Selective Review of Group Selection in High Dimensional Models

    CERN Document Server

    Huang, Jian; Ma, Shuangge

    2012-01-01

    Grouping structures arise naturally in many statistical modeling problems. Several methods have been proposed for variable selection that respect grouping structure in variables. Examples include the group LASSO and several concave group selection methods. In this article, we give a selective review of group selection concerning methodological developments, theoretical properties, and computational algorithms. We pay particular attention to group selection methods involving concave penalties. We address both group selection and bi-level selection methods. We describe several applications of these methods in nonparametric additive models, semiparametric regression, seemingly unrelated regressions, genomic data analysis and genome wide association studies. We also highlight some issues that require further study.

  20. Selected soil thermal conductivity models

    Directory of Open Access Journals (Sweden)

    Rerak Monika

    2017-01-01

The paper presents models of soil thermal conductivity collected from the literature. This is a very important parameter, which allows one to assess how much heat can be transferred from underground power cables through the soil. The models are presented in table form, so that when the properties of the soil are given, it is possible to select the most accurate method of calculating its thermal conductivity. Precise determination of this parameter allows the cable line to be designed in such a way that cable overheating does not occur.

  1. Model selection for pion photoproduction

    Science.gov (United States)

    Landay, J.; Döring, M.; Fernández-Ramírez, C.; Hu, B.; Molina, R.

    2017-01-01

    Partial-wave analysis of meson and photon-induced reactions is needed to enable the comparison of many theoretical approaches to data. In both energy-dependent and independent parametrizations of partial waves, the selection of the model amplitude is crucial. Principles of the S matrix are implemented to a different degree in different approaches; but a many times overlooked aspect concerns the selection of undetermined coefficients and functional forms for fitting, leading to a minimal yet sufficient parametrization. We present an analysis of low-energy neutral pion photoproduction using the least absolute shrinkage and selection operator (LASSO) in combination with criteria from information theory and K -fold cross validation. These methods are not yet widely known in the analysis of excited hadrons but will become relevant in the era of precision spectroscopy. The principle is first illustrated with synthetic data; then, its feasibility for real data is demonstrated by analyzing the latest available measurements of differential cross sections (d σ /d Ω ), photon-beam asymmetries (Σ ), and target asymmetry differential cross sections (d σT/d ≡T d σ /d Ω ) in the low-energy regime.
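
    The selection strategy described here, LASSO combined with information criteria and K-fold cross validation, can be shown on a generic regression problem. The sketch below uses synthetic data rather than photoproduction multipoles; the data-generating coefficients are assumptions.

```python
# Minimal sketch: choose the LASSO penalty by AIC/BIC and by 5-fold cross validation,
# and report how many terms each choice keeps.
import numpy as np
from sklearn.linear_model import LassoCV, LassoLarsIC

rng = np.random.default_rng(4)
n, p = 150, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                    # only three truly active terms
y = X @ beta + 0.5 * rng.standard_normal(n)

for crit in ("aic", "bic"):
    fit = LassoLarsIC(criterion=crit).fit(X, y)
    print(crit.upper(), "selects", int(np.sum(fit.coef_ != 0)), "terms")

cv_fit = LassoCV(cv=5).fit(X, y)               # K-fold cross validation with K = 5
print("5-fold CV selects", int(np.sum(cv_fit.coef_ != 0)),
      "terms at alpha =", round(cv_fit.alpha_, 4))
```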

  2. Selective Maintenance Model Considering Time Uncertainty

    OpenAIRE

    Le Chen; Zhengping Shu; Yuan Li; Xuezhi Lv

    2012-01-01

This study proposes a selective maintenance model for a weapon system during the mission interval. First, it gives relevant definitions and the operational process of the material support system. Then, it introduces current research on selective maintenance modeling. Finally, it establishes a numerical model for selecting corrective and preventive maintenance tasks, considering the time uncertainty brought by the unpredictability of the maintenance procedure, indetermination of downtime for spares and difference of skil...

  3. Bayesian Constrained-Model Selection for Factor Analytic Modeling

    OpenAIRE

    Peeters, Carel F.W.

    2016-01-01

    My dissertation revolves around Bayesian approaches towards constrained statistical inference in the factor analysis (FA) model. Two interconnected types of restricted-model selection are considered. These types have a natural connection to selection problems in the exploratory FA (EFA) and confirmatory FA (CFA) model and are termed Type I and Type II model selection. Type I constrained-model selection is taken to mean the determination of the appropriate dimensionality of a model. This type ...

  4. Model selection bias and Freedman's paradox

    Science.gov (United States)

    Lukacs, P.M.; Burnham, K.P.; Anderson, D.R.

    2010-01-01

In situations where limited knowledge of a system exists and the ratio of data points to variables is small, variable selection methods can often be misleading. Freedman (Am Stat 37:152-155, 1983) demonstrated how common it is to select completely unrelated variables as highly "significant" when the number of data points is similar in magnitude to the number of variables. A new type of model averaging estimator based on model selection with Akaike's AIC is used with linear regression to investigate the problems of likely inclusion of spurious effects and model selection bias, the bias introduced while using the data to select a single seemingly "best" model from a (often large) set of models employing many predictor variables. The new model averaging estimator helps reduce these problems and provides confidence interval coverage at the nominal level while traditional stepwise selection has poor inferential properties. © The Institute of Statistical Mathematics, Tokyo 2009.
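
    Freedman's demonstration is easy to reproduce. The sketch below is an illustrative simulation (not the authors' code): every predictor is pure noise, yet a screen-then-refit procedure still reports several predictors as significant; the screening threshold of 0.25 follows Freedman's setup.

```python
# Minimal sketch of Freedman's paradox with a two-stage regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n, p = 100, 50
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)                     # y is unrelated to every column of X

# Stage 1: keep predictors with p-value < 0.25 in the full regression.
full = sm.OLS(y, sm.add_constant(X)).fit()
keep = np.where(full.pvalues[1:] < 0.25)[0]

# Stage 2: refit using only the screened predictors.
refit = sm.OLS(y, sm.add_constant(X[:, keep])).fit()
n_signif = int(np.sum(refit.pvalues[1:] < 0.05))
print(f"screened {len(keep)} of {p} noise predictors; "
      f"{n_signif} appear significant at the 5% level after refitting")
```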

  5. Selected Logistics Models and Techniques.

    Science.gov (United States)

    1984-09-01

ACCESS PROCEDURE: On-Line System (OLS), UNINET. RCA maintains proprietary control of this model, and the model is available only through a lease arrangement. SPONSOR: ASD/ACCC

  6. MODEL SELECTION FOR SPECTROPOLARIMETRIC INVERSIONS

    Energy Technology Data Exchange (ETDEWEB)

Asensio Ramos, A.; Manso Sainz, R.; Martinez Gonzalez, M. J.; Socas-Navarro, H. [Instituto de Astrofisica de Canarias, E-38205, La Laguna, Tenerife (Spain)]; Viticchie, B. [ESA/ESTEC RSSD, Keplerlaan 1, 2200 AG Noordwijk (Netherlands)]; Orozco Suarez, D., E-mail: aasensio@iac.es [National Astronomical Observatory of Japan, Mitaka, Tokyo 181-8588 (Japan)]

    2012-04-01

    Inferring magnetic and thermodynamic information from spectropolarimetric observations relies on the assumption of a parameterized model atmosphere whose parameters are tuned by comparison with observations. Often, the choice of the underlying atmospheric model is based on subjective reasons. In other cases, complex models are chosen based on objective reasons (for instance, the necessity to explain asymmetries in the Stokes profiles) but it is not clear what degree of complexity is needed. The lack of an objective way of comparing models has, sometimes, led to opposing views of the solar magnetism because the inferred physical scenarios are essentially different. We present the first quantitative model comparison based on the computation of the Bayesian evidence ratios for spectropolarimetric observations. Our results show that there is not a single model appropriate for all profiles simultaneously. Data with moderate signal-to-noise ratios (S/Ns) favor models without gradients along the line of sight. If the observations show clear circular and linear polarization signals above the noise level, models with gradients along the line are preferred. As a general rule, observations with large S/Ns favor more complex models. We demonstrate that the evidence ratios correlate well with simple proxies. Therefore, we propose to calculate these proxies when carrying out standard least-squares inversions to allow for model comparison in the future.

  7. Intensive treatment models and coercion

    DEFF Research Database (Denmark)

    Ohlenschlaeger, Johan; Thorup, Anne; Petersen, Lone;

    2007-01-01

Little evidence exists concerning the optimal treatment for patients with first-episode schizophrenia-spectrum disorders and the effect on traditional outcomes. The aim was to investigate whether optimal treatment models have an effect on the level of use of coercion and on traditional outcomes. Hospital-based Rehabilitation, an intensified inpatient treatment model, Integrated Treatment, an intensified model of Assertive Community Treatment, and standard treatment were compared for patients with first-episode schizophrenia-spectrum disorders. Ninety-four patients with first-episode schizophrenia-spectrum disorders estimated to benefit from long-term hospitalization were included consecutively from the Copenhagen OPUS trial and randomized to the three treatment models. At 1-year follow-up, Hospital-based Rehabilitation and Integrated Treatment had better scores on symptoms in the negative dimension. A higher number of bed-days in Hospital-based Rehabilitation did not influence the effect on the outcomes measured.

  8. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built...

  9. A model selection approach to analysis of variance and covariance.

    Science.gov (United States)

    Alber, Susan A; Weiss, Robert E

    2009-06-15

An alternative to analysis of variance is a model selection approach where every partition of the treatment means into clusters with equal value is treated as a separate model. The null hypothesis that all treatments are equal corresponds to the partition with all means in a single cluster. The alternative hypothesis corresponds to the set of all other partitions of treatment means. A model selection approach can also be used for a treatment by covariate interaction, where the null hypothesis and each alternative correspond to a partition of treatments into clusters with equal covariate effects. We extend the partition-as-model approach to simultaneous inference for both treatment main effect and treatment interaction with a continuous covariate with separate partitions for the intercepts and treatment-specific slopes. The model space is the Cartesian product of the intercept partition and the slope partition, and we develop five joint priors for this model space. In four of these priors the intercept and slope partition are dependent. We advise on setting priors over models, and we use the model to analyze an orthodontic data set that compares the frictional resistance created by orthodontic fixtures. Copyright (c) 2009 John Wiley & Sons, Ltd.
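
    The partition-as-model idea can be illustrated without the paper's Bayesian machinery. The sketch below enumerates every partition of four treatment means, fits each partition as a model with one common mean per cluster, and ranks partitions by BIC as a crude stand-in for the priors developed in the paper; the data are simulated and all values are assumptions.

```python
# Minimal sketch: each set partition of the treatments is a separate model.
import numpy as np

def partitions(items):
    """Yield all set partitions of a list (Bell-number many)."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part

rng = np.random.default_rng(6)
true_means = {"A": 0.0, "B": 0.0, "C": 1.0, "D": 1.0}     # true clustering {A,B},{C,D}
data = {t: m + rng.standard_normal(20) for t, m in true_means.items()}
n_total = sum(len(v) for v in data.values())

results = []
for part in partitions(list(data)):
    rss = 0.0
    for cluster in part:                                  # one common mean per cluster
        pooled = np.concatenate([data[t] for t in cluster])
        rss += np.sum((pooled - pooled.mean()) ** 2)
    k = len(part) + 1                                     # cluster means + variance
    sigma2 = rss / n_total
    loglik = -0.5 * n_total * (np.log(2 * np.pi * sigma2) + 1)
    results.append((k * np.log(n_total) - 2 * loglik, part))

for bic, part in sorted(results, key=lambda r: r[0])[:3]:
    print(round(bic, 1), part)                            # best partitions first
```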

  10. Model selection for amplitude analysis

    CERN Document Server

    Guegan, Baptiste; Stevens, Justin; Williams, Mike

    2015-01-01

    Model complexity in amplitude analyses is often a priori under-constrained since the underlying theory permits a large number of amplitudes to contribute to most physical processes. The use of an overly complex model results in reduced predictive power and worse resolution on unknown parameters of interest. Therefore, it is common to reduce the complexity by removing from consideration some subset of the allowed amplitudes. This paper studies a data-driven method for limiting model complexity through regularization during regression in the context of a multivariate (Dalitz-plot) analysis. The regularization technique applied greatly improves the performance. A method is also proposed for obtaining the significance of a resonance in a multivariate amplitude analysis.

  11. The Ouroboros Model, selected facets.

    Science.gov (United States)

    Thomsen, Knud

    2011-01-01

    The Ouroboros Model features a biologically inspired cognitive architecture. At its core lies a self-referential recursive process with alternating phases of data acquisition and evaluation. Memory entries are organized in schemata. The activation at a time of part of a schema biases the whole structure and, in particular, missing features, thus triggering expectations. An iterative recursive monitor process termed 'consumption analysis' is then checking how well such expectations fit with successive activations. Mismatches between anticipations based on previous experience and actual current data are highlighted and used for controlling the allocation of attention. A measure for the goodness of fit provides feedback as (self-) monitoring signal. The basic algorithm works for goal directed movements and memory search as well as during abstract reasoning. It is sketched how the Ouroboros Model can shed light on characteristics of human behavior including attention, emotions, priming, masking, learning, sleep and consciousness.

  12. Random Effect and Latent Variable Model Selection

    CERN Document Server

    Dunson, David B

    2008-01-01

    Presents various methods for accommodating model uncertainty in random effects and latent variable models. This book focuses on frequentist likelihood ratio and score tests for zero variance components. It also focuses on Bayesian methods for random effects selection in linear mixed effects and generalized linear mixed models

  13. Review and selection of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

Reeves, M.; Baker, N.A.; Duguid, J.O. [INTERA, Inc., Las Vegas, NV (United States)]

    1994-04-04

Since the 1960s, ground-water flow models have been used for analysis of water resources problems. In the 1970s, emphasis began to shift to analysis of waste management problems. This shift in emphasis was largely brought about by site selection activities for geologic repositories for disposal of high-level radioactive wastes. Model development during the 1970s and well into the 1980s focused primarily on saturated ground-water flow because geologic repositories in salt, basalt, granite, shale, and tuff were envisioned to be below the water table. Selection of the unsaturated zone at Yucca Mountain, Nevada, for potential disposal of waste began to shift model development toward unsaturated flow models. Under the US Department of Energy (DOE), the Civilian Radioactive Waste Management System Management and Operating Contractor (CRWMS M&O) has the responsibility to review, evaluate, and document existing computer models; to conduct performance assessments; and to develop performance assessment models, where necessary. This document describes the CRWMS M&O approach to model review and evaluation (Chapter 2), and the requirements for unsaturated flow models which are the bases for selection from among the current models (Chapter 3). Chapter 4 identifies existing models, and their characteristics. Through a detailed examination of characteristics, Chapter 5 presents the selection of models for testing. Chapter 6 discusses the testing and verification of selected models. Chapters 7 and 8 give conclusions and make recommendations, respectively. Chapter 9 records the major references for each of the models reviewed. Appendix A, a collection of technical reviews for each model, contains a more complete list of references. Finally, Appendix B characterizes the problems used for model testing.

  14. Genetic search feature selection for affective modeling

    DEFF Research Database (Denmark)

    Martínez, Héctor P.; Yannakakis, Georgios N.

    2010-01-01

    Automatic feature selection is a critical step towards the generation of successful computational models of affect. This paper presents a genetic search-based feature selection method which is developed as a global-search algorithm for improving the accuracy of the affective models built. The method is tested and compared against sequential forward feature selection and random search in a dataset derived from a game survey experiment which contains bimodal input features (physiological and gameplay) and expressed pairwise preferences of affect. Results suggest that the proposed method...

  15. Pros and cons of targeted selective treatment against digestive-tract strongyles of ruminants

    Directory of Open Access Journals (Sweden)

    Cabaret J.

    2008-09-01

    The increasing prevalence of resistance to anthelmintics among gastrointestinal nematodes and the desire for lower-input agriculture have promoted the idea that targeted selective treatment (treating only the animals in need of such a treatment) could be a sustainable solution for controlling internal parasites of ruminants. The pros are a slower rise in resistance prevalence, lower residues of anthelmintics in meat and milk, and lower cost; the cons are the difficulty and time spent on selecting animals in need of treatment and the possibility of lower production. Using actual experiments and modelling, we show that targeted selective treatment can be used to sustainably control gastrointestinal nematode infections in a flock.

  16. Model selection for Gaussian kernel PCA denoising

    DEFF Research Database (Denmark)

    Jørgensen, Kasper Winther; Hansen, Lars Kai

    2012-01-01

    We propose kernel Parallel Analysis (kPA) for automatic kernel scale and model order selection in Gaussian kernel PCA. Parallel Analysis [1] is based on a permutation test for covariance and has previously been applied for model order selection in linear PCA; here we augment the procedure to also tune the Gaussian kernel scale of radial basis function based kernel PCA. We evaluate kPA for denoising of simulated data and the US Postal data set of handwritten digits. We find that kPA outperforms other heuristics to choose the model order and kernel scale in terms of signal-to-noise ratio (SNR)...
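
    As a minimal illustration of the permutation idea behind Parallel Analysis, the sketch below implements the classical linear variant (retain components whose covariance eigenvalues exceed a permutation-based threshold); the kernel-scale tuning that distinguishes kPA is not shown, and the data, permutation count and 95% threshold are assumptions made for the example.

        import numpy as np

        def parallel_analysis(X, n_perm=200, quantile=95, seed=0):
            """Linear Parallel Analysis: keep the components whose covariance
            eigenvalues exceed the chosen quantile of eigenvalues obtained after
            independently permuting each column of X."""
            rng = np.random.default_rng(seed)
            Xc = X - X.mean(axis=0)
            eig = np.sort(np.linalg.eigvalsh(np.cov(Xc, rowvar=False)))[::-1]
            null = np.empty((n_perm, X.shape[1]))
            for b in range(n_perm):
                Xp = np.column_stack([rng.permutation(Xc[:, j]) for j in range(X.shape[1])])
                null[b] = np.sort(np.linalg.eigvalsh(np.cov(Xp, rowvar=False)))[::-1]
            return int(np.sum(eig > np.percentile(null, quantile, axis=0)))

        # Illustrative use: 3 latent directions buried in 10-dimensional noise.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 10)) + 0.3 * rng.normal(size=(500, 10))
        print(parallel_analysis(X))  # expected to report an order close to 3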

  17. Melody Track Selection Using Discriminative Language Model

    Science.gov (United States)

    Wu, Xiao; Li, Ming; Suo, Hongbin; Yan, Yonghong

    In this letter we focus on the task of selecting the melody track from a polyphonic MIDI file. Based on the intuition that music and language are similar in many aspects, we solve the selection problem by introducing an n-gram language model to learn the melody co-occurrence patterns in a statistical manner and determine the melodic degree of a given MIDI track. Furthermore, we propose the idea of using background model and posterior probability criteria to make modeling more discriminative. In the evaluation, the achieved 81.6% correct rate indicates the feasibility of our approach.

  18. Continuation of Weight Loss Treatment Is Associated with the Number of Self-Selected Treatment Modalities

    Science.gov (United States)

    Martin, Corby K.; Drab-Hudson, Danae L.; York-Crowe, Emily; Mayville, Stephen B.; Yu, Ying; Greenway, Frank L.

    2007-01-01

    Behavior therapy is a cornerstone of weight loss treatment and behaviorists help direct patients' treatment. A novel design was used that allowed participants to choose different treatment modalities during behavioral weight loss treatment. The association between the selection of different treatment modalities and program completion was examined…

  19. Measuring balance and model selection in propensity score methods

    NARCIS (Netherlands)

    Belitser, S.; Martens, Edwin P.; Pestman, Wiebe R.; Groenwold, Rolf H.H.; De Boer, Anthonius; Klungel, Olaf H.

    2011-01-01

    Background: Propensity score (PS) methods focus on balancing confounders between groups to estimate an unbiased treatment or exposure effect. However, there is a lack of attention to actually measuring, reporting and using information on balance, for instance for model selection. Objectives: To de
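
    The abstract's point about measuring balance can be made concrete with a standardized mean difference, a common (not necessarily the authors') balance diagnostic; the simulated covariate and treatment indicator below are assumptions for the sketch.

        import numpy as np

        def standardized_difference(x, treated):
            """Standardized mean difference of covariate x between treated and
            control units; values near zero indicate good balance."""
            t, c = x[treated == 1], x[treated == 0]
            pooled_sd = np.sqrt((t.var(ddof=1) + c.var(ddof=1)) / 2.0)
            return (t.mean() - c.mean()) / pooled_sd

        # Illustrative data: one confounder that is imbalanced before any PS adjustment.
        rng = np.random.default_rng(0)
        treated = rng.integers(0, 2, size=1000)
        x = rng.normal(loc=0.4 * treated, scale=1.0)
        print(round(standardized_difference(x, treated), 3))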

  20. Modeling Hepatitis C treatment policy.

    Energy Technology Data Exchange (ETDEWEB)

    Kuypers, Marshall A.; Lambert, Gregory Joseph; Moore, Thomas W.; Glass, Robert John; Finley, Patrick D.; Ross, David; Chartier, Maggie

    2013-09-01

    Chronic infection with Hepatitis C virus (HCV) results in cirrhosis, liver cancer and death. As the nation's largest provider of care for HCV, the US Veterans Health Administration (VHA) invests extensive resources in the diagnosis and treatment of the disease. This report documents modeling and analysis of HCV treatment dynamics performed for the VHA aimed at improving service delivery efficiency. System dynamics modeling of disease treatment demonstrated the benefits of early detection and the role of comorbidities in disease progression and patient mortality. Preliminary modeling showed that adherence to rigorous treatment protocols is a primary determinant of treatment success. An in-depth meta-analysis revealed correlations between adherence and various psychosocial factors. This initial meta-analysis indicates areas where substantial improvement in patient outcomes can potentially result from VA programs which incorporate these factors into their design.

  1. Expert System Model for Educational Personnel Selection

    Directory of Open Access Journals (Sweden)

    Héctor A. Tabares-Ospina

    2013-06-01

    Staff selection is a difficult task due to the subjectivity involved in the evaluation. This process can be supported by a decision support system. This paper presents the implementation of an expert system to systematize the selection process of professors. The management of the software development is divided into four parts: requirements, design, implementation and commissioning. The proposed system models specific knowledge through relationships between evidence and objective variables.

  2. Intensive treatment models and coercion

    DEFF Research Database (Denmark)

    Ohlenschlaeger, Johan; Thorup, Anne; Petersen, Lone

    2007-01-01

    Little evidence exists concerning the optimal treatment for patients with first-episode schizophrenia-spectrum disorders and the effect on traditional outcomes. The aim was to investigate whether optimal treatment models have an effect on the level of use of coercion and on traditional outcomes...

  3. Bayesian variable selection for latent class models.

    Science.gov (United States)

    Ghosh, Joyee; Herring, Amy H; Siega-Riz, Anna Maria

    2011-09-01

    In this article, we develop a latent class model with class probabilities that depend on subject-specific covariates. One of our major goals is to identify important predictors of latent classes. We consider methodology that allows estimation of latent classes while allowing for variable selection uncertainty. We propose a Bayesian variable selection approach and implement a stochastic search Gibbs sampler for posterior computation to obtain model-averaged estimates of quantities of interest such as marginal inclusion probabilities of predictors. Our methods are illustrated through simulation studies and application to data on weight gain during pregnancy, where it is of interest to identify important predictors of latent weight gain classes.

  4. Treatment of model solutions and wastewater containing selected hazardous metal ions using a chitin/lignin hybrid material as an effective sorbent.

    Science.gov (United States)

    Bartczak, Przemysław; Klapiszewski, Łukasz; Wysokowski, Marcin; Majchrzak, Izabela; Czernicka, Weronika; Piasecki, Adam; Ehrlich, Hermann; Jesionowski, Teofil

    2017-12-15

    A chitin/lignin material with defined physicochemical and morphological properties was used as an effective adsorbent of environmentally toxic metals from model systems. Particularly significant is its use in the neutralization of real industrial wastes. The ions Ni(2+), Cu(2+), Zn(2+) and Pb(2+) were adsorbed on the functional sorbent, confirming the high sorption capacity of the newly obtained product, primarily due to the presence on its surface of numerous active functional groups from the component biopolymers. The kinetics of the process of ion adsorption from model solutions were investigated, and the experimental data were found to fit significantly better to a type 1 pseudo-second-order kinetic model, as confirmed by the high correlation coefficient of 0.999 for adsorption of nickel(II), copper(II), zinc(II) and lead(II) ions. The experimental data obtained on the basis of adsorption isotherms corresponded to the Langmuir model. The sorption capacity of the chitin/lignin material was measured at 70.41 mg(Ni(2+))/g, 75.70 mg(Cu(2+))/g, 82.41 mg(Zn(2+))/g and 91.74 mg(Pb(2+))/g. Analysis of thermodynamic parameters confirmed the endothermic nature of the process. It was also shown that nitric acid is a very effective desorbing (regenerating) agent, enabling the chitin/lignin material to be reused as an effective sorbent of metal ions. The sorption abilities of the chitin/lignin system with respect to particular metal ions can be ordered in the sequence Ni(2+)
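
    For readers unfamiliar with the models named above, the sketch below fits a Langmuir isotherm by non-linear least squares; the equilibrium data are invented for illustration and are not the paper's measurements (the pseudo-second-order kinetic model, qt = k*qe^2*t / (1 + k*qe*t), could be fitted the same way).

        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(Ce, qmax, KL):
            """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
            return qmax * KL * Ce / (1.0 + KL * Ce)

        # Hypothetical equilibrium concentrations (mg/L) and uptakes (mg/g).
        Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
        qe = np.array([22.0, 37.0, 55.0, 68.0, 78.0, 84.0])

        (qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=(90.0, 0.05))
        print(f"qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg")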

  5. MODEL SELECTION FOR LOG-LINEAR MODELS OF CONTINGENCY TABLES

    Institute of Scientific and Technical Information of China (English)

    ZHAO Lincheng; ZHANG Hong

    2003-01-01

    In this paper, we propose an information-theoretic-criterion-based model selection procedure for log-linear models of contingency tables under multinomial sampling, and establish the strong consistency of the method under some mild conditions. An exponential bound on the misdetection probability is also obtained. The selection procedure is modified so that it can be used in practice. Simulation shows that the modified method is valid. To avoid selecting the penalty coefficient in the information criteria, an alternative selection procedure is given.

  6. Adverse selection model regarding tobacco consumption

    Directory of Open Access Journals (Sweden)

    Dumitru MARIN

    2006-01-01

    The impact of introducing a tax on tobacco consumption can be studied through an adverse selection model. The objective of the model presented in the following is to characterize the optimal contractual relationship between the governmental authorities and the two types of employees, smokers and non-smokers, taking into account that the consumers' decision to smoke or not represents an element of risk and uncertainty. Two scenarios are run using the General Algebraic Modeling System software: one without taxes set on tobacco consumption and another one with taxes set on tobacco consumption, based on the adverse selection model described previously. The results of the two scenarios are compared at the end of the paper: the wage earnings levels and the social welfare in the case of a smoking agent and in the case of a non-smoking agent.

  7. A guide to Bayesian model selection for ecologists

    Science.gov (United States)

    Hooten, Mevin B.; Hobbs, N.T.

    2015-01-01

    The steady upward trend in the use of model selection and Bayesian methods in ecological research has made it clear that both approaches to inference are important for modern analysis of models and data. However, in teaching Bayesian methods and in working with our research colleagues, we have noticed a general dissatisfaction with the available literature on Bayesian model selection and multimodel inference. Students and researchers new to Bayesian methods quickly find that the published advice on model selection is often preferential in its treatment of options for analysis, frequently advocating one particular method above others. The recent appearance of many articles and textbooks on Bayesian modeling has provided welcome background on relevant approaches to model selection in the Bayesian framework, but most of these are either very narrowly focused in scope or inaccessible to ecologists. Moreover, the methodological details of Bayesian model selection approaches are spread thinly throughout the literature, appearing in journals from many different fields. Our aim with this guide is to condense the large body of literature on Bayesian approaches to model selection and multimodel inference and present it specifically for quantitative ecologists as neutrally as possible. We also bring to light a few important and fundamental concepts relating directly to model selection that seem to have gone unnoticed in the ecological literature. Throughout, we provide only a minimal discussion of philosophy, preferring instead to examine the breadth of approaches as well as their practical advantages and disadvantages. This guide serves as a reference for ecologists using Bayesian methods, so that they can better understand their options and can make an informed choice that is best aligned with their goals for inference.

  8. Adaptive Covariance Estimation with model selection

    CERN Document Server

    Biscay, Rolando; Loubes, Jean-Michel

    2012-01-01

    We provide in this paper a fully adaptive penalized procedure to select a covariance among a collection of models, observing i.i.d. replications of the process at fixed observation points. For this we generalize previous results of Bigot et al. and propose to use a data-driven penalty to obtain an oracle inequality for the estimator. We prove that this method is an extension to the matricial regression model of the work by Baraud.

  9. A Theoretical Model for Selective Exposure Research.

    Science.gov (United States)

    Roloff, Michael E.; Noland, Mark

    This study tests the basic assumptions underlying Fishbein's Model of Attitudes by correlating an individual's selective exposure to types of television programs (situation comedies, family drama, and action/adventure) with the attitudinal similarity between individual attitudes and attitudes characterized on the programs. Twenty-three college…

  10. An expert system for selecting wart treatment method.

    Science.gov (United States)

    Khozeimeh, Fahime; Alizadehsani, Roohallah; Roshanzamir, Mohamad; Khosravi, Abbas; Layegh, Pouran; Nahavandi, Saeid

    2017-02-01

    Warts are benign tumors caused by the human papillomavirus (HPV) and may grow on all parts of the body, especially the hands and feet. There are several treatment methods for this illness; however, none of them can heal all patients. Consequently, physicians are looking for more effective and customized treatments for each patient. They are endeavoring to discover which treatments have better impacts on a particular patient. The aim of this study is to identify the appropriate treatment for two common types of warts (plantar and common) and to predict the responses to two of the best methods (immunotherapy and cryotherapy). As an original work, the study was conducted on 180 patients, with plantar and common warts, who had been referred to the dermatology clinic of Ghaem Hospital, Mashhad, Iran. In this study, 90 patients were treated by the cryotherapy method with liquid nitrogen and 90 patients with the immunotherapy method. The selection of the treatment method was made randomly. A fuzzy logic rule-based system was proposed and implemented to predict the responses to the treatment method. It was observed that the prediction accuracy of the immunotherapy and cryotherapy methods was 83.33% and 80.7%, respectively. According to the results obtained, the benefits of this expert system are multifold: assisting physicians in selecting the best treatment method, saving time for patients, reducing the treatment cost, and improving the quality of treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Decision making software for effective selection of treatment train alternative for wastewater using analytical hierarchy process.

    Science.gov (United States)

    Prasad, A D; Tembhurkar, A R

    2013-10-01

    Proper selection of a treatment process and synthesis of a treatment train is a complex engineering activity that requires crucial decision making during the planning and design of any Wastewater Treatment Plant (WWTP). Earlier studies on process selection mainly considered cost as the most important selection criterion, and a number of studies focused on cost optimization models using dynamic programming, geometric programming and nonlinear programming. However, it has been noticed that traditional cost analysis alone cannot be applied to evaluate Treatment Train (TT) alternatives, as a number of important non-tangible factors cannot be easily expressed in monetary units. Recent research focuses on the use of multi-criteria techniques for the selection of treatment processes. AHP provides a powerful tool for multi-hierarchy and multi-variable systems, overcoming the limitations of traditional techniques. The AHP model used in this study was designed to facilitate proper decision making and reduce the margin of error during optimization due to the number of parameters in the hierarchy levels. About 14 important factors and 13 sub-factors were identified for the selection of treatment alternatives for the wastewater and sludge streams, with cost remaining one of the most important selection criteria. The present paper provides details of developing a software tool called "ProSelArt" using an AHP model to aid proper decision making.
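
    A compact sketch of the eigenvector-based weighting step of the AHP that such a tool builds on (this is the standard textbook procedure, not the ProSelArt implementation); the 3x3 comparison matrix of hypothetical treatment-train criteria is an assumption for the example.

        import numpy as np

        def ahp_weights(A):
            """Priority vector (principal eigenvector) and consistency ratio for a
            pairwise comparison matrix A, following the classical AHP."""
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            ci = (eigvals[k].real - n) / (n - 1)           # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # Saaty's random index
            return w, ci / ri                              # weights, consistency ratio

        # Hypothetical criteria: cost vs. treatment performance vs. land requirement.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        w, cr = ahp_weights(A)
        print(np.round(w, 3), round(cr, 3))  # a CR below about 0.1 is usually acceptable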

  12. Testing exclusion restrictions and additive separability in sample selection models

    DEFF Research Database (Denmark)

    Huber, Martin; Mellace, Giovanni

    2014-01-01

    Standard sample selection models with non-randomly censored outcomes assume (i) an exclusion restriction (i.e., a variable affecting selection, but not the outcome) and (ii) additive separability of the errors in the selection process. This paper proposes tests for the joint satisfaction of these assumptions by applying the approach of Huber and Mellace (Testing instrument validity for LATE identification based on inequality moment constraints, 2011) (for testing instrument validity under treatment endogeneity) to the sample selection framework. We show that the exclusion restriction and additive separability imply two testable inequality constraints that come from both point identifying and bounding the outcome distribution of the subpopulation that is always selected/observed. We apply the tests to two variables for which the exclusion restriction is frequently invoked in female wage regressions: non

  13. Psychotherapy treatment decisions supported by SelectCare

    NARCIS (Netherlands)

    Witteman, C.L.M.

    2008-01-01

    SelectCare is a computerized decision support system for psychotherapists who decide how to treat their depressed patients. This paper describes the decision making model that is implemented in SelectCare and the decision elements it uses to give advice to its users. The system itself is then present

  14. Psychotherapy treatment decisions supported by SelectCare

    NARCIS (Netherlands)

    Witteman, C.L.M.

    SelectCare is a computerized decision support system for psychotherapists who decide how to treat their depressed patients. This paper describes the decision making model that is implemented in SelectCare and the decision elements it uses to give advice to its users. The system itself is then

  15. Model selection for radiochromic film dosimetry

    CERN Document Server

    Méndez, Ignasi

    2015-01-01

    The purpose of this study was to find the most accurate model for radiochromic film dosimetry by comparing different channel independent perturbation models. A model selection approach based on (algorithmic) information theory was followed, and the results were validated using gamma-index analysis on a set of benchmark test cases. Several questions were addressed: (a) whether incorporating the information of the non-irradiated film, by scanning prior to irradiation, improves the results; (b) whether lateral corrections are necessary when using multichannel models; (c) whether multichannel dosimetry produces better results than single-channel dosimetry; (d) which multichannel perturbation model provides more accurate film doses. It was found that scanning prior to irradiation and applying lateral corrections improved the accuracy of the results. For some perturbation models, increasing the number of color channels did not result in more accurate film doses. Employing Truncated Normal perturbations was found to...

  16. Portfolio Selection Model with Derivative Securities

    Institute of Scientific and Technical Information of China (English)

    王春峰; 杨建林; 蒋祥林

    2003-01-01

    Traditional portfolio theory assumes that the return rate of a portfolio follows a normal distribution. However, this assumption does not hold when derivative assets are incorporated. In this paper a portfolio selection model is developed based on a utility function which can capture asymmetries in random variable distributions. Other realistic conditions are also considered, such as liabilities and integer decision variables. Since the resulting model is a complex mixed-integer nonlinear programming problem, a simulated annealing algorithm is applied for its solution. A numerical example is given and sensitivity analysis is conducted for the model.

  17. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and it allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension called Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide area of applications since the late 1990s. The novel feature in our algorithm is the fact that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval for the GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.

  18. On Model Selection Criteria in Multimodel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Ming; Meyer, Philip D.; Neuman, Shlomo P.

    2008-03-21

    Hydrologic systems are open and complex, rendering them prone to multiple conceptualizations and mathematical descriptions. There has been a growing tendency to postulate several alternative hydrologic models for a site and use model selection criteria to (a) rank these models, (b) eliminate some of them and/or (c) weigh and average predictions and statistics generated by multiple models. This has led to some debate among hydrogeologists about the merits and demerits of common model selection (also known as model discrimination or information) criteria such as AIC [Akaike, 1974], AICc [Hurvich and Tsai, 1989], BIC [Schwartz, 1978] and KIC [Kashyap, 1982] and some lack of clarity about the proper interpretation and mathematical representation of each criterion. In particular, whereas we [Neuman, 2003; Ye et al., 2004, 2005; Meyer et al., 2007] have based our approach to multimodel hydrologic ranking and inference on the Bayesian criterion KIC (which reduces asymptotically to BIC), Poeter and Anderson [2005] and Poeter and Hill [2007] have voiced a preference for the information-theoretic criterion AICc (which reduces asymptotically to AIC). Their preference stems in part from a perception that KIC and BIC require a "true" or "quasi-true" model to be in the set of alternatives while AIC and AICc are free of such an unreasonable requirement. We examine the model selection literature to find that (a) all published rigorous derivations of AIC and AICc require that the (true) model having generated the observational data be in the set of candidate models; (b) though BIC and KIC were originally derived by assuming that such a model is in the set, BIC has been rederived by Cavanaugh and Neath [1999] without the need for such an assumption; (c) KIC reduces to BIC as the number of observations becomes large relative to the number of adjustable model parameters, implying that it likewise does not require the existence of a true model in the set of alternatives; (d) if a true
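
    For reference, the sketch below evaluates three of the criteria discussed (AIC, AICc and BIC) from a model's maximized log-likelihood; KIC is omitted because it additionally requires the Fisher information and the prior, and the numbers shown are purely illustrative.

        import numpy as np

        def information_criteria(loglik, k, n):
            """AIC, AICc and BIC for a model with maximized log-likelihood `loglik`,
            k adjustable parameters and n observations."""
            aic = -2.0 * loglik + 2.0 * k
            aicc = aic + 2.0 * k * (k + 1) / (n - k - 1)
            bic = -2.0 * loglik + k * np.log(n)
            return aic, aicc, bic

        # Illustrative values only: two candidate models fitted to the same n = 60 observations.
        for name, ll, k in [("model A", -120.5, 4), ("model B", -118.9, 7)]:
            print(name, np.round(information_criteria(ll, k, 60), 1))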

  19. An integrated in silico 3D model-driven discovery of a novel, potent, and selective amidosulfonamide 5-HT1A agonist (PRX-00023) for the treatment of anxiety and depression.

    Science.gov (United States)

    Becker, Oren M; Dhanoa, Dale S; Marantz, Yael; Chen, Dongli; Shacham, Sharon; Cheruku, Srinivasa; Heifetz, Alexander; Mohanty, Pradyumna; Fichman, Merav; Sharadendu, Anurag; Nudelman, Raphael; Kauffman, Michael; Noiman, Silvia

    2006-06-01

    We report the discovery of a novel, potent, and selective amidosulfonamide nonazapirone 5-HT1A agonist for the treatment of anxiety and depression, which is now in Phase III clinical trials for generalized anxiety disorder (GAD). The discovery of 20m (PRX-00023), N-{3-[4-(4-cyclohexylmethanesulfonylaminobutyl)piperazin-1-yl]phenyl}acetamide, and its backup compounds, followed a new paradigm, driving the entire discovery process with in silico methods and seamlessly integrating computational chemistry with medicinal chemistry, which led to a very rapid discovery timeline. The program reached clinical trials within less than 2 years from initiation, spending less than 6 months in lead optimization with only 31 compounds synthesized. In this paper we detail the entire discovery process, which started with modeling the 3D structure of 5-HT1A using the PREDICT methodology, and then performing in silico screening on that structure leading to the discovery of a 1 nM lead compound (8). The lead compound was optimized following a strategy devised based on in silico 3D models and realized through an in silico-driven optimization process, rapidly overcoming selectivity issues (affinity to 5-HT1A vs alpha1-adrenergic receptor) and potential cardiovascular issues (hERG binding), leading to a clinical compound. Finally we report key in vivo preclinical and Phase I clinical data for 20m tolerability, pharmacokinetics, and pharmacodynamics and show that these favorable results are a direct outcome of the properties that were ascribed to the compound during the rational structure-based discovery process. We believe that this is one of the first examples for a Phase III drug candidate that was discovered and optimized, from start to finish, using in silico model-based methods as the primary tool.

  20. A Neurodynamical Model for Selective Visual Attention

    Institute of Scientific and Technical Information of China (English)

    QU Jing-Yi; WANG Ru-Bin; ZHANG Yuan; DU Ying

    2011-01-01

    A neurodynamical model for selective visual attention considering orientation preference is proposed. Since orientation preference is one of the most important properties of neurons in the primary visual cortex, it should be fully considered besides external stimuli intensity. By tuning the parameter of orientation preference, the regimes of synchronous dynamics associated with the development of the attention focus are studied. The attention focus is represented by those peripheral neurons that generate spikes synchronously with the central neuron while the activity of other peripheral neurons is suppressed. Such dynamics correspond to the partial synchronization mode. Simulation results show that the model can sequentially select objects with different orientation preferences and has a reliable shift of attention from one object to another, which are consistent with the experimental results that neurons with different orientation preferences are laid out in pinwheel patterns.

  1. Post-model selection inference and model averaging

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2011-07-01

    Although model selection is routinely used in practice nowadays, little is known about its precise effects on any subsequent inference that is carried out. The same goes for the effects induced by the closely related technique of model averaging. This paper is concerned with the use of the same data first to select a model and then to carry out inference, in particular point estimation and point prediction. The properties of the resulting estimator, called a post-model-selection estimator (PMSE), are hard to derive. Using selection criteria such as hypothesis testing, AIC, BIC, HQ and Cp, we illustrate that, in terms of risk function, no single PMSE dominates the others. The same conclusion holds more generally for any penalised likelihood information criterion. We also compare various model averaging schemes and show that no single one dominates the others in terms of risk function. Since PMSEs can be regarded as a special case of model averaging, with 0-1 random weights, we propose a connection between the two theories, in the frequentist approach, by taking account of the selection procedure when performing model averaging. We illustrate the point by simulating a simple linear regression model.
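
    One concrete model-averaging scheme of the kind compared in the abstract is smooth AIC ("Akaike") weighting, sketched below under the assumption that AIC is the chosen criterion; a post-model-selection estimator corresponds to replacing these weights by a 0-1 vector that puts all weight on the best-scoring model.

        import numpy as np

        def akaike_weights(aic_values):
            """Convert AIC values of candidate models into model-averaging weights."""
            aic = np.asarray(aic_values, dtype=float)
            delta = aic - aic.min()
            w = np.exp(-0.5 * delta)
            return w / w.sum()

        # Three hypothetical candidate models.
        print(np.round(akaike_weights([210.3, 212.1, 215.9]), 3))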

  2. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, Scott; Hansen, Lars Kai

    2006-01-01

    The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d.) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: 'Are we actually dealing with a convolutive mixture?'. We try to answer this question for EEG data.

  3. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, S.; Hansen, Lars Kai

    2006-01-01

    The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d.) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: 'Are we actually dealing with a convolutive mixture?'. We try to answer this question for EEG data.

  4. Skewed factor models using selection mechanisms

    KAUST Repository

    Kim, Hyoung-Moon

    2015-12-21

    Traditional factor models explicitly or implicitly assume that the factors follow a multivariate normal distribution; that is, only moments up to order two are involved. However, it may happen in real data problems that the first two moments cannot explain the factors. Based on this motivation, here we devise three new skewed factor models, the skew-normal, the skew-t, and the generalized skew-normal factor models, depending on a selection mechanism on the factors. The ECME algorithms are adopted to estimate related parameters for statistical inference. Monte Carlo simulations validate our new models and we demonstrate the need for skewed factor models using the classic open/closed book exam scores dataset.

  5. Behavioral optimization models for multicriteria portfolio selection

    Directory of Open Access Journals (Sweden)

    Mehlawat Mukesh Kumar

    2013-01-01

    In this paper, the behavioral construct of suitability is used to develop a multicriteria decision making framework for portfolio selection. To achieve this purpose, we rely on multiple methodologies. The analytical hierarchy process technique is used to model the suitability considerations with a view to obtaining a suitability performance score in respect of each asset. A fuzzy multiple criteria decision making method is used to obtain the financial quality score of each asset based upon the investor's ratings on the financial criteria. Two optimization models are developed for optimal asset allocation considering financial and suitability criteria simultaneously. An empirical study is conducted on randomly selected assets from the National Stock Exchange, Mumbai, India to demonstrate the effectiveness of the proposed methodology.

  6. Multi-dimensional model order selection

    Directory of Open Access Journals (Sweden)

    Roemer Florian

    2011-01-01

    Multi-dimensional model order selection (MOS) techniques achieve an improved accuracy, reliability, and robustness, since they consider all dimensions jointly during the estimation of parameters. Additionally, from fundamental identifiability results of multi-dimensional decompositions, it is known that the number of main components can be larger when compared to matrix-based decompositions. In this article, we show how to use tensor calculus to extend matrix-based MOS schemes and we also present our proposed multi-dimensional model order selection scheme based on the closed-form PARAFAC algorithm, which is only applicable to multi-dimensional data. In general, as shown by means of simulations, the Probability of correct Detection (PoD) of our proposed multi-dimensional MOS schemes is much better than the PoD of matrix-based schemes.

  7. Model selection and comparison for independent sinusoids

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2014-01-01

    In the signal processing literature, many methods have been proposed for estimating the number of sinusoidal basis functions from a noisy data set. The most popular method is the asymptotic MAP criterion, which is sometimes also referred to as the BIC. In this paper, we extend and improve this method by considering the problem in a full Bayesian framework instead of the approximate formulation, on which the asymptotic MAP criterion is based. This leads to a new model selection and comparison method, the lp-BIC, whose computational complexity is of the same order as the asymptotic MAP criterion. Through simulations, we demonstrate that the lp-BIC outperforms the asymptotic MAP criterion and other state of the art methods in terms of model selection, de-noising and prediction performance. The simulation code is available online.

  8. Tracking Models for Optioned Portfolio Selection

    Science.gov (United States)

    Liang, Jianfeng

    In this paper we study a target tracking problem for the portfolio selection involving options. In particular, the portfolio in question contains a stock index and some European style options on the index. A refined tracking-error-variance methodology is adopted to formulate this problem as a multi-stage optimization model. We derive the optimal solutions based on stochastic programming and optimality conditions. Attention is paid to the structure of the optimal payoff function, which is shown to possess rich properties.

  9. New insights in portfolio selection modeling

    OpenAIRE

    Zareei, Abalfazl

    2016-01-01

    Recent advancements in the field of network theory have opened a new line of development in portfolio selection techniques that rests on perceiving the financial market as a network, with assets as nodes and links accounting for various types of relationships among financial assets. In the first chapter, we model the shock propagation mechanism among assets via network theory and provide an approach to construct well-diversified portfolios that are resilient to shock propagation and c...

  10. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
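
    To fix ideas, the sketch below runs the classical (non-robust) Heckman two-step estimator whose sensitivity the paper analyses, on simulated data with a valid exclusion restriction; variable names, coefficients and the use of statsmodels are assumptions made for the example, and the robustified procedure itself is not shown.

        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n = 2000
        x = rng.normal(size=n)                               # outcome covariate
        z = rng.normal(size=n)                               # instrument (exclusion restriction)
        u = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
        selected = (0.5 + 0.8 * x + 1.0 * z + u[:, 0]) > 0   # selection equation
        y = 1.0 + 2.0 * x + u[:, 1]                          # outcome, observed only if selected

        # Step 1: probit for selection, then the inverse Mills ratio.
        W = sm.add_constant(np.column_stack([x, z]))
        probit = sm.Probit(selected.astype(int), W).fit(disp=0)
        xb = W @ probit.params
        imr = norm.pdf(xb) / norm.cdf(xb)

        # Step 2: OLS on the selected sample, augmented with the inverse Mills ratio.
        X2 = sm.add_constant(np.column_stack([x[selected], imr[selected]]))
        ols = sm.OLS(y[selected], X2).fit()
        print(np.round(ols.params, 3))  # intercept, slope on x, coefficient on the IMR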

  11. Bayesian model selection in Gaussian regression

    CERN Document Server

    Abramovich, Felix

    2009-01-01

    We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors might be much larger than the number of observations. From a frequentist view, the proposed procedure results in the penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting estimator. We establish the oracle inequality and specify conditions on the prior that imply its asymptotic minimaxity within a wide range of sparse and dense settings for "nearly-orthogonal" and "multicollinear" designs.

  12. Model Selection in Data Analysis Competitions

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Winther, Ole

    2014-01-01

    The use of data analysis competitions for selecting the most appropriate model for a problem is a recent innovation in the field of predictive machine learning. Two of the most well-known examples of this trend were the Netflix Competition and, more recently, the competitions hosted on the online platform Kaggle. In this paper, we will state and try to verify a set of qualitative hypotheses about predictive modelling, both in general and in the scope of data analysis competitions. To verify our hypotheses we will look at previous competitions and their outcomes, use qualitative interviews with top

  13. Selective embolization in the treatment of intractable epistaxis

    DEFF Research Database (Denmark)

    Andersen, Pia Juul; Kjeldsen, Anette Drøhse; Nepper-Rasmussen, Jørgen

    2005-01-01

    CONCLUSIONS: In skilled hands, selective embolization is a safe procedure and represents an effective treatment for prolonged epistaxis. Embolization therapy can be repeated if necessary. OBJECTIVE: Severe posterior epistaxis is a common clinical problem in an ENT department and controlling the b...

  14. Parental assessment and treatment of food selectivity in natural settings.

    Science.gov (United States)

    Najdowski, Adel C; Wallace, Michele D; Doney, Janice K; Ghezzi, Patrick M

    2003-01-01

    This study evaluated the effects of a parent-conducted functional analysis and treatment consisting of differential reinforcement of an alternative behavior, escape extinction, and demand fading on food selectivity in a young child with autism. Increases in food acceptance at home and in a restaurant were obtained.

  15. Assessment and Treatment of Selective Mutism with English Language Learners

    Science.gov (United States)

    Mayworm, Ashley M.; Dowdy, Erin; Knights, Kezia; Rebelez, Jennica

    2015-01-01

    Selective mutism (SM) is a type of anxiety disorder that involves the persistent failure to speak in contexts where speech is typically expected (e.g., school), despite speaking in other contexts (e.g., home). Research on the etiology and treatment of SM is limited, as it is a rare disorder and few clinical trials evaluating SM interventions have…

  16. [Selective serotonin reuptake inhibitors (SSRI) in the treatment of paraphilia].

    Science.gov (United States)

    Kraus, C; Strohm, K; Hill, A; Habermann, N; Berner, W; Briken, P

    2007-06-01

    For about 15 years, selective serotonin reuptake inhibitors (SSRIs) have been used in the treatment of paraphilias. In an open, uncontrolled, retrospective study, the first of its kind in the German-speaking countries, we investigated 16 male outpatients who had been treated for different paraphilias with SSRIs and psychotherapy. There was a marked reduction in paraphilic symptoms. Despite high rates of sexual side effects, most patients reported a high overall treatment satisfaction. SSRIs are an important addition to the pharmacological treatment of paraphilic patients, especially those at risk of so-called "hands-off" delinquency.

  17. Selection of an optimal treatment method for acute periodontitis disease.

    Science.gov (United States)

    Aliev, Rafik A; Aliyev, B F; Gardashova, Latafat A; Huseynov, Oleg H

    2012-04-01

    The present paper is devoted to the selection of an optimal treatment method for acute periodontitis using a fuzzy Choquet integral-based approach. We consider the application of different treatment methods depending on the development stages and symptoms of the disease. The effectiveness of applying the different treatment methods in each stage of the disease is linguistically evaluated by a dentist. The stages of the disease are also linguistically described by a dentist. The dentist's linguistic evaluations are represented by fuzzy sets. The total effectiveness of each considered treatment method is calculated by using a fuzzy Choquet integral with a fuzzy number-valued integrand and a fuzzy number-valued fuzzy measure. The most effective treatment method is determined by using a fuzzy ranking method.
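
    As background for the aggregation step, a minimal sketch of the discrete Choquet integral with a crisp (non-fuzzy-number-valued) capacity is given below; the symptom names, scores and capacity values are illustrative assumptions, and the fuzzy-number-valued version used in the paper is not reproduced here.

        def choquet_integral(scores, mu):
            """Discrete Choquet integral of `scores` (criterion -> value in [0, 1])
            with respect to the capacity `mu` (frozenset of criteria -> weight)."""
            items = sorted(scores, key=scores.get)        # criteria in ascending score order
            total, prev = 0.0, 0.0
            for i, c in enumerate(items):
                coalition = frozenset(items[i:])          # criteria scoring at least scores[c]
                total += (scores[c] - prev) * mu[coalition]
                prev = scores[c]
            return total

        # Hypothetical effectiveness of one treatment method on three symptoms,
        # aggregated with a non-additive capacity expressing their joint importance.
        scores = {"pain": 0.7, "swelling": 0.5, "infection": 0.9}
        mu = {
            frozenset(): 0.0,
            frozenset({"pain"}): 0.3, frozenset({"swelling"}): 0.2, frozenset({"infection"}): 0.5,
            frozenset({"pain", "swelling"}): 0.45, frozenset({"pain", "infection"}): 0.8,
            frozenset({"swelling", "infection"}): 0.65,
            frozenset({"pain", "swelling", "infection"}): 1.0,
        }
        print(round(choquet_integral(scores, mu), 3))     # 0.76 for these numbers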

  18. Inflation model selection meets dark radiation

    Science.gov (United States)

    Tram, Thomas; Vallance, Robert; Vennin, Vincent

    2017-01-01

    We investigate how inflation model selection is affected by the presence of additional free-streaming relativistic degrees of freedom, i.e. dark radiation. We perform a full Bayesian analysis of both inflation parameters and cosmological parameters taking reheating into account self-consistently. We compute the Bayesian evidence for a few representative inflation scenarios in both the standard ΛCDM model and an extension including dark radiation parametrised by its effective number of relativistic species Neff. Using a minimal dataset (Planck low-l polarisation, temperature power spectrum and lensing reconstruction), we find that the observational status of most inflationary models is unchanged. The exceptions are potentials such as power-law inflation that predict large values for the scalar spectral index that can only be realised when Neff is allowed to vary. Adding baryon acoustic oscillations data and the B-mode data from BICEP2/Keck makes power-law inflation disfavoured, while adding local measurements of the Hubble constant H0 makes power-law inflation slightly favoured compared to the best single-field plateau potentials. This illustrates how the dark radiation solution to the H0 tension would have deep consequences for inflation model selection.

  19. Efficiently adapting graphical models for selectivity estimation

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2013-01-01

    Query optimizers rely on statistical models that succinctly describe the underlying data. Models are used to derive cardinality estimates for intermediate relations, which in turn guide the optimizer to choose the best query execution plan. The quality of the resulting plan is highly dependent ... of the selectivities of the constituent predicates. However, this independence assumption is more often than not wrong, and is considered to be the most common cause of sub-optimal query execution plans chosen by modern query optimizers. We take a step towards a principled and practical approach to performing cardinality estimation without making the independence assumption. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution over all the attributes in the database into small, usually two-dimensional distributions, without a significant loss...
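
    A toy illustration of why the independence assumption fails and why a small two-dimensional distribution suffices for a correlated attribute pair; the attribute values and query are hypothetical and unrelated to the paper's datasets.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000
        make = rng.integers(0, 2, size=n)                      # 0 = "Honda", 1 = "Ford" (toy labels)
        model = np.where(make == 0,
                         rng.integers(0, 2, size=n),           # Hondas get models 0-1
                         2 + rng.integers(0, 2, size=n))       # Fords get models 2-3

        pred = (make == 0) & (model == 0)                      # query: make = 0 AND model = 0
        true_sel = pred.mean()
        indep_sel = (make == 0).mean() * (model == 0).mean()   # independence assumption

        # A 2D distribution over (make, model) -- the kind of small factor a graphical
        # model stores -- recovers the joint selectivity of this predicate pair exactly.
        joint = np.zeros((2, 4))
        np.add.at(joint, (make, model), 1)
        joint /= joint.sum()
        print(round(true_sel, 4), round(indep_sel, 4), round(joint[0, 0], 4))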

  20. The Markowitz model for portfolio selection

    Directory of Open Access Journals (Sweden)

    MARIAN ZUBIA ZUBIAURRE

    2002-06-01

    Since its first appearance, the Markowitz model for portfolio selection has been a basic theoretical reference, opening several new lines of development. In practice, however, it has scarcely been used by portfolio managers and investment analysts in spite of its success in the theoretical field. With our paper we would like to show how the Markowitz model may be of great help in real stock markets. Through an empirical study we want to verify the capability of Markowitz's model to produce portfolios with higher profitability and lower risk than the portfolios represented by the IBEX-35 and IGBM indexes. Furthermore, we want to test the suggested efficiency of these indexes as representatives of the theoretical market portfolio.
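
    As a small illustration of the model (not of the paper's empirical study), the sketch below computes the global minimum-variance portfolio in closed form for a hypothetical covariance matrix; applying the model to IBEX-35 or IGBM constituents would only change the inputs.

        import numpy as np

        def min_variance_weights(cov):
            """Global minimum-variance Markowitz portfolio (fully invested,
            short sales allowed): w = C^{-1} 1 / (1' C^{-1} 1)."""
            ones = np.ones(cov.shape[0])
            w = np.linalg.solve(cov, ones)
            return w / w.sum()

        # Hypothetical annual covariance matrix of three assets.
        cov = np.array([[0.040, 0.006, 0.010],
                        [0.006, 0.025, 0.004],
                        [0.010, 0.004, 0.030]])
        w = min_variance_weights(cov)
        print(np.round(w, 3), round(float(w @ cov @ w), 5))   # weights and portfolio variance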

  1. Model selection for Poisson processes with covariates

    CERN Document Server

    Sart, Mathieu

    2011-01-01

    We observe $n$ inhomogeneous Poisson processes with covariates and aim at estimating their intensities. To handle this problem, we assume that the intensity of each Poisson process is of the form $s(\cdot, x)$ where $x$ is the covariate and where $s$ is an unknown function. We propose a model selection approach where the models are used to approximate the multivariate function $s$. We show that our estimator satisfies an oracle-type inequality under very weak assumptions both on the intensities and the models. By using a Hellinger-type loss, we establish non-asymptotic risk bounds and specify them under various kinds of assumptions on the target function $s$ such as being smooth or composite. Besides, we show that our estimation procedure is robust with respect to these assumptions.

  2. Information criteria for astrophysical model selection

    CERN Document Server

    Liddle, A R

    2007-01-01

    Model selection is the problem of distinguishing competing models, perhaps featuring different numbers of parameters. The statistics literature contains two distinct sets of tools, those based on information theory such as the Akaike Information Criterion (AIC), and those on Bayesian inference such as the Bayesian evidence and Bayesian Information Criterion (BIC). The Deviance Information Criterion combines ideas from both heritages; it is readily computed from Monte Carlo posterior samples and, unlike the AIC and BIC, allows for parameter degeneracy. I describe the properties of the information criteria, and as an example compute them from WMAP3 data for several cosmological models. I find that at present the information theory and Bayesian approaches give significantly different conclusions from that data.

  3. Entropic Priors and Bayesian Model Selection

    CERN Document Server

    Brewer, Brendon J

    2009-01-01

    We demonstrate that the principle of maximum relative entropy (ME), used judiciously, can ease the specification of priors in model selection problems. The resulting effect is that models that make sharp predictions are disfavoured, weakening the usual Bayesian "Occam's Razor". This is illustrated with a simple example involving what Jaynes called a "sure thing" hypothesis. Jaynes' resolution of the situation involved introducing a large number of alternative "sure thing" hypotheses that were possible before we observed the data. However, in more complex situations, it may not be possible to explicitly enumerate large numbers of alternatives. The entropic priors formalism produces the desired result without modifying the hypothesis space or requiring explicit enumeration of alternatives; all that is required is a good model for the prior predictive distribution for the data. This idea is illustrated with a simple rigged-lottery example, and we outline how this idea may help to resolve a recent debate amongst ...

  4. Appropriate model selection methods for nonstationary generalized extreme value models

    Science.gov (United States)

    Kim, Hanbeen; Kim, Sooyoung; Shin, Hongjoon; Heo, Jun-Haeng

    2017-04-01

    Several pieces of evidence that hydrologic data series are nonstationary in nature have been found to date. This has resulted in many studies in the area of nonstationary frequency analysis. Nonstationary probability distribution models involve parameters that vary over time. Therefore, it is not a straightforward process to apply conventional goodness-of-fit tests to the selection of an appropriate nonstationary probability distribution model. Tests that are generally recommended for such a selection include the Akaike information criterion (AIC), corrected Akaike information criterion (AICc), Bayesian information criterion (BIC), and likelihood ratio test (LRT). In this study, Monte Carlo simulation was performed to compare the performances of these four tests, with regard to nonstationary as well as stationary generalized extreme value (GEV) distributions. Proper model selection ratios and sample sizes were taken into account to evaluate the performances of all four tests. The BIC demonstrated the best performance with regard to stationary GEV models. In the case of nonstationary GEV models, the AIC proved to be better than the other three methods when relatively small sample sizes were considered. With larger sample sizes, the AIC, BIC, and LRT presented the best performances for GEV models which have nonstationary location and/or scale parameters, respectively. Simulation results were then evaluated by applying all four tests to annual maximum rainfall data of selected sites, as observed by the Korea Meteorological Administration.
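
    A minimal sketch of the stationary end of this comparison: fitting a GEV distribution to an annual-maximum series and scoring it with AIC and BIC. The data are simulated rather than Korean rainfall records, and a nonstationary variant (e.g. a location parameter that is linear in time) would require a custom likelihood not shown here.

        import numpy as np
        from scipy.stats import genextreme

        # Hypothetical annual-maximum series (one value per year).
        data = genextreme.rvs(-0.1, loc=50.0, scale=12.0, size=60,
                              random_state=np.random.default_rng(0))

        # Stationary GEV fit; nnlf returns the negative log-likelihood.
        c, loc, scale = genextreme.fit(data)
        nll = genextreme.nnlf((c, loc, scale), data)
        k, n = 3, data.size
        aic = 2.0 * nll + 2.0 * k
        bic = 2.0 * nll + k * np.log(n)
        print(round(aic, 1), round(bic, 1))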

  5. Ancestral process and diffusion model with selection

    CERN Document Server

    Mano, Shuhei

    2008-01-01

    The ancestral selection graph in population genetics introduced by Krone and Neuhauser (1997) is an analogue to the coalescent genealogy. The number of ancestral particles, backward in time, of a sample of genes is an ancestral process, which is a birth and death process with quadratic death and linear birth rate. In this paper an explicit form of the number of ancestral particles is obtained by using the density of the allele frequency in the corresponding diffusion model obtained by Kimura (1955). It is shown that fixation corresponds to convergence of the ancestral process to the stationary measure. The time to fixation of an allele is studied in terms of the ancestral process.

  6. Model application for acid mine drainage treatment processes

    Directory of Open Access Journals (Sweden)

    Nantaporn Noosai, Vineeth Vijayan, Khokiat Kengskool

    2014-01-01

    This paper presents the utilization of the geochemical model PHREEQC to investigate a chemical treatment system for Acid Mine Drainage (AMD) prior to discharge. The selected treatment system consists of treatment processes commonly used for AMD, including a settling pond, a vertical flow pond (VFP) and a caustic soda pond. The use of a geochemical model for the treatment process analysis enhances the understanding of the changes in AMD's chemistry (precipitation, reduction of metals, etc.) in each process; thus, the chemical requirements (i.e., CaCO3 and NaOH) for the system and the system's treatment efficiency can be determined. The selected treatment system showed that the final effluent meets the discharge standard. The utilization of geochemical models to investigate AMD treatment processes can assist in process design.

  7. Selecting Great Lakes streams for lampricide treatment based on larval sea lamprey surveys

    Science.gov (United States)

    Christie, Gavin C.; Adams, Jean V.; Steeves, Todd B.; Slade, Jeffrey W.; Cuddy, Douglas W.; Fodale, Michael F.; Young, Robert J.; Kuc, Miroslaw; Jones, Michael L.

    2003-01-01

    The Empiric Stream Treatment Ranking (ESTR) system is a data-driven, model-based, decision tool for selecting Great Lakes streams for treatment with lampricide, based on estimates from larval sea lamprey (Petromyzon marinus) surveys conducted throughout the basin. The 2000 ESTR system was described and applied to larval assessment surveys conducted from 1996 to 1999. A comparative analysis of stream survey and selection data was conducted and improvements to the stream selection process were recommended. Streams were selected for treatment based on treatment cost, predicted treatment effectiveness, and the projected number of juvenile sea lampreys produced. On average, lampricide treatments were applied annually to 49 streams with 1,075 ha of larval habitat, killing 15 million larval and 514,000 juvenile sea lampreys at a total cost of $5.3 million, and marginal and mean costs of $85 and $10 per juvenile killed. The numbers of juvenile sea lampreys killed for given treatment costs showed a pattern of diminishing returns with increasing investment. Of the streams selected for treatment, those with > 14 ha of larval habitat targeted 73% of the juvenile sea lampreys for 60% of the treatment cost. Suggested improvements to the ESTR system were to improve accuracy and precision of model estimates, account for uncertainty in estimates, include all potentially productive streams in the process (not just those surveyed in the current year), consider the value of all larvae killed during treatment (not just those predicted to metamorphose the following year), use lake-specific estimates of damage, and establish formal suppression targets.

  8. Improving randomness characterization through Bayesian model selection

    CERN Document Server

    R., Rafael Díaz-H; Martínez, Alí M Angulo; U'Ren, Alfred B; Hirsch, Jorge G; Marsili, Matteo; Castillo, Isaac Pérez

    2016-01-01

    Nowadays random number generation plays an essential role in technology with important applications in areas ranging from cryptography, which lies at the core of current communication protocols, to Monte Carlo methods, and other probabilistic algorithms. In this context, a crucial scientific endeavour is to develop effective methods that allow the characterization of random number generators. However, commonly employed methods either lack formality (e.g. the NIST test suite), or are inapplicable in principle (e.g. the characterization derived from the Algorithmic Theory of Information (ATI)). In this letter we present a novel method based on Bayesian model selection, which is both rigorous and effective, for characterizing randomness in a bit sequence. We derive analytic expressions for a model's likelihood, which are then used to compute its posterior probability distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion and its implementation is straightforward. We...
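
    As a toy illustration of Bayesian model selection applied to a bit sequence (a minimal sketch, not the authors' likelihood derivation), one can compare the marginal likelihood of a fair-coin model with that of a Bernoulli model whose bias has a uniform Beta(1,1) prior; the resulting Bayes factor quantifies the evidence that the generator is biased.

      from math import lgamma, log
      import random

      def log_marginal_fair(bits):
          # P(data | fair coin): each bit has probability 1/2.
          return len(bits) * log(0.5)

      def log_marginal_biased(bits):
          # P(data | Bernoulli(theta)) integrated over theta ~ Beta(1, 1):
          # equals Beta(k+1, n-k+1) with k ones out of n bits.
          n, k = len(bits), sum(bits)
          return lgamma(k + 1) + lgamma(n - k + 1) - lgamma(n + 2)

      random.seed(0)
      sequence = [random.random() < 0.7 for _ in range(500)]  # biased source

      log_bf = log_marginal_biased(sequence) - log_marginal_fair(sequence)
      print(f"log Bayes factor (biased vs fair): {log_bf:.2f}")
      # A large positive value indicates the sequence is unlikely to come
      # from an unbiased random number generator.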

  9. SELECTION OF CHEMICAL TREATMENT PROGRAM FOR OILY WASTEWATER

    Directory of Open Access Journals (Sweden)

    Miguel Díaz

    2017-04-01

    Full Text Available When selecting a chemical treatment program for wastewater to achieve effective flocculation and coagulation, it is crucial to understand how individual colloids interact. The coagulation process requires rapid mixing, while the flocculation process needs slow mixing. The behavior of colloids in water is strongly influenced by their electrokinetic charge; each colloidal particle carries its own charge, which is usually negative in nature. Polymers, which are long chains of high molecular weight and high charge, begin to form longer chains when added to water, allowing numerous particles of suspended matter to be removed. A study of physico-chemical treatment by addition of coagulant and flocculant was carried out in order to determine a chemical program for oily wastewater coming from the gravity separation process in a crude oil refinery. The tests were carried out in Jar Test equipment, where the commercial products aluminum polychloride (PAC), aluminum sulfate and Sintec D50 were evaluated with five different flocculants. The selected chemical program was evaluated with fluids at three temperatures to determine its sensitivity to this parameter and to the mixing energy in coagulation and flocculation. The chemical program and operational characteristics for physico-chemical treatment with PAC were determined, obtaining a removal of more than 93% for suspended matter and 96% for total hydrocarbons for the selected coagulant/flocculant combination.

  10. Inflation Model Selection meets Dark Radiation

    CERN Document Server

    Tram, Thomas; Vennin, Vincent

    2016-01-01

    We investigate how inflation model selection is affected by the presence of additional free-streaming relativistic degrees of freedom, i.e. dark radiation. We perform a full Bayesian analysis of both inflation parameters and cosmological parameters taking reheating into account self-consistently. We compute the Bayesian evidence for a few representative inflation scenarios in both the standard $\\Lambda\\mathrm{CDM}$ model and an extension including dark radiation parametrised by its effective number of relativistic species $N_\\mathrm{eff}$. We find that the observational status of most inflationary models is unchanged, with the exception of potentials such as power-law inflation that predict a value for the scalar spectral index that is too large in $\\Lambda\\mathrm{CDM}$ but which can be accommodated when $N_\\mathrm{eff}$ is allowed to vary. In this case, cosmic microwave background data indicate that power-law inflation is one of the best models together with plateau potentials. However, contrary to plateau p...

  11. Appropriate selection for omalizumab treatment in patients with severe asthma?

    DEFF Research Database (Denmark)

    Nygaard, Leo; Henriksen, Daniel Pilsgaard; Madsen, Hanne

    2017-01-01

    ... to guidelines, and the clinical effect of omalizumab treatment over time. Design: We performed a retrospective observational study on adult patients with asthma treated with omalizumab during 2006-2015 at the Department of Respiratory Medicine at Odense University Hospital (OUH), Denmark. Data were obtained from the Electronic Patient Journal of OUH and Odense Pharmaco-Epidemiological Database. Guideline criteria for omalizumab treatment were used to evaluate the appropriateness of omalizumab candidate selection, and the Asthma Control Test (ACT) to assess the clinical effects of omalizumab at weeks 16 and 52 from treatment initiation. Results: During the observation period, 24 patients received omalizumab, but only 10 patients (42%) fulfilled criteria recommended by international guidelines. The main reasons for not fulfilling the criteria were inadequately reduced lung function, insufficient number...

  12. High-dimensional model estimation and model selection

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I will review concepts and algorithms from high-dimensional statistics for linear model estimation and model selection. I will particularly focus on the so-called p>>n setting where the number of variables p is much larger than the number of samples n. I will focus mostly on regularized statistical estimators that produce sparse models. Important examples include the LASSO and its matrix extension, the Graphical LASSO, and more recent non-convex methods such as the TREX. I will show the applicability of these estimators in a diverse range of scientific applications, such as sparse interaction graph recovery and high-dimensional classification and regression problems in genomics.
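
    A minimal sketch of the p >> n setting described above, using scikit-learn's Lasso on simulated data to recover a sparse coefficient vector from far fewer samples than variables (the dimensions and penalty value are illustrative only):

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(0)
      n, p = 50, 500                      # many more variables than samples
      X = rng.standard_normal((n, p))
      beta = np.zeros(p)
      beta[:5] = [3.0, -2.0, 1.5, 4.0, -1.0]   # only 5 truly active variables
      y = X @ beta + 0.5 * rng.standard_normal(n)

      # The L1 penalty shrinks most coefficients exactly to zero (sparse model).
      model = Lasso(alpha=0.1).fit(X, y)
      support = np.flatnonzero(model.coef_)
      print("Estimated non-zero coefficients:", support[:10])
      print("True non-zero coefficients:     ", np.flatnonzero(beta))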

  13. Fuzzy modelling for selecting headgear types.

    Science.gov (United States)

    Akçam, M Okan; Takada, Kenji

    2002-02-01

    The purpose of this study was to develop a computer-assisted inference model for selecting appropriate types of headgear appliance for orthodontic patients and to investigate its clinical versatility as a decision-making aid for inexperienced clinicians. Fuzzy rule bases were created for degrees of overjet, overbite, and mandibular plane angle variables, respectively, according to subjective criteria based on the clinical experience and knowledge of the authors. The rules were then transformed into membership functions and the geometric mean aggregation was performed to develop the inference model. The resultant fuzzy logic was then tested on 85 cases in which the patients had been diagnosed as requiring headgear appliances. Eight experienced orthodontists judged each of the cases, and decided if they 'agreed', 'accepted', or 'disagreed' with the recommendations of the computer system. Intra-examiner agreements were investigated using repeated judgements of a set of 30 orthodontic cases and the kappa statistic. All of the examiners exceeded a kappa score of 0.7, allowing them to participate in the test run of the validity of the proposed inference model. The examiners' agreement with the system's recommendations was evaluated statistically. The average satisfaction rate of the examiners was 95.6 per cent and, for 83 out of the 85 cases, 97.6 per cent. The majority of the examiners (i.e. six or more out of the eight) were satisfied with the recommendations of the system. Thus, the usefulness of the proposed inference logic was confirmed.
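
    The core mechanics of such a system (membership functions for the clinical variables and geometric-mean aggregation) can be sketched as follows; the triangular membership functions, cut-off values and the single headgear category used here are hypothetical and do not reproduce the authors' rule bases.

      import numpy as np

      def triangular(x, a, b, c):
          # Triangular membership function peaking at b on support [a, c].
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def recommend(overjet_mm, overbite_mm, mp_angle_deg):
          # Hypothetical membership degrees of each variable in the fuzzy set
          # "favours cervical-pull headgear".
          memberships = np.array([
              triangular(overjet_mm, 3, 7, 12),      # large overjet
              triangular(overbite_mm, 2, 5, 9),      # deep overbite
              triangular(mp_angle_deg, 15, 22, 30),  # low mandibular plane angle
          ])
          # Geometric mean aggregation of the individual membership degrees.
          return float(np.prod(memberships) ** (1.0 / len(memberships)))

      print(f"Degree of recommendation: {recommend(6.5, 4.0, 24.0):.2f}")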

  14. SLAM: A Connectionist Model for Attention in Visual Selection Tasks.

    Science.gov (United States)

    Phaf, R. Hans; And Others

    1990-01-01

    The SeLective Attention Model (SLAM) performs visual selective attention tasks and demonstrates that object selection and attribute selection are both necessary and sufficient for visual selection. The SLAM is described, particularly with regard to its ability to represent an individual subject performing filtering tasks. (TJH)

  15. Estimation of a multivariate mean under model selection uncertainty

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2014-05-01

    Full Text Available Model selection uncertainty would occur if we selected a model based on one data set and subsequently applied it for statistical inferences, because the "correct" model would not be selected with certainty. When the selection and inference are based on the same data set, some additional problems arise due to the correlation of the two stages (selection and inference). In this paper model selection uncertainty is considered and model averaging is proposed. The proposal is related to the theory of James and Stein for estimating more than three parameters from independent normal observations. We suggest that a model averaging scheme taking into account the selection procedure could be more appropriate than model selection alone. Some properties of this model averaging estimator are investigated; in particular we show using Stein's results that it is a minimax estimator and can outperform Stein-type estimators.
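
    The James-Stein connection mentioned above can be illustrated with a short simulation comparing the usual estimator of a multivariate normal mean with the (positive-part) James-Stein shrinkage estimator; this is a sketch of the classical result only, not the model-averaging estimator proposed in the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      p, sigma2, n_rep = 10, 1.0, 5_000
      mu = rng.normal(0.0, 1.0, size=p)          # true mean vector

      mse_raw, mse_js = 0.0, 0.0
      for _ in range(n_rep):
          x = rng.normal(mu, np.sqrt(sigma2))    # one observation per component
          # Positive-part James-Stein estimator shrinks the observation towards zero.
          shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.sum(x**2))
          js = shrink * x
          mse_raw += np.sum((x - mu) ** 2)
          mse_js += np.sum((js - mu) ** 2)

      print(f"Average risk, raw estimator:         {mse_raw / n_rep:.3f}")
      print(f"Average risk, James-Stein shrinkage: {mse_js / n_rep:.3f}")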

  16. Endoscopic bronchial valve treatment: patient selection and special considerations

    Directory of Open Access Journals (Sweden)

    Eberhardt R

    2015-10-01

    Full Text Available Ralf Eberhardt,1,2 Daniela Gompelmann,1,2 Felix JF Herth,1,2 Maren Schuhmann1 1Pneumology and Critical Care Medicine, Thoraxklinik at the University of Heidelberg, 2Translational Lung Research Center, Member of the German Center for Lung Research, Heidelberg, Germany Abstract: As well as lung volume reduction surgery, different minimally invasive endoscopic techniques are available to achieve lung volume reduction in patients with severe emphysema and significant hyperinflation. Lung function parameters and comorbidities of the patient, as well as the extent and distribution of the emphysema are factors to be considered when choosing the patient and the intervention. Endoscopic bronchial valve placement with complete occlusion of one lobe in patients with heterogeneous emphysema is the preferred technique because of its reversibility. The presence of high interlobar collateral ventilation will hinder successful treatment; therefore, endoscopic coil placement, polymeric lung volume reduction, or bronchoscopic thermal vapor ablation as well as lung volume reduction surgery can be used for treating patients with incomplete fissures. The effect of endoscopic lung volume reduction in patients with a homogeneous distribution of emphysema is still unclear and this subgroup should be treated only in clinical trials. Precise patient selection is necessary for interventions and to improve the outcome and reduce the risk and possible complications. Therefore, the patients should be discussed in a multidisciplinary approach prior to determining the most appropriate treatment for lung volume reduction. Keywords: lung emphysema, valve treatment, collateral ventilation, patient selection, outcome

  17. [Evaluation and selection of VOCs treatment technologies in packaging and printing industry].

    Science.gov (United States)

    Wang, Hai-Lin; Wang, Jun-Hui; Zhu, Chun-Lei; Nie, Lei; Hao, Zheng-Ping

    2014-07-01

    Volatile organic compounds (VOCs) play an important role in urban air pollution. Activities of industries, including the packaging and printing industries, are regarded as the major sources. How to select suitable treatment techniques is the major problem for emission control. In this article, based on the VOC emission characteristics of the packaging and printing industry and the existing treatment technologies, an evaluation system for selecting VOC treatment technologies was established using the analytic hierarchy process (AHP) model, and all the technologies used for treatment were assessed. The analysis showed that the priority ranking was in the following order: carbon fiber adsorption-desorption > granular carbon adsorption-desorption > thermal combustion > regenerative combustion > catalytic combustion > rotary adsorption-concentration and combustion > granular carbon adsorption-concentration and combustion. Carbon fiber adsorption-desorption was selected as the best available technology because it received the highest weight among these technologies.
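
    The AHP step (deriving priority weights for candidate technologies from a pairwise comparison matrix via its principal eigenvector, with a consistency check) can be sketched as below; the comparison values are invented for illustration and do not reproduce the authors' evaluation system.

      import numpy as np

      techs = ["Carbon fiber adsorption", "Granular carbon adsorption",
               "Thermal combustion", "Catalytic combustion"]

      # Hypothetical pairwise comparison matrix (Saaty 1-9 scale):
      # A[i, j] = how strongly technology i is preferred over technology j.
      A = np.array([
          [1.0, 2.0, 3.0, 4.0],
          [1/2, 1.0, 2.0, 3.0],
          [1/3, 1/2, 1.0, 2.0],
          [1/4, 1/3, 1/2, 1.0],
      ])

      # Priority weights = normalised principal eigenvector of A.
      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      weights = principal / principal.sum()

      # Consistency ratio check (CR < 0.1 is conventionally acceptable).
      lam_max = np.max(np.real(eigvals))
      ci = (lam_max - len(A)) / (len(A) - 1)
      cr = ci / 0.90                      # random index for a 4x4 matrix
      for tech, w in sorted(zip(techs, weights), key=lambda t: -t[1]):
          print(f"{tech:30s} weight = {w:.3f}")
      print(f"Consistency ratio: {cr:.3f}")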

  18. Selection of medical treatment in stable angina pectoris

    DEFF Research Database (Denmark)

    Ardissino, D; Savonitto, S; Egstrup, K

    1995-01-01

    OBJECTIVES: The present study was designed to investigate which characteristics of anginal symptoms or exercise test results could predict the favorable anti-ischemic effect of the beta-adrenergic blocking agent metoprolol and the calcium antagonist nifedipine in patients with stable angina pectoris. BACKGROUND: The characteristics of anginal symptoms and the results of exercise testing are considered of great importance for selecting medical treatment in patients with chronic stable angina pectoris. However, little information is available on how this first evaluation may be used to select the best pharmacologic approach in individual patients. METHODS: In this prospective multicenter study, 280 patients with stable angina pectoris were enrolled in 25 European centers. After baseline evaluation, consisting of an exercise test and a questionnaire investigating patients' anginal symptoms...

  19. Hidden Markov Model for Stock Selection

    Directory of Open Access Journals (Sweden)

    Nguyet Nguyen

    2015-10-01

    Full Text Available The hidden Markov model (HMM) is typically used to predict the hidden regimes of observation data. Therefore, this model finds applications in many different areas, such as speech recognition systems, computational molecular biology and financial market predictions. In this paper, we use HMM for stock selection. We first use HMM to make monthly regime predictions for the four macroeconomic variables: inflation (consumer price index, CPI), industrial production index (INDPRO), stock market index (S&P 500) and market volatility (VIX). At the end of each month, we calibrate HMM’s parameters for each of these economic variables and predict its regimes for the next month. We then look back into historical data to find the time periods for which the four variables had similar regimes with the forecasted regimes. Within those similar periods, we analyze all of the S&P 500 stocks to identify which stock characteristics have been well rewarded during the time periods and assign scores and corresponding weights for each of the stock characteristics. A composite score of each stock is calculated based on the scores and weights of its features. Based on this algorithm, we choose the 50 top ranking stocks to buy. We compare the performances of the portfolio with the benchmark index, S&P 500. With an initial investment of $100 in December 1999, over 15 years, in December 2014, our portfolio had an average gain per annum of 14.9% versus 2.3% for the S&P 500.
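
    A condensed sketch of the regime-detection step is shown below: a two-state Gaussian HMM is fitted to a simulated monthly series and the hidden regime of each month is decoded. It assumes the third-party hmmlearn package is available and uses made-up data rather than the CPI, INDPRO, S&P 500 and VIX series used in the paper.

      import numpy as np
      from hmmlearn.hmm import GaussianHMM   # assumes hmmlearn is installed

      rng = np.random.default_rng(0)
      # Simulated monthly observations drawn from two hidden regimes:
      # a calm regime (positive mean, low variance) and a stressed regime.
      calm = rng.normal(0.2, 0.5, size=120)
      stressed = rng.normal(-1.0, 2.0, size=60)
      series = np.concatenate([calm, stressed, calm]).reshape(-1, 1)

      # Fit a 2-state HMM and decode the most likely regime for each month.
      model = GaussianHMM(n_components=2, covariance_type="diag",
                          n_iter=200, random_state=0)
      model.fit(series)
      regimes = model.predict(series)

      print("Regime of the last 12 months:", regimes[-12:])
      print("State means:", model.means_.ravel())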

  20. Selective serotonin reuptake inhibitors in the treatment of premature ejaculation

    Institute of Scientific and Technical Information of China (English)

    WANG Wei-fu; CHANG Le; Suks Minhas; David J Ralph

    2007-01-01

    Objective To review and assess recent studies regarding selective serotonin reuptake inhibitors (SSRIs) in the treatment of premature ejaculation (PE) and to provide practical recommendations and possible mechanisms concerning state-of-the-art knowledge on the use of SSRIs in alleviating PE. Data sources Using Medline, 48 articles published from January 1st, 1996 to August 1st, 2006 concerning the use of SSRIs and their possible mechanisms in alleviating PE were found and reviewed. Study selection PE, rapid ejaculation, early ejaculation and SSRIs were employed as the keywords, and relevant articles about the use of SSRIs and their possible mechanisms in the treatment of PE were selected. Results Many kinds of SSRIs, such as fluoxetine, sertraline, paroxetine and citalopram, have been widely employed to treat PE. However, their effects are moderate and there is no universal agreement about the kind, dose, protocol and duration. Dapoxetine, as the first prescription treatment for PE, may change this bottleneck situation. SSRIs are suggested for use in young men with lifelong PE, and in acquired PE when etiological factors have been removed but PE still persists. Phosphodiesterase 5 inhibitors (PDE5-Is) are suggested to be employed alone or combined with SSRIs when SSRIs fail to treat PE or when sexual dysfunction associated with SSRIs occurs. A protocol of on-demand dosing, following an initial period of daily dosing, is proposed as the first choice. The possible mechanisms include increasing serotonergic neurotransmission and activating 5-hydroxytryptamine 2C (5-HT2C) receptors, thereby raising the ejaculatory threshold, decreasing penile sensitivity, and exerting their own antidepressant effect. Conclusion The efficacies of the current SSRIs in the treatment of PE are moderate and they have not been approved by the FDA; therefore new SSRIs like dapoxetine need to be further evaluated.

  1. Feature selection and survival modeling in The Cancer Genome Atlas

    Directory of Open Access Journals (Sweden)

    Kim H

    2013-09-01

    Full Text Available Hyunsoo Kim,1 Markus Bredel2 1Department of Pathology, The University of Alabama at Birmingham, Birmingham, AL, USA; 2Department of Radiation Oncology, and Comprehensive Cancer Center, The University of Alabama at Birmingham, Birmingham, AL, USA Purpose: Personalized medicine is predicated on the concept of identifying subgroups of a common disease for better treatment. Identifying biomarkers that predict disease subtypes has been a major focus of biomedical science. In the era of genome-wide profiling, there is controversy as to the optimal number of genes as an input of a feature selection algorithm for survival modeling. Patients and methods: The expression profiles and outcomes of 544 patients were retrieved from The Cancer Genome Atlas. We compared four different survival prediction methods: (1) 1-nearest neighbor (1-NN) survival prediction method; (2) random patient selection method and a Cox-based regression method with nested cross-validation; (3) least absolute shrinkage and selection operator (LASSO) optimization using whole-genome gene expression profiles; or (4) gene expression profiles of cancer pathway genes. Results: The 1-NN method performed better than the random patient selection method in terms of survival predictions, although it does not include a feature selection step. The Cox-based regression method with LASSO optimization using whole-genome gene expression data demonstrated higher survival prediction power than the 1-NN method, but was outperformed by the same method when using gene expression profiles of cancer pathway genes alone. Conclusion: The 1-NN survival prediction method may require more patients for better performance, even when omitting censored data. Using preexisting biological knowledge for survival prediction is reasonable as a means to understand the biological system of a cancer, unless the analysis goal is to identify completely unknown genes relevant to cancer biology. Keywords: brain, feature selection
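
    The 1-nearest-neighbour survival prediction idea compared above can be sketched in a few lines: for each test patient, find the training patient with the most similar expression profile and copy that patient's observed survival time. The data are simulated and censoring is ignored, as it was in the 1-NN comparison.

      import numpy as np

      rng = np.random.default_rng(0)
      n_train, n_test, n_genes = 400, 5, 200

      # Simulated expression profiles and survival times (months).
      X_train = rng.standard_normal((n_train, n_genes))
      t_train = rng.exponential(scale=24.0, size=n_train)
      X_test = rng.standard_normal((n_test, n_genes))

      def predict_survival_1nn(x):
          # Euclidean distance to every training profile; copy the survival
          # time of the single closest patient.
          d = np.linalg.norm(X_train - x, axis=1)
          return t_train[np.argmin(d)]

      for i, x in enumerate(X_test):
          print(f"Test patient {i}: predicted survival "
                f"{predict_survival_1nn(x):.1f} months")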

  2. Technology selection for MSW treatment in Altiplano areas using FMDM.

    Science.gov (United States)

    Jiang, Jianguo; Lou, Zhiying; Hg, Siio; Duo, Ji; Li, Zhong

    2009-10-01

    There are special requirements for municipal solid waste (MSW) treatment caused by lower oxygen content and atmospheric pressure on the Altiplano. The intention of this paper was to analyse the applicability of various technologies to MSW treatment in the Altiplano and select the best one based on the current MSW collection modes and technical levels, using the Fuzzy Mathematical Decision Method (FMDM). Technologies including landfill, incineration, composting, and anaerobic digestion (AD) were compared. The results of the studies showed that AD technology is a new technology which is attractive in economic terms and helpful for environmental harmony. AD can solve the difficulties caused by a high content of organic matter in the MSW, lower atmospheric pressure and oxygen content on the Altiplano. Moreover, it can achieve reduction and recycling of the waste, thereby saving space for treatment and disposal. Using this technology, renewable energy can be recovered to save conventional fuel consumption and the emission of greenhouse gases can be reduced to improve the conservation of the local ecosystem. Putting AD into practice in the Altiplano may be the preferred method of MSW treatment.

  3. Treatment Options for Liquid Radioactive Waste. Factors Important for Selecting of Treatment Methods

    Energy Technology Data Exchange (ETDEWEB)

    Dziewinski, J.J.

    1998-09-28

    The cleanup of liquid streams contaminated with radionuclides is achieved by selecting one, or a combination, of a number of physical and chemical separation processes or unit operations. Among these are: Chemical treatment; Evaporation; Ion exchange and sorption; Physical separation; Electrodialysis; Osmosis; Electrocoagulation/electroflotation; Biotechnological processes; and Solvent extraction.

  4. Models Predicting Success of Infertility Treatment: A Systematic Review

    Science.gov (United States)

    Zarinara, Alireza; Zeraati, Hojjat; Kamali, Koorosh; Mohammad, Kazem; Shahnazari, Parisa; Akhondi, Mohammad Mehdi

    2016-01-01

    Background: Infertile couples are faced with problems that affect their marital life. Infertility treatment is expensive, time consuming, and sometimes simply not possible. Prediction models for infertility treatment have been proposed, and prediction of treatment success is a new field in infertility treatment. Because predicting treatment success is a new need for infertile couples, this paper reviewed previous studies to form a general picture of the applicability of these models. Methods: This study was conducted as a systematic review at Avicenna Research Institute in 2015. Six databases were searched based on WHO definitions and MeSH keywords. Papers about prediction models in infertility were evaluated. Results: Eighty-one papers were eligible for the study. Papers covered the years after 1986, and studies were designed both retrospectively and prospectively. IVF prediction models accounted for the largest share of the papers. The most common predictors were age, duration of infertility, and ovarian and tubal problems. Conclusion: A prediction model can be applied clinically if it can be statistically evaluated and is well validated for treatment success. To achieve better results, physicians’ and couples’ estimates of the treatment success rate should be based on history, examination and clinical tests. Models must be checked for their theoretical approach and appropriate validation. The advantages of applying prediction models include reduced cost and time, avoidance of painful treatment for patients, assessment of the treatment approach for physicians, and support for decision making by health managers. Selecting an appropriate approach for designing and using these models is therefore unavoidable. PMID:27141461

  5. Men's perspectives on selecting their prostate cancer treatment.

    Science.gov (United States)

    Xu, Jinping; Dailey, Rhonda K; Eggly, Susan; Neale, Anne Victoria; Schwartz, Kendra L

    2011-06-01

    In the context of scientific uncertainty, treatment choices for localized prostate cancer vary, but reasons for this variation are unclear. We explored how black and white American men made their treatment decision. Guided by a conceptual model, we conducted semistructured interviews of 21 American (14 black and 7 white) men with recently diagnosed localized prostate cancer. Physician recommendation was very important in the treatment decision, but patient self-perception/values and attitudes/beliefs about prostate cancer were also influential. Patients who chose surgery believed it offered the best chance of cure and were more concerned that the cancer might spread if not surgically removed. Patients who chose radiation therapy believed it offered equal efficacy of cure but fewer side effects than surgery. Fear of future consequences was the most common reason to reject watchful waiting. Anecdotal experiences of family and friends were also important, especially in deciding "what not to do." The new technology of robotic-assisted prostatectomy provided optimism for men who wanted surgery but feared morbidity associated with traditional open surgery. Few men seemed aware that treatment did not guarantee improved survival. Most men reported making "the best choice for me" by taking into account medical information and personal factors. Perceptions of treatment efficacy and side effects, which derived mainly from physicians' descriptions and/or anecdotal experiences of family and friends, were the most influential factors in men's treatment decision. By understanding factors that influence patients' treatment decisions, clinicians may be more sensitive to individual patients' preferences/concerns and provide more patient-centered care.

  6. Men’s Perspectives on Selecting Their Prostate Cancer Treatment

    Science.gov (United States)

    Xu, Jinping; Dailey, Rhonda K.; Eggly, Susan; Neale, Anne Victoria; Schwartz, Kendra L.

    2014-01-01

    Objective In the context of scientific uncertainty, treatment choices for localized prostate cancer vary, but reasons for this variation are unclear. We explored how black and white American men made their treatment decision. Methods Guided by a conceptual model, we conducted semistructured interviews of 21 American (14 black and 7 white) men with recently diagnosed localized prostate cancer. Results Physician recommendation was very important in the treatment decision, but patient self-perception/values and attitudes/beliefs about prostate cancer were also influential. Patients who chose surgery believed it offered the best chance of cure and were more concerned that the cancer might spread if not surgically removed. Patients who chose radiation therapy believed it offered equal efficacy of cure but fewer side effects than surgery. Fear of future consequences was the most common reason to reject watchful waiting. Anecdotal experiences of family and friends were also important, especially in deciding “what not to do.” The new technology of robotic-assisted prostatectomy provided optimism for men who wanted surgery but feared morbidity associated with traditional open surgery. Few men seemed aware that treatment did not guarantee improved survival. Conclusion Most men reported making “the best choice for me” by taking into account medical information and personal factors. Perceptions of treatment efficacy and side effects, which derived mainly from physicians’ descriptions and/or anecdotal experiences of family and friends, were the most influential factors in men’s treatment decision. By understanding factors that influence patients’ treatment decisions, clinicians may be more sensitive to individual patients’ preferences/concerns and provide more patient-centered care. PMID:21830629

  7. Modeling selective elimination of quiescent cancer cells from bone marrow.

    Science.gov (United States)

    Cavnar, Stephen P; Rickelmann, Andrew D; Meguiar, Kaille F; Xiao, Annie; Dosch, Joseph; Leung, Brendan M; Cai Lesher-Perez, Sasha; Chitta, Shashank; Luker, Kathryn E; Takayama, Shuichi; Luker, Gary D

    2015-08-01

    Patients with many types of malignancy commonly harbor quiescent disseminated tumor cells in bone marrow. These cells frequently resist chemotherapy and may persist for years before proliferating as recurrent metastases. To test for compounds that eliminate quiescent cancer cells, we established a new 384-well 3D spheroid model in which small numbers of cancer cells reversibly arrest in G1/G0 phase of the cell cycle when cultured with bone marrow stromal cells. Using dual-color bioluminescence imaging to selectively quantify viability of cancer and stromal cells in the same spheroid, we identified single compounds and combination treatments that preferentially eliminated quiescent breast cancer cells but not stromal cells. A treatment combination effective against malignant cells in spheroids also eliminated breast cancer cells from bone marrow in a mouse xenograft model. This research establishes a novel screening platform for therapies that selectively target quiescent tumor cells, facilitating identification of new drugs to prevent recurrent cancer. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  8. The detection of observations possibly influential for model selection

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1991-01-01

    Model selection can involve several variables and selection criteria. A simple method to detect observations possibly influential for model selection is proposed. The potentials of this method are illustrated with three examples, each of which is taken from related studies.

  9. Selective experimental review of the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Bloom, E.D.

    1985-02-01

    Before discussing experimental comparisons with the Standard Model (S-M), it is probably wise to define more completely what is commonly meant by this popular term. This model is a gauge theory of SU(3)_f × SU(2)_L × U(1) with 18 parameters. The parameters are α_s, α_QED, θ_W, M_W (M_Z = M_W/cos θ_W, and thus is not an independent parameter), M_Higgs; the lepton masses, M_e, M_μ, M_τ; the quark masses, M_d, M_s, M_b, and M_u, M_c, M_t; and finally, the quark mixing angles, θ_1, θ_2, θ_3, and the CP-violating phase δ. The latter four parameters appear in the quark mixing matrix for the Kobayashi-Maskawa and Maiani forms. Clearly, the present S-M covers an enormous range of physics topics, and the author can only lightly cover a few such topics in this report. The measurement of R_hadron is fundamental as a test of the running coupling constant α_s in QCD. The author will discuss a selection of recent precision measurements of R_hadron, as well as some other techniques for measuring α_s. QCD also requires the self-interaction of gluons. The search for the three-gluon vertex may be practically realized in the clear identification of gluonic mesons. The author will present a limited review of recent progress in the attempt to untangle such mesons from the plethora of q q̄ states of the same quantum numbers which exist in the same mass range. The electroweak interactions provide some of the strongest evidence supporting the S-M that exists. Given the recent progress in this subfield, and particularly with the discovery of the W and Z bosons at CERN, many recent reviews obviate the need for further discussion in this report. In attempting to validate a theory, one frequently searches for new phenomena which would clearly invalidate it. 49 references, 28 figures.

  10. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process has become one of the crucial activities in supply chain management. In order to select the best supplier(s), it is necessary not only to continuously track and benchmark the performance of suppliers but also to make trade-offs between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), analytical network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.

  11. Dealing with selection bias in educational transition models

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads Meier

    2011-01-01

    This paper proposes the bivariate probit selection model (BPSM) as an alternative to the traditional Mare model for analyzing educational transitions. The BPSM accounts for selection on unobserved variables by allowing for unobserved variables which affect the probability of making educational transitions to be correlated across transitions. We use simulated and real data to illustrate how the BPSM improves on the traditional Mare model in terms of correcting for selection bias and providing credible estimates of the effect of family background on educational success. We conclude that models which account for selection on unobserved variables and high-quality data are both required in order to estimate credible educational transition models.

  12. Cold atmospheric plasma treatment selectively targets head and neck squamous cell carcinoma cells.

    Science.gov (United States)

    Guerrero-Preston, Rafael; Ogawa, Takenori; Uemura, Mamoru; Shumulinsky, Gary; Valle, Blanca L; Pirini, Francesca; Ravi, Rajani; Sidransky, David; Keidar, Michael; Trink, Barry

    2014-10-01

    The treatment of locoregional recurrence (LRR) of head and neck squamous cell carcinoma (HNSCC) often requires a combination of surgery, radiation therapy and/or chemotherapy. Survival outcomes are poor and the treatment outcomes are morbid. Cold atmospheric plasma (CAP) is an ionized gas produced at room temperature under laboratory conditions. We have previously demonstrated that treatment with a CAP jet device selectively targets cancer cells using in vitro melanoma and in vivo bladder cancer models. In the present study, we wished to examine CAP selectivity in HNSCC in vitro models, and to explore its potential for use as a minimally invasive surgical approach that allows for specific cancer cell or tumor tissue ablation without affecting the surrounding healthy cells and tissues. Four HNSCC cell lines (JHU-022, JHU-028, JHU-029, SCC25) and 2 normal oral cavity epithelial cell lines (OKF6 and NOKsi) were subjected to cold plasma treatment for durations of 10, 30 and 45 sec, and a helium flow of 20 l/min for 10 sec was used as a positive treatment control. We showed that cold plasma selectively diminished HNSCC cell viability in a dose-response manner, as evidenced by MTT assays; the viability of the OKF6 cells was not affected by the cold plasma. The results of colony formation assays also revealed a cell-specific response to cold plasma application. Western blot analysis did not provide evidence that the cleavage of PARP occurred following cold plasma treatment. In conclusion, our results suggest that cold plasma application selectively impairs HNSCC cell lines through non-apoptotic mechanisms, while having a minimal effect on normal oral cavity epithelial cell lines.

  13. [Selective estrogen receptor modulators in treatment of postmenopausal osteoporosis].

    Science.gov (United States)

    Meczekalski, Błazej; Czyzyk, Adam

    2009-03-01

    Postmenopausal osteoporosis is associated with a lack of estrogens; therefore, understandably, one of the treatment options in osteoporosis is a group of medicines known as selective estrogen receptor modulators (SERMs). They can act as an estrogen receptor agonist in some tissues and as an antagonist in others. Owing to this mixed agonist-antagonist action, SERMs have a positive effect on bone, the serum lipid profile and the cardiovascular system. Moreover, they can protect against the development of some estrogen-dependent neoplasms. The first SERM used was tamoxifen, but due to its negative effect on the endometrium it is not indicated in osteoporosis. Raloxifene, which is currently in use, reduces the risk of vertebral fractures and also has a beneficial influence on the risk of endometrial and breast neoplasm development. On the other hand, raloxifene intensifies vasomotor symptoms and its bone-protecting effect is limited. At present, new SERMs (ospemifene, lasofoxifene, bazedoxifene, arzoxifene) are being researched in clinical trials. At the current stage of investigation they show a beneficial influence on skeletal as well as extraskeletal tissues. Implementation of SERMs in combined therapy of osteoporosis is currently under research as well. A SERM with parathormone or a SERM with a bisphosphonate might prove to be an advantageous treatment option for women with severe or resistant osteoporosis. The addition of a SERM to conventional hormone replacement therapy did not bring the anticipated benefits. Future studies on SERMs may result in new preparations adjusted to the individual needs of patients.

  14. Model for personal computer system selection.

    Science.gov (United States)

    Blide, L

    1987-12-01

    Successful computer software and hardware selection is best accomplished by following an organized approach such as the one described in this article. The first step is to decide what you want to be able to do with the computer. Secondly, select software that is user friendly, well documented, bug free, and that does what you want done. Next, you select the computer, printer and other needed equipment from the group of machines on which the software will run. Key factors here are reliability and compatibility with other microcomputers in your facility. Lastly, you select a reliable vendor who will provide good, dependable service in a reasonable time. The ability to correctly select computer software and hardware is a key skill needed by medical record professionals today and in the future. Professionals can make quality computer decisions by selecting software and systems that are compatible with other computers in their facility, allow for future net-working, ease of use, and adaptability for expansion as new applications are identified. The key to success is to not only provide for your present needs, but to be prepared for future rapid expansion and change in your computer usage as technology and your skills grow.

  15. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; van den Berg, Stéphanie Martine

    2017-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  18. Robotic selective postganglionic thoracic sympathectomy for the treatment of hyperhidrosis

    National Research Council Canada - National Science Library

    Coveliers, Hans; Meyer, Mark; Gharagozloo, Farid; Wisselink, Willem; Rauwerda, Jan; Margolis, Marc; Tempesta, Barbara; Strother, Eric

    2013-01-01

    ... maneuverability in a confined space may facilitate the technique of selective sympathectomy (ramicotomy). We present a case series of patients undergoing selective postganglionic thoracic sympathectomy using robotic technology...

  19. Optimization model for the design of distributed wastewater treatment networks

    Directory of Open Access Journals (Sweden)

    Ibrić Nidret

    2012-01-01

    Full Text Available In this paper we address the synthesis problem of distributed wastewater networks using a mathematical programming approach based on superstructure optimization. We present a generalized superstructure and optimization model for the design of distributed wastewater treatment networks. The superstructure includes splitters, treatment units and mixers, with all feasible interconnections including water recirculation. Based on the superstructure, the optimization model is presented. The optimization model is given as a nonlinear programming (NLP) problem where the objective function can be defined to minimize the total amount of wastewater treated in treatment operations or to minimize the total treatment costs. The NLP model is extended to a mixed integer nonlinear programming (MINLP) problem where binary variables are used for the selection of the wastewater treatment technologies. The bounds for all flowrates and concentrations in the wastewater network are specified as general equations. The proposed models are solved using the global optimization solvers (BARON and LINDOGlobal). The application of the proposed models is illustrated on two wastewater network problems of different complexity. The first is formulated as an NLP and the second as an MINLP. For the second, parametric and structural optimization are performed simultaneously, selecting optimal flowrates and concentrations as well as optimal wastewater treatment technologies. Using the proposed model both problems are solved to global optimality.
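
    A greatly simplified numerical sketch of this kind of NLP is given below: two contaminated streams, a single treatment unit with a fixed removal ratio, and an objective of minimising the flowrate sent to treatment subject to a discharge concentration limit. The stream data, removal ratio and limit are invented, and the model is far smaller than the superstructure formulation in the paper.

      import numpy as np
      from scipy.optimize import minimize

      # Hypothetical stream data: flowrates (t/h) and contaminant conc. (ppm).
      F = np.array([40.0, 30.0])
      C = np.array([100.0, 400.0])
      removal = 0.95          # fixed removal ratio of the treatment unit
      c_limit = 50.0          # discharge concentration limit (ppm)

      def treated_flow(f):
          # Objective: total flowrate routed through the treatment unit.
          return float(np.dot(F, f))

      def discharge_margin(f):
          # Mass balance at the final mixer; must be >= 0 (conc. below limit).
          mass_out = np.sum(F * C * (1.0 - f) + F * C * f * (1.0 - removal))
          return c_limit - mass_out / np.sum(F)

      # Decision variables: fraction of each stream sent to treatment.
      res = minimize(treated_flow, x0=[0.5, 0.5], method="SLSQP",
                     bounds=[(0.0, 1.0)] * 2,
                     constraints=[{"type": "ineq", "fun": discharge_margin}])

      print("Fraction of each stream treated:", np.round(res.x, 3))
      print(f"Total treated flowrate: {res.fun:.1f} t/h")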

  20. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy.

  2. The Trimeric Model: A New Model of Periodontal Treatment Planning

    OpenAIRE

    Azouni, Khalid G; Tarakji, Bassel

    2014-01-01

    Treatment of periodontal disease is a complex and multidisciplinary procedure, requiring periodontal, surgical, restorative, and orthodontic treatment modalities. Several authors have attempted to formulate models for periodontal treatment that order the treatment steps in a logical and easy-to-remember manner. In this article, we discuss two models of periodontal treatment planning from two of the most well-known textbooks in the specialty of periodontics internationally. Then modify them to arri...

  3. Evaluation of selective treatment of penetrating abdominal trauma.

    Science.gov (United States)

    Schmelzer, Thomas M; Mostafa, Gamal; Gunter, Oliver L; Norton, H James; Sing, Ronald F

    2008-01-01

    In penetrating abdominal trauma, diagnostic imaging and the application of selective clinical management may avoid negative celiotomy and improve outcome. We prospectively observed patients with penetrating abdominal trauma over 15 months and recorded demographics, presentation, imaging, surgical procedure, and outcome. Patients who underwent immediate laparotomy were compared with patients who were observed and/or had a computed tomography (CT) scan. Outcomes of negative versus positive and immediate versus delayed celiotomy were compared. Chi-square and Student t tests were used. A p value of less than 0.05 was considered significant. The setting was a level 1 trauma center, and participants were adult patients who presented with penetrating abdominal injury. In all, 100 consecutive patients (mean age, 32 years) were included (male:female, 91:9; gunshot wound:stab wound, 65:35). Overall, 60 immediate and 10 delayed laparotomies were performed; 30 patients did not undergo surgery. Predictors of immediate celiotomy were hypotension (p = 0.03), anteriorly located entrance wounds (p = 0.0005), and transaxial wounds (p = 0.03). Overall morbidity and mortality were 32% and 2%, respectively. The negative celiotomy rate was 25%. Patients with a positive celiotomy had higher morbidity (p = 0.006) and longer hospital length of stay (p = 0.003) compared with patients with a negative celiotomy. A CT scan was employed in 32% of patients, with 100% sensitivity and 94% specificity. Delayed celiotomy (10%) did not adversely impact morbidity (p = 0.70) and was 100% therapeutic, with no deaths. Nonselective immediate celiotomy for penetrating abdominal trauma results in a high rate of unnecessary surgery. Hemodynamically stable patients can safely be observed and/or have contrast CT scans and undergo delayed celiotomy, if indicated. This selective treatment had no adverse effect on patient outcomes and can potentially improve overall outcome.

  4. Cardinality constrained portfolio selection via factor models

    OpenAIRE

    Monge, Juan Francisco

    2017-01-01

    In this paper we propose and discuss different 0-1 linear models for solving the cardinality-constrained portfolio problem using factor models. Factor models, which are used to build portfolios that track indexes among other objectives, also require fewer parameters to estimate than the classical Markowitz model. The addition of cardinality constraints limits the number of securities in the portfolio. Restricting the number of securities in the portfolio allows us to o...

  5. Selection of patients for intra-arterial treatment for acute ischaemic stroke : Development and validation of a clinical decision tool in two randomised trials

    NARCIS (Netherlands)

    E. Venema (Esmee); M.J.H.L. Mulder (Maxim); B. Roozenbeek (Bob); J.P. Broderick (Joseph P.); S.D. Yeatts (Sharon D.); P. Khatri (Pooja); O.A. Berkhemer (Olvert); B.J. Emmer (Bart J.); Y.B.W.E.M. Roos (Yvo); C.B. Majoie (Charles); R.J. Van Oostenbrugge; W.H. van Zwam (Wim); A. van der Lugt (Aad); E.W. Steyerberg (Ewout); D.W.J. Dippel (Diederik); H.F. Lingsma (Hester)

    2017-01-01

    Objective: To improve the selection of patients with acute ischaemic stroke for intra-arterial treatment using a clinical decision tool to predict individual treatment benefit. Design: Multivariable regression modelling with data from two randomised controlled clinical

  6. Evidence accumulation as a model for lexical selection

    NARCIS (Netherlands)

    Anders, R.; Riès, S.; van Maanen, L.; Alario, F.-X.

    2015-01-01

    We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of

  7. The Optimal Selection for Restricted Linear Models with Average Estimator

    Directory of Open Access Journals (Sweden)

    Qichang Xie

    2014-01-01

    Full Text Available The essential task of risk investment is to select an optimal tracking portfolio among various portfolios. Statistically, this process can be achieved by choosing an optimal restricted linear model. This paper develops a statistical procedure to do this, based on selecting appropriate weights for averaging approximately restricted models. The method of weighted average least squares is adopted to estimate the approximately restricted models under dependent error setting. The optimal weights are selected by minimizing a k-class generalized information criterion (k-GIC, which is an estimate of the average squared error from the model average fit. This model selection procedure is shown to be asymptotically optimal in the sense of obtaining the lowest possible average squared error. Monte Carlo simulations illustrate that the suggested method has comparable efficiency to some alternative model selection techniques.

  8. [Indications for selective arterial embolization in the treatment of epistaxis].

    Science.gov (United States)

    Romagnoli, M; Marina, R; Sordo, L; Gaini, R M

    2000-10-01

    After posterior packing has failed, the treatment of choice for severe, recurrent posterior epistaxis is arterial ligature, usually through a transantral approach to the Internal Maxillary artery (LTA) or selective percutaneous embolization (EP). The advantages and disadvantages of each technique are discussed by various Authors. A critical review of the literature brings to light the discrepancies between the results of various studies: in a series by Strong et al. and in a review of the literature EP proved more effective than LTA (90-94% vs. 85-89%). On the contrary, using personal data Cullen and Tami reported that the results are analogous. As regards complications, these proved slightly more frequent, but minor, with LTA while the rare complications with EP were more serious. The per-patient costs fundamentally depend on the type of hospital management and the availability of a treatment center; the results of the various studies are not analogous in this regard. The specific indications for the choice of which technique to use include: LTA: ethmoid artery hemorrhage, severe arteriosclerosis of the carotid compartment and allergy to the contrast medium; EP: cardiovascular instability, severe anemia and all conditions which are contraindications for general anesthesia. In the cases studied by the Authors, of the total 203 patients admitted to hospital for posterior epistaxis between May 1995 and November 1999, 12 (5.9%; on the average 2.6 pt/yr) showed values lower than those found at other Centers. A total of 13 EP procedures were performed and the result was positive (stopping the hemorrhage) in 11 (91.7%). In one post-traumatic case there was a recurrence which could not be controlled by EP and thus the Authors resorted to surgical ligature. All the patients underwent fibroscopy after the posterior packing was removed and before establishing the indications for EP. A full 50% of the patients treated showed arterial hypertension and in all patients except

  9. Robustness and epistasis in mutation-selection models

    Science.gov (United States)

    Wolff, Andrea; Krug, Joachim

    2009-09-01

    We investigate the fitness advantage associated with the robustness of a phenotype against deleterious mutations using deterministic mutation-selection models of a quasispecies type equipped with a mesa-shaped fitness landscape. We obtain analytic results for the robustness effect which become exact in the limit of infinite sequence length. Thereby, we are able to clarify a seeming contradiction between recent rigorous work and an earlier heuristic treatment based on mapping to a Schrödinger equation. We exploit the quantum mechanical analogy to calculate a correction term for finite sequence lengths and verify our analytic results by numerical studies. In addition, we investigate the occurrence of an error threshold for a general class of epistatic landscapes and show that diminishing epistasis is a necessary but not sufficient condition for error threshold behaviour.

  10. Selection of Temporal Lags When Modeling Economic and Financial Processes.

    Science.gov (United States)

    Matilla-Garcia, Mariano; Ojeda, Rina B; Marin, Manuel Ruiz

    2016-10-01

    This paper suggests new nonparametric statistical tools and procedures for modeling linear and nonlinear univariate economic and financial processes. In particular, the tools presented help in selecting relevant lags in the model description of a general linear or nonlinear time series; that is, nonlinear models are not a restriction. The tests seem to be robust to the selection of free parameters. We also show that the test can be used as a diagnostic tool for well-defined models.
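
    The nonparametric tests proposed in the paper are not reproduced here, but the general task (deciding which lags of a univariate series to keep) can be illustrated with a simple, commonly used alternative: fitting autoregressions of increasing order on a common sample and choosing the order that minimises AIC (simulated AR(2) data, plain numpy).

      import numpy as np

      rng = np.random.default_rng(0)
      # Simulate an AR(2) process: y_t = 0.6 y_{t-1} - 0.3 y_{t-2} + e_t
      n = 500
      y = np.zeros(n)
      for t in range(2, n):
          y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

      def aic_for_order(y, p, max_p):
          # Fit AR(p) by least squares on a common sample so AICs are comparable.
          Y = y[max_p:]
          if p == 0:
              X = np.ones((len(Y), 1))
          else:
              lags = np.column_stack([y[max_p - k:-k] for k in range(1, p + 1)])
              X = np.column_stack([np.ones(len(Y)), lags])
          beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
          sigma2 = np.mean((Y - X @ beta) ** 2)
          return len(Y) * np.log(sigma2) + 2 * (p + 1)

      max_p = 8
      aics = {p: aic_for_order(y, p, max_p) for p in range(max_p + 1)}
      best = min(aics, key=aics.get)
      print("AIC by lag order:", {p: round(a, 1) for p, a in aics.items()})
      print("Selected lag order:", best)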

  11. HIV models for treatment interruption: Adaptation and comparison

    Science.gov (United States)

    Hillmann, Andreas; Crane, Martin; Ruskin, Heather J.

    2017-10-01

    In recent years, Antiretroviral Therapy (ART) has become commonplace for treating HIV infections, although a cure remains elusive, given reservoirs of replicating latently-infected cells, which are resistant to normal treatment regimes. Treatment interruptions, whether ad hoc or structured, are known to cause a rapid increase in viral production to detectable levels, but numerous clinical trials remain inconclusive on the dangers inherent in this resurgence. In consequence, interest in examining interruption strategies has recently been rekindled. This overview considers modelling approaches, which have been used to explore the issue of treatment interruption. We highlight their purpose and the formalisms employed and examine ways in which clinical data have been used. Implementation of selected models is demonstrated, illustrative examples provided and model performance compared for these cases. Possible extensions to bottom-up modelling techniques for treatment interruptions are briefly discussed.
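
    A minimal sketch of the kind of dynamical model typically adapted for this purpose is the standard target-cell/virus ODE system with a drug-efficacy term that is switched off during an interruption window; the parameter values and the interruption schedule below are illustrative only and are not taken from the paper or from clinical data.

      import numpy as np
      from scipy.integrate import odeint

      # Standard target-cell-limited model: T (uninfected cells),
      # I (infected cells), V (free virus). ART reduces infectivity by eps(t).
      lam, d, beta, delta, prod, c = 1e4, 0.01, 2.4e-8, 1.0, 3e3, 23.0

      def eps(t):
          # Drug efficacy: 0.9 on treatment, zero during the interruption
          # window (illustratively, days 100-160).
          return 0.0 if 100.0 <= t <= 160.0 else 0.9

      def rhs(y, t):
          T, I, V = y
          dT = lam - d * T - (1 - eps(t)) * beta * T * V
          dI = (1 - eps(t)) * beta * T * V - delta * I
          dV = prod * I - c * V
          return [dT, dI, dV]

      t = np.linspace(0.0, 250.0, 2501)
      y0 = [1e6, 1e4, 1e5]                 # illustrative initial state
      sol = odeint(rhs, y0, t)

      peak_rebound = sol[(t >= 100) & (t <= 200), 2].max()
      print(f"Peak viral load during/after interruption: {peak_rebound:.2e}")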

  12. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that embedding a theory model that specifies the correct set of m relevant exogenous variables, x_t, within the larger set of m+k candidate variables, (x_t, w_t), then selection over the second

  13. Astrophysical Model Selection in Gravitational Wave Astronomy

    Science.gov (United States)

    Adams, Matthew R.; Cornish, Neil J.; Littenberg, Tyson B.

    2012-01-01

    Theoretical studies in gravitational wave astronomy have mostly focused on the information that can be extracted from individual detections, such as the mass of a binary system and its location in space. Here we consider how the information from multiple detections can be used to constrain astrophysical population models. This seemingly simple problem is made challenging by the high dimensionality and high degree of correlation in the parameter spaces that describe the signals, and by the complexity of the astrophysical models, which can also depend on a large number of parameters, some of which might not be directly constrained by the observations. We present a method for constraining population models using a hierarchical Bayesian modeling approach which simultaneously infers the source parameters and population model and provides the joint probability distributions for both. We illustrate this approach by considering the constraints that can be placed on population models for galactic white dwarf binaries using a future space-based gravitational wave detector. We find that a mission that is able to resolve approximately 5000 of the shortest period binaries will be able to constrain the population model parameters, including the chirp mass distribution and a characteristic galaxy disk radius to within a few percent. This compares favorably to existing bounds, where electromagnetic observations of stars in the galaxy constrain disk radii to within 20%.

  14. On Optimal Input Design and Model Selection for Communication Channels

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yanyan [ORNL; Djouadi, Seddik M [ORNL; Olama, Mohammed M [ORNL

    2013-01-01

    In this paper, the optimal model (structure) selection and input design which minimize the worst-case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. The Kolmogorov n-width is used to characterize the representation error introduced by model selection, while the Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst-case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely popular in communication systems, such as in Orthogonal Frequency Division Multiplexing (OFDM) systems.

  15. Model and Variable Selection Procedures for Semiparametric Time Series Regression

    Directory of Open Access Journals (Sweden)

    Risa Kato

    2009-01-01

    Full Text Available Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedures is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.
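
    The penalized estimation idea summarized above can be illustrated with a generic sketch: expand each covariate in a spline basis and let an l1 penalty pick out variables and basis functions at once. This is a hedged illustration in Python with scikit-learn, not the authors' estimator or information criteria; the data, basis sizes and tuning values are hypothetical.

        # A minimal sketch (not the paper's procedure): penalized least squares over
        # spline basis expansions, so variables and basis functions are selected
        # jointly by an l1 penalty. All data and settings are hypothetical.
        import numpy as np
        from sklearn.linear_model import LassoCV
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import SplineTransformer

        rng = np.random.default_rng(0)
        n, p = 300, 6
        X = rng.normal(size=(n, p))
        # Toy series: only x0 (nonlinearly) and x1 (linearly) matter.
        y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)

        model = make_pipeline(
            SplineTransformer(degree=3, n_knots=6),  # basis functions per covariate
            LassoCV(cv=5),                           # l1 penalty selects basis terms
        )
        model.fit(X, y)

        coefs = model.named_steps["lassocv"].coef_.reshape(p, -1)
        active = np.where(np.abs(coefs).sum(axis=1) > 1e-8)[0]
        print("covariates retained:", active)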

  16. Using multilevel models to quantify heterogeneity in resource selection

    Science.gov (United States)

    Wagner, T.; Diefenbach, D.R.; Christensen, S.A.; Norton, A.S.

    2011-01-01

    Models of resource selection are being used increasingly to predict or model the effects of management actions rather than simply quantifying habitat selection. Multilevel, or hierarchical, models are an increasingly popular method to analyze animal resource selection because they impose a relatively weak stochastic constraint to model heterogeneity in habitat use and also account for unequal sample sizes among individuals. However, few studies have used multilevel models to model coefficients as a function of predictors that may influence habitat use at different scales or to quantify differences in resource selection among groups. We used an example with white-tailed deer (Odocoileus virginianus) to illustrate how to model resource use as a function of distance to road that varies among deer by road density at the home range scale. We found that deer avoidance of roads decreased as road density increased. Also, we used multilevel models with sika deer (Cervus nippon) and white-tailed deer to examine whether resource selection differed between species. We failed to detect differences in resource use between these two species and showed how information-theoretic and graphical measures can be used to assess how resource use may have differed. Multilevel models can improve our understanding of how resource selection varies among individuals and provide an objective, quantifiable approach to assess differences or changes in resource selection. © The Wildlife Society, 2011.

  17. A selection model for accounting for publication bias in a full network meta-analysis.

    Science.gov (United States)

    Mavridis, Dimitris; Welton, Nicky J; Sutton, Alex; Salanti, Georgia

    2014-12-30

    Copas and Shi suggested a selection model to explore the potential impact of publication bias via sensitivity analysis based on assumptions for the probability of publication of trials conditional on the precision of their results. Chootrakool et al. extended this model to three-arm trials but did not fully account for the implications of the consistency assumption, and their model is difficult to generalize for complex network structures with more than three treatments. Fitting these selection models within a frequentist setting requires maximization of a complex likelihood function, and identification problems are common. We have previously presented a Bayesian implementation of the selection model when multiple treatments are compared with a common reference treatment. We now present a general model suitable for complex, full network meta-analysis that accounts for consistency when adjusting results for publication bias. We developed a design-by-treatment selection model to describe the mechanism by which studies with different designs (sets of treatments compared in a trial) and precision may be selected for publication. We fit the model in a Bayesian setting because it avoids the numerical problems encountered in the frequentist setting, it is generalizable with respect to the number of treatments and study arms, and it provides a flexible framework for sensitivity analysis using external knowledge. Our model accounts for the additional uncertainty arising from publication bias more successfully compared to the standard Copas model or its previous extensions. We illustrate the methodology using a published triangular network for the failure of vascular graft or arterial patency.

  18. Python Program to Select HII Region Models

    Science.gov (United States)

    Miller, Clare; Lamarche, Cody; Vishwas, Amit; Stacey, Gordon J.

    2016-01-01

    HII regions are areas of singly ionized hydrogen formed by the ionizing radiation of upper main sequence stars. The infrared fine-structure line emissions, particularly of oxygen, nitrogen, and neon, can give important information about HII regions, including gas temperature and density, elemental abundances, and the effective temperature of the stars that form them. The processes involved in calculating this information from observational data are complex. Models, such as those provided in Rubin 1984 and those produced by Cloudy (Ferland et al., 2013), enable one to extract physical parameters from observational data. However, the multitude of search parameters can make sifting through models tedious. I digitized Rubin's models and wrote a Python program that is able to take observed line ratios and their uncertainties and find the Rubin or Cloudy model that best matches the observational data. By creating a Python script that is user-friendly and able to quickly sort through models with a high level of accuracy, this work increases efficiency and reduces human error in matching HII region models to observational data.
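
    The matching step described in the abstract can be sketched in a few lines: pick the grid model whose predicted line ratios minimize chi-squared against the observed ratios and their uncertainties. The grid, observations and uncertainties below are hypothetical placeholders, not the digitized Rubin or Cloudy models.

        # A minimal sketch of chi-squared matching over a model grid (hypothetical data).
        import numpy as np

        model_grid = np.array([       # rows: models; columns: predicted line ratios
            [2.1, 0.8, 1.5],
            [1.7, 1.1, 1.2],
            [2.4, 0.6, 1.9],
        ])
        observed = np.array([1.8, 1.0, 1.3])
        sigma = np.array([0.2, 0.1, 0.2])     # observational uncertainties

        chi2 = (((model_grid - observed) / sigma) ** 2).sum(axis=1)
        best = int(np.argmin(chi2))
        print(f"best-matching model index: {best}, chi-squared: {chi2[best]:.2f}")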

  19. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

  20. Anaerobic/aerobic treatment of selected azo dyes in wastewater

    Energy Technology Data Exchange (ETDEWEB)

    Seshadri, S.; Bishop, P.L. (Univ. of Cincinnati, OH (United States). Dept. of Civil and Environmental Engineering); Agha, A.M. (Univ. of Aleppo (Syrian Arab Republic). Faculty of Civil Engineering)

    1994-01-01

    Azo dyes represent the largest class of dyes in use today. Current environmental concern with these dyes revolves around the potential carcinogenic health risk presented by these dyes or their intermediate biodegradation products when exposed to microflora in the human digestive tract. These dyes may build up in the environment, since many wastewater treatment plants allow them to pass through the system virtually untreated. The initial step in the degradation of these dyes is the cleavage of the azo bond. This cleavage is often impossible under aerobic conditions, but has been readily demonstrated under anaerobic conditions. The focus of the study was to determine the feasibility of using an anaerobic fluidized-bed reactor to accomplish this cleavage. The effects of typical process variables such as hydraulic retention time (HRT), influent dye concentration levels, and degree of bed fluidization on removal efficiencies were also studied. The four dyes selected for this study were Acid Orange 7, Acid Orange 8, Acid Orange 10, and Acid Red 14. The effectiveness of using a bench-scale activated sludge reactor as a sequenced second stage was also examined. Results indicate that nearly complete cleavage of the azo bond is easily accomplished for each of the four dyes at hydraulic retention times of either 12 or 24 h. Initial results indicate, though, that aromatic amine by-products remain. The sequenced second stage was able to remove the remaining Chemical Oxygen Demand (COD) load to acceptable levels. Work is presently underway to determine the fate of the anaerobic by-products in the aerobic second stage.

  1. Bayesian Model Selection for LISA Pathfinder

    CERN Document Server

    Karnesis, Nikolaos; Sopuerta, Carlos F; Gibert, Ferran; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Ferraioli, Luigi; Hewitson, Martin; Hueller, Mauro; Korsakova, Natalia; Plagnol, Eric; Vitale, Stefano

    2013-01-01

    The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the LISA/eLISA concept. The Data Analysis (DA) team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on-board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the DA team is to identify the physical effects that contribute significantly to the properties of the instrument noise. One way of approaching this problem is to recover the essential parameters of the LTP that describe the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes factor between two competing models. In our analysis, we use three different methods to estimate...

  2. Model selection in kernel ridge regression

    DEFF Research Database (Denmark)

    Exterkate, Peter

    2013-01-01

    Kernel ridge regression is a technique to perform ridge regression with a potentially infinite number of nonlinear transformations of the independent variables as regressors. This method is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts....... The influence of the choice of kernel and the setting of tuning parameters on forecast accuracy is investigated. Several popular kernels are reviewed, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. The latter two kernels are interpreted in terms of their smoothing properties......, and the tuning parameters associated to all these kernels are related to smoothness measures of the prediction function and to the signal-to-noise ratio. Based on these interpretations, guidelines are provided for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study...
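
    The tuning guideline described above (select kernel parameters from a small grid by cross-validation) can be sketched as follows, assuming a Gaussian (RBF) kernel; the simulated data and grid values are hypothetical and this is not the author's code.

        # A minimal sketch: kernel ridge regression with the ridge penalty and kernel
        # width chosen from a small grid by 5-fold cross-validation.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.model_selection import GridSearchCV

        rng = np.random.default_rng(1)
        X = rng.uniform(-3, 3, size=(200, 1))
        y = np.sinc(X).ravel() + 0.1 * rng.normal(size=200)

        grid = {
            "alpha": [1e-3, 1e-2, 1e-1, 1.0],   # ridge penalty (noise-to-signal level)
            "gamma": [0.1, 0.3, 1.0, 3.0],      # inverse squared kernel width
        }
        search = GridSearchCV(KernelRidge(kernel="rbf"), grid, cv=5)
        search.fit(X, y)
        print("selected tuning parameters:", search.best_params_)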

  3. Model Selection in Kernel Ridge Regression

    DEFF Research Database (Denmark)

    Exterkate, Peter

    Kernel ridge regression is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. This paper investigates the influence of the choice of kernel and the setting of tuning parameters on forecast accuracy. We review several popular kernels......, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. We interpret the latter two kernels in terms of their smoothing properties, and we relate the tuning parameters associated to all these kernels to smoothness measures of the prediction function and to the signal-to-noise ratio. Based...... on these interpretations, we provide guidelines for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study confirms the practical usefulness of these rules of thumb. Finally, the flexible and smooth functional forms provided by the Gaussian and Sinc kernels makes them widely...

  4. Development of SPAWM: selection program for available watershed models.

    Science.gov (United States)

    Cho, Yongdeok; Roesner, Larry A

    2014-01-01

    A selection program for available watershed models (also known as SPAWM) was developed. Thirty-three commonly used watershed models were analyzed in depth and classified in accordance to their attributes. These attributes consist of: (1) land use; (2) event or continuous; (3) time steps; (4) water quality; (5) distributed or lumped; (6) subsurface; (7) overland sediment; and (8) best management practices. Each of these attributes was further classified into sub-attributes. Based on user selected sub-attributes, the most appropriate watershed model is selected from the library of watershed models. SPAWM is implemented using Excel Visual Basic and is designed for use by novices as well as by experts on watershed modeling. It ensures that the necessary sub-attributes required by the user are captured and made available in the selected watershed model.
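
    The attribute-matching idea behind SPAWM can be sketched briefly; the real tool is an Excel Visual Basic program covering 33 models, so the library entries and attributes below are purely illustrative.

        # A minimal sketch of attribute-based model selection (hypothetical library).
        library = {
            "Model_A": {"land_use": "urban", "simulation": "continuous",
                        "water_quality": True, "spatial": "lumped"},
            "Model_B": {"land_use": "agricultural", "simulation": "event",
                        "water_quality": False, "spatial": "distributed"},
        }
        user_needs = {"land_use": "urban", "simulation": "continuous",
                      "water_quality": True}

        def matching_models(library, needs):
            """Return models whose attributes satisfy every user-selected sub-attribute."""
            return [name for name, attrs in library.items()
                    if all(attrs.get(k) == v for k, v in needs.items())]

        print(matching_models(library, user_needs))   # -> ['Model_A']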

  5. Parametric or nonparametric? A parametricness index for model selection

    CERN Document Server

    Liu, Wei; 10.1214/11-AOS899

    2012-01-01

    In the model selection literature, two classes of criteria perform well asymptotically in different situations: the Bayesian information criterion (BIC) (as a representative) is consistent in selection when the true model is finite dimensional (parametric scenario); Akaike's information criterion (AIC) achieves asymptotic efficiency when the true model is infinite dimensional (nonparametric scenario). But there is little work that addresses whether and how it is possible to detect which situation a specific model selection problem is in. In this work, we differentiate the two scenarios theoretically under some conditions. We develop a measure, the parametricness index (PI), to assess whether a model selected by a potentially consistent procedure can be practically treated as the true model, which also hints at whether AIC or BIC is better suited to the data for the goal of estimating the regression function. A consequence is that by switching between AIC and BIC based on the PI, the resulting regression estimator is si...

  6. Boosting model performance and interpretation by entangling preprocessing selection and variable selection.

    Science.gov (United States)

    Gerretzen, Jan; Szymańska, Ewa; Bart, Jacob; Davies, Antony N; van Manen, Henk-Jan; van den Heuvel, Edwin R; Jansen, Jeroen J; Buydens, Lutgarde M C

    2016-09-28

    The aim of data preprocessing is to remove data artifacts, such as a baseline, scatter effects or noise, and to enhance the contextually relevant information. Many preprocessing methods exist to deliver one or more of these benefits, but it is difficult to select which method or combination of methods should be used for the specific data being analyzed. Recently, we have shown that a preprocessing selection approach based on Design of Experiments (DoE) enables correct selection of highly appropriate preprocessing strategies within reasonable time frames. In that approach, the focus was solely on improving the predictive performance of the chemometric model. This is, however, only one of the two relevant criteria in modeling: interpretation of the model results can be just as important. Variable selection is often used to achieve such interpretation. Data artifacts, however, may hamper proper variable selection by masking the true relevant variables. The choice of preprocessing therefore has a huge impact on the outcome of variable selection methods and may thus hamper an objective interpretation of the final model. To enhance such objective interpretation, we here integrate variable selection into the preprocessing selection approach that is based on DoE. We show that the entanglement of preprocessing selection and variable selection not only improves the interpretation, but also the predictive performance of the model. This is achieved by analyzing several experimental data sets of which the true relevant variables are available as prior knowledge. We show that a selection of variables is provided that complies more with the true informative variables compared to individual optimization of both model aspects. Importantly, the approach presented in this work is generic. Different types of models (e.g. PCR, PLS, …) can be incorporated into it, as well as different variable selection methods and different preprocessing methods, according to the taste and experience of

  7. Mathematical modeling of endovenous laser treatment (ELT)

    Directory of Open Access Journals (Sweden)

    Wassmer Benjamin

    2006-04-01

    Full Text Available Abstract Background and objectives Endovenous laser treatment (ELT) has recently been proposed as an alternative in the treatment of reflux of the Great Saphenous Vein (GSV) and Small Saphenous Vein (SSV). Successful ELT depends on the selection of optimal parameters required to achieve optimal vein damage while avoiding side effects. Mathematical modeling of ELT could provide a better understanding of the ELT process and could determine the optimal dosage as a function of vein diameter. Study design/materials and methods The model is based on calculations describing the light distribution using the diffusion approximation of the transport theory, the temperature rise using the bioheat equation and the laser-induced injury using the Arrhenius damage model. The geometry to simulate ELT was based on a 2D model consisting of a cylindrically symmetric blood vessel including a vessel wall and surrounded by an infinite homogeneous tissue. The mathematical model was implemented using the Macsyma-Pdease2D software (Macsyma Inc., Arlington, MA, USA). Damage to the vein wall for CW and single-shot energy was calculated for 3 and 5 mm vein diameters. In pulsed mode, the pullback distance (3, 5 and 7 mm) was considered. For CW mode simulation, the pullback speed (1, 2, 3 mm/s) was the variable. The total dose was expressed as joules per centimeter in order to perform comparison to results already reported in clinical studies. Results In pulsed mode, for a 3 mm vein diameter, irrespective of the pullback distance (2, 5 or 7 mm), a minimum fluence of 15 J/cm is required to obtain permanent damage of the intima. For a 5 mm vein diameter, 50 J/cm (15 W-2 s) is required. In continuous mode, for a 3 mm and 5 mm vein diameter, respectively 65 J/cm and 100 J/cm are required to obtain permanent damage of the vessel wall. Finally, the use of different wavelengths (810 nm or 980 nm) had only a minor influence on these results. Discussion and conclusion The parameters
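
    Of the three model components listed above, the Arrhenius damage step is the simplest to illustrate: the damage index is the time integral of A*exp(-Ea/(R*T(t))). The sketch below uses a toy temperature history and placeholder rate constants; it is not the paper's implementation and none of the numbers are taken from it.

        # A minimal sketch of the Arrhenius damage integral (all values are placeholders).
        import numpy as np

        R = 8.314          # J/(mol K), universal gas constant
        A = 7.39e39        # 1/s, frequency factor (placeholder)
        Ea = 2.577e5       # J/mol, activation energy (placeholder)

        t = np.linspace(0.0, 2.0, 2001)                        # s
        T = 310.0 + 40.0 * np.exp(-((t - 1.0) ** 2) / 0.1)     # K, toy temperature pulse

        dt = t[1] - t[0]
        omega = float(np.sum(A * np.exp(-Ea / (R * T))) * dt)  # damage index
        print(f"Arrhenius damage index Omega = {omega:.2f} (Omega >= 1 ~ irreversible)")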

  8. Quantile hydrologic model selection and model structure deficiency assessment: 2. Applications

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    Quantile hydrologic model selection and structure deficiency assessment is applied in three case studies. The performance of quantile model selection problem is rigorously evaluated using a model structure on the French Broad river basin data set. The case study shows that quantile model selection

  9. Hybrid Sludge Modeling in Water Treatment Processes

    OpenAIRE

    Brenda, Marian

    2015-01-01

    Sludge occurs in many waste water and drinking water treatment processes. The numeric modeling of sludge is therefore crucial for developing and optimizing water treatment processes. Numeric single-phase sludge models mainly include settling and viscoplastic behavior. Even though many investigators emphasize the importance of modeling the rheology of sludge for good simulation results, it is difficult to measure, because of settling and the viscoplastic behavior. In this thesis, a new method ...

  10. The genealogy of samples in models with selection.

    Science.gov (United States)

    Neuhauser, C; Krone, S M

    1997-02-01

    We introduce the genealogy of a random sample of genes taken from a large haploid population that evolves according to random reproduction with selection and mutation. Without selection, the genealogy is described by Kingman's well-known coalescent process. In the selective case, the genealogy of the sample is embedded in a graph with a coalescing and branching structure. We describe this graph, called the ancestral selection graph, and point out differences and similarities with Kingman's coalescent. We present simulations for a two-allele model with symmetric mutation in which one of the alleles has a selective advantage over the other. We find that when the allele frequencies in the population are already in equilibrium, then the genealogy does not differ much from the neutral case. This is supported by rigorous results. Furthermore, we describe the ancestral selection graph for other selective models with finitely many selection classes, such as the K-allele models, infinitely-many-alleles models, DNA sequence models, and infinitely-many-sites models, and briefly discuss the diploid case.

  11. Adapting AIC to conditional model selection

    NARCIS (Netherlands)

    M. van Ommen (Matthijs)

    2012-01-01

    In statistical settings such as regression and time series, we can condition on observed information when predicting the data of interest. For example, a regression model explains the dependent variables $y_1, \ldots, y_n$ in terms of the independent variables $x_1, \ldots, x_n$.

  12. Random effect selection in generalised linear models

    DEFF Research Database (Denmark)

    Denwood, Matt; Houe, Hans; Forkman, Björn;

    We analysed abattoir recordings of meat inspection codes with possible relevance to onfarm animal welfare in cattle. Random effects logistic regression models were used to describe individual-level data obtained from 461,406 cattle slaughtered in Denmark. Our results demonstrate that the largest...

  13. A Decision Model for Selecting Participants in Supply Chain

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In order to satisfy the rapidly changing requirements of customers, enterprises must cooperate with each other to form a supply chain. The first and most important stage in forming a supply chain is the selection of participants. The article proposes a two-stage decision model to select partners. The first stage is an inter-company comparison in each business process to select high-efficiency candidates based on internal variables. The next stage is to analyse the combinations of different candidates in order to select the most suitable partners according to a goal-programming model.

  14. Modeling treatment couches in the Pinnacle treatment planning system: Especially important for arc therapy

    Energy Technology Data Exchange (ETDEWEB)

    Duggar, William Neil, E-mail: wduggar@umc.edu [Department of Radiation Oncology, University of Mississippi Medical Center, Jackson, MS (United States); Nguyen, Alex [Ironwood Cancer and Research Center, Chandler, AZ (United States); Stanford, Jason; Morris, Bart; Yang, Claus C. [Department of Radiation Oncology, University of Mississippi Medical Center, Jackson, MS (United States)

    2016-04-01

    This study is to demonstrate the importance and a method of properly modeling the treatment couch for dose calculation in patient treatment using arc therapy. The two treatment couch tops (Aktina AK550 and Elekta iBEAM evo) of Elekta LINACs were scanned using a Philips Brilliance Big Bore CT Simulator. Various parts of the couch tops were contoured, and their densities were measured and recorded on the Pinnacle treatment planning system (TPS) using the established computed tomography density table. These contours were saved as organ models to be placed beneath the patient during planning. Relative attenuation measurements were performed following procedures outlined by TG-176, as well as absolute dose comparison of static 10 × 10 cm² fields that were delivered through the couch tops with that calculated in the TPS with the couch models. A total of 10 random arc therapy treatment plans (5 volumetric-modulated arc therapy [VMAT] and 5 stereotactic body radiation therapy [SBRT]), using 24 beams, were selected for this study. All selected plans were calculated with and without couch modeling. Each beam was evaluated using the Delta⁴ dosimetry system (Delta⁴). The Student t-test was used to determine statistical significance. Independent reviews were exploited as per the Imaging and Radiation Oncology Core head and neck credentialing phantom. The selected plans were calculated on the actual patient anatomies with and without couch modeling to determine potential clinical effects. Large relative beam attenuations were noted depending on which part of the couch top the beams were passing through. Substantial improvements were also noted for static fields, both calculated with the TPS and delivered physically, when the couch models were included in the calculation. A statistically significant increase in agreement was noted for dose difference, distance to agreement, and γ-analysis with the Delta⁴ on VMAT and SBRT plans. A credentialing review showed

  15. Ecotoxicological and chemical characterization of selected treatment process effluents of municipal sewage treatment plant.

    Science.gov (United States)

    Wang, Chunxia; Wang, Yi; Kiefer, F; Yediler, A; Wang, Zijian; Kettrup, A

    2003-10-01

    The triolein-containing semipermeable membrane devices (SPMDs) were deployed for 4 weeks in a sewage treatment plant in Beijing, China, to sample and concentrate priority hydrophobic organic pollutants in a sewage treatment process. The chemical analyses and ecotoxicities of the residuals of the SPMD dialysate were examined. The data from chemical analyses by gas chromatography-mass spectrometry in selected ion monitoring mode indicated that the lower removal of polychlorinated biphenyl (PCB) congeners and polycyclic aromatic hydrocarbons (PAHs) coincided with their persistence in the environment. The acute toxicity examined by a bioluminescence test with Vibrio fischeri revealed only an approximately 20% decrease in the overall toxicity of the influent after the activated sludge treatment process. The ethoxyresorufin-O-deethylase (EROD) induction with a micro-EROD assay in vitro using H4-IIE rat hepatoma cell cultures demonstrated the presence of persistent organics in the influent and the subsequent effluents. The results obtained suggested that integration of the SPMD technique with chemical analyses and bioassays might be a valuable approach for the risk assessment of hydrophobic organic pollutants in water ecosystems. It revealed the necessity of monitoring organic pollutants and examining the ecotoxicities of sewage treatment plants.

  16. Selective approach in the treatment of esophageal perforations

    NARCIS (Netherlands)

    Amir, AI; von Dullemen, H; Plukker, JTM

    2004-01-01

    Background: Treatment of esophageal perforation remains controversial and recommendations vary from initially non-operative to aggressive surgical management. Several factors are responsible for this life-threatening event, which has led to more individualized treatment ensuring adequate pleuromedia

  17. Model selection in systems biology depends on experimental design.

    Science.gov (United States)

    Silk, Daniel; Kirk, Paul D W; Barnes, Chris P; Toni, Tina; Stumpf, Michael P H

    2014-06-01

    Experimental design attempts to maximise the information available for modelling tasks. An optimal experiment allows the inferred models or parameters to be chosen with the highest expected degree of confidence. If the true system is faithfully reproduced by one of the models, the merit of this approach is clear - we simply wish to identify it and the true parameters with the most certainty. However, in the more realistic situation where all models are incorrect or incomplete, the interpretation of model selection outcomes and the role of experimental design needs to be examined more carefully. Using a novel experimental design and model selection framework for stochastic state-space models, we perform high-throughput in-silico analyses on families of gene regulatory cascade models, to show that the selected model can depend on the experiment performed. We observe that experimental design thus makes confidence a criterion for model choice, but that this does not necessarily correlate with a model's predictive power or correctness. Finally, in the special case of linear ordinary differential equation (ODE) models, we explore how wrong a model has to be before it influences the conclusions of a model selection analysis.

  18. Modeling HIV-1 drug resistance as episodic directional selection.

    Directory of Open Access Journals (Sweden)

    Ben Murrell

    Full Text Available The evolution of substitutions conferring drug resistance to HIV-1 is both episodic, occurring when patients are on antiretroviral therapy, and strongly directional, with site-specific resistant residues increasing in frequency over time. While methods exist to detect episodic diversifying selection and continuous directional selection, no evolutionary model combining these two properties has been proposed. We present two models of episodic directional selection (MEDS and EDEPS) which allow the a priori specification of lineages expected to have undergone directional selection. The models infer the sites and target residues that were likely subject to directional selection, using either codon or protein sequences. Compared to its null model of episodic diversifying selection, MEDS provides a superior fit to most sites known to be involved in drug resistance, and neither a test for episodic diversifying selection nor one for constant directional selection is able to detect as many true positives as MEDS and EDEPS while maintaining acceptable levels of false positives. This suggests that episodic directional selection is a better description of the process driving the evolution of drug resistance.

  19. Asset pricing model selection: Indonesian Stock Exchange

    OpenAIRE

    Pasaribu, Rowland Bismark Fernando

    2010-01-01

    The Capital Asset Pricing Model (CAPM) has dominated finance theory for over thirty years; it suggests that the market beta alone is sufficient to explain stock returns. However evidence shows that the cross-section of stock returns cannot be described solely by the one-factor CAPM. Therefore, the idea is to add other factors in order to complete the beta in explaining the price movements in the stock exchange. The Arbitrage Pricing Theory (APT) has been proposed as the first multifactor succ...

  20. A mixed model reduction method for preserving selected physical information

    Science.gov (United States)

    Zhang, Jing; Zheng, Gangtie

    2017-03-01

    A new model reduction method in the frequency domain is presented. By mixedly using the model reduction techniques from both the time domain and the frequency domain, the dynamic model is condensed to selected physical coordinates, and the contribution of slave degrees of freedom is taken as a modification to the model in the form of effective modal mass of virtually constrained modes. The reduced model can preserve the physical information related to the selected physical coordinates such as physical parameters and physical space positions of corresponding structure components. For the cases of non-classical damping, the method is extended to the model reduction in the state space but still only contains the selected physical coordinates. Numerical results are presented to validate the method and show the effectiveness of the model reduction.

  1. Two-step variable selection in quantile regression models

    Directory of Open Access Journals (Sweden)

    FAN Yali

    2015-06-01

    Full Text Available We propose a two-step variable selection procedure for high-dimensional quantile regressions, in which the dimension of the covariates, pn, is much larger than the sample size n. In the first step, we apply an l1 penalty, and we demonstrate that the first-step penalized estimator with the LASSO penalty can reduce the model from ultra-high dimensional to a model whose size has the same order as that of the true model, and the selected model can cover the true model. The second step excludes the remaining irrelevant covariates by applying the adaptive LASSO penalty to the reduced model obtained from the first step. Under some regularity conditions, we show that our procedure enjoys model selection consistency. We conduct a simulation study and a real data analysis to evaluate the finite sample performance of the proposed approach.
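
    A hedged sketch of the two-step idea: step 1 screens covariates with an l1-penalized median regression, step 2 refits the reduced model with adaptive (coefficient-dependent) weights implemented here by rescaling columns. This is an illustration, not the authors' exact estimator or tuning; all data and penalty values are hypothetical.

        # Two-step variable selection for a median (0.5-quantile) regression.
        import numpy as np
        from sklearn.linear_model import QuantileRegressor

        rng = np.random.default_rng(2)
        n, p = 200, 50
        X = rng.normal(size=(n, p))
        y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.standard_t(df=3, size=n)

        # Step 1: LASSO-penalized quantile regression reduces the model.
        step1 = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs").fit(X, y)
        kept = np.flatnonzero(np.abs(step1.coef_) > 1e-6)

        # Step 2: adaptive LASSO on the reduced model (weights = 1/|initial coefficient|).
        w = 1.0 / np.abs(step1.coef_[kept])
        step2 = QuantileRegressor(quantile=0.5, alpha=0.05, solver="highs")
        step2.fit(X[:, kept] / w, y)          # column rescaling mimics adaptive weights
        final_coef = step2.coef_ / w
        print("selected covariates:", kept[np.abs(final_coef) > 1e-6])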

  2. Selection of probability based weighting models for Boolean retrieval system

    Energy Technology Data Exchange (ETDEWEB)

    Ebinuma, Y. (Japan Atomic Energy Research Inst., Tokai, Ibaraki. Tokai Research Establishment)

    1981-09-01

    Automatic weighting models based on probability theory were studied to determine whether they can be applied to Boolean search logic including logical sums. The INIS database was used for searching with one particular search formula. Among sixteen models, three with good ranking performance were selected. These three models were further applied to searching with nine search formulas in the same database. It was found that two of the models show slightly better average ranking performance, while the other, the simplest one, also seems practical.

  3. Model Selection Through Sparse Maximum Likelihood Estimation

    CERN Document Server

    Banerjee, Onureena; D'Aspremont, Alexandre

    2007-01-01

    We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added l_1-norm penalty term. The problem as formulated is convex but the memory requirements and complexity of existing interior point methods are prohibitive for problems with more than tens of nodes. We present two new algorithms for solving problems with at least a thousand nodes in the Gaussian case. Our first algorithm uses block coordinate descent, and can be interpreted as recursive l_1-norm penalized regression. Our second algorithm, based on Nesterov's first order method, yields a complexity estimate with a better dependence on problem size than existing interior point methods. Using a log determinant relaxation of the log partition function (Wainwright & Jordan (2006)), we show that these same algorithms can be used to solve an approximate sparse maximum likelihood problem for...
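
    In the Gaussian case, the problem described above is l1-penalized maximum likelihood estimation of a sparse inverse covariance matrix. The sketch below solves a toy instance with scikit-learn's GraphicalLassoCV (a coordinate-descent solver), not with the block coordinate-descent or Nesterov-type algorithms proposed in the paper; the data are hypothetical.

        # A minimal sketch: sparse inverse covariance by l1-penalized Gaussian MLE.
        import numpy as np
        from sklearn.covariance import GraphicalLassoCV

        rng = np.random.default_rng(3)
        base = rng.normal(size=(500, 4))
        # Toy data: variables (0, 1) and (2, 3) form dependent pairs.
        X = np.column_stack([base[:, 0], base[:, 0] + 0.3 * base[:, 1],
                             base[:, 2], base[:, 2] + 0.3 * base[:, 3]])

        model = GraphicalLassoCV().fit(X)
        print(np.round(model.precision_, 2))   # zeros indicate conditional independence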

  4. Sensitivity of resource selection and connectivity models to landscape definition

    Science.gov (United States)

    Katherine A. Zeller; Kevin McGarigal; Samuel A. Cushman; Paul Beier; T. Winston Vickers; Walter M. Boyce

    2017-01-01

    Context: The definition of the geospatial landscape is the underlying basis for species-habitat models, yet sensitivity of habitat use inference, predicted probability surfaces, and connectivity models to landscape definition has received little attention. Objectives: We evaluated the sensitivity of resource selection and connectivity models to four landscape...

  5. A Working Model of Natural Selection Illustrated by Table Tennis

    Science.gov (United States)

    Dinc, Muhittin; Kilic, Selda; Aladag, Caner

    2013-01-01

    Natural selection is one of the most important topics in biology and it helps to clarify the variety and complexity of organisms. However, students in almost every stage of education find it difficult to understand the mechanism of natural selection and they can develop misconceptions about it. This article provides an active model of natural…

  6. Elementary Teachers' Selection and Use of Visual Models

    Science.gov (United States)

    Lee, Tammy D.; Gail Jones, M.

    2017-07-01

    As science grows in complexity, science teachers face an increasing challenge of helping students interpret models that represent complex science systems. Little is known about how teachers select and use models when planning lessons. This mixed methods study investigated the pedagogical approaches and visual models used by elementary in-service and preservice teachers in the development of a science lesson about a complex system (e.g., water cycle). Sixty-seven elementary in-service and 69 elementary preservice teachers completed a card sort task designed to document the types of visual models (e.g., images) that teachers choose when planning science instruction. Quantitative and qualitative analyses were conducted to analyze the card sort task. Semistructured interviews were conducted with a subsample of teachers to elicit the rationale for image selection. Results from this study showed that both experienced in-service teachers and novice preservice teachers tended to select similar models and use similar rationales for images to be used in lessons. Teachers tended to select models that were aesthetically pleasing and simple in design and illustrated specific elements of the water cycle. The results also showed that teachers were not likely to select images that represented the less obvious dimensions of the water cycle. Furthermore, teachers selected visual models more as a pedagogical tool to illustrate specific elements of the water cycle and less often as a tool to promote student learning related to complex systems.

  7. Fluctuating selection models and McDonald-Kreitman type analyses.

    Directory of Open Access Journals (Sweden)

    Toni I Gossmann

    Full Text Available It is likely that the strength of selection acting upon a mutation varies through time due to changes in the environment. However, most population genetic theory assumes that the strength of selection remains constant. Here we investigate the consequences of fluctuating selection pressures on the quantification of adaptive evolution using McDonald-Kreitman (MK) style approaches. In agreement with previous work, we show that fluctuating selection can generate evidence of adaptive evolution even when the expected strength of selection on a mutation is zero. However, we also find that the mutations which contribute to both polymorphism and divergence tend, on average, to be positively selected during their lifetime under fluctuating selection models. This is because mutations that fluctuate, by chance, to positively selected values tend to reach higher frequencies in the population than those that fluctuate towards negative values. Hence the evidence of positive adaptive evolution detected under a fluctuating selection model by MK-type approaches is genuine, since fixed mutations tend to be advantageous on average during their lifetime. Nevertheless, we show that these methods tend to underestimate the rate of adaptive evolution when selection fluctuates.

  8. Effect of Selected Washing Treatments and Drying Temperatures on ...

    African Journals Online (AJOL)

    Microbiological analyses for aerobic bacteria counts and pathogens (coliforms, E. coli, S. aureus, Salmonella spp and V. cholerae) were carried out on small ... with selected solutions i.e. salted (3% sodium chloride), chlorinated solutions ...

  9. The Optimal Portfolio Selection Model under g-Expectation

    National Research Council Canada - National Science Library

    Li Li

    2014-01-01

    This paper solves the optimal portfolio selection model under the framework of the prospect theory proposed by Kahneman and Tversky in the 1970s, with the decision rule replaced by the g-expectation introduced by Peng...

  10. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.;

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...... of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...

  11. Robust Decision-making Applied to Model Selection

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Laboratory

    2012-08-06

    The scientific and engineering communities are relying more and more on numerical models to simulate ever-increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework is adopted anchored in info-gap decision theory. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.

  12. Information-theoretic model selection applied to supernovae data

    CERN Document Server

    Biesiada, M

    2007-01-01

    There are several different theoretical ideas invoked to explain dark energy, with relatively little guidance as to which one of them might be right. Therefore the emphasis of ongoing and forthcoming research in this field shifts from estimating specific parameters of a cosmological model to model selection. In this paper we apply an information-theoretic model selection approach based on the Akaike criterion as an estimator of Kullback-Leibler entropy. In particular, we present the proper way of ranking the competing models based on Akaike weights (in Bayesian language, posterior probabilities of the models). Out of many particular models of dark energy we focus on four: quintessence, quintessence with a time-varying equation of state, the brane-world and the generalized Chaplygin gas model, and test them on Riess' Gold sample. As a result we obtain that the best model, in terms of the Akaike criterion, is the quintessence model. The odds suggest that although there exist differences in the support given to specific scenario...
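
    The Akaike-weight ranking mentioned above amounts to w_i = exp(-0.5*Delta_i) / sum_j exp(-0.5*Delta_j), with Delta_i = AIC_i - min(AIC). A minimal sketch with made-up AIC values (not the values obtained from the Gold sample):

        # Akaike weights from AIC values (illustrative numbers only).
        import numpy as np

        models = ["quintessence", "var-EoS quintessence", "brane-world", "Chaplygin gas"]
        aic = np.array([176.2, 178.0, 179.5, 177.1])   # hypothetical AIC values

        delta = aic - aic.min()
        weights = np.exp(-0.5 * delta)
        weights /= weights.sum()
        for name, w in zip(models, weights):
            print(f"{name:>22s}: Akaike weight = {w:.3f}")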

  13. Lactate dehydrogenase as a selection criterion for ipilimumab treatment in metastatic melanoma

    DEFF Research Database (Denmark)

    Kelderman, Sander; Heemskerk, Bianca; van Tinteren, Harm;

    2014-01-01

    OS was 7.5 months, and OS at 1 year was 37.8 % and at 2 years was 22.9 %. In a multivariate model, baseline serum lactate dehydrogenase (LDH) was demonstrated to be the strongest predictive factor for OS. These findings were validated in an independent cohort of 64 patients from the UK. In both...... the NL and UK cohorts, long-term benefit of ipilimumab treatment was unlikely for patients with baseline serum LDH greater than twice the upper limit of normal. In the absence of prospective data, clinicians treating melanoma may wish to consider the data presented here to guide patient selection...

  14. Sensor Optimization Selection Model Based on Testability Constraint

    Institute of Scientific and Technical Information of China (English)

    YANG Shuming; QIU Jing; LIU Guanjun

    2012-01-01

    Sensor selection and optimization is one of the important parts of design for testability. To address the problems that the traditional sensor optimization selection model does not take the requirements of prognostics and health management, especially fault prognostics, for testability into account and does not consider the impacts of actual sensor attributes on fault detectability, a novel sensor optimization selection model is proposed. Firstly, a universal architecture for sensor selection and optimization is provided. Secondly, a new testability index named fault predictable rate is defined to describe the fault prognostics requirements for testability. Thirdly, a sensor selection and optimization model for prognostics and health management is constructed, which takes sensor cost as the objective function and the defined testability indexes as constraint conditions. Due to the NP-hard property of the model, a genetic algorithm is designed to obtain the optimal solution. At last, a case study is presented to demonstrate the sensor selection approach for a stable tracking servo platform. The application results and comparison analysis show that the proposed model and algorithm are effective and feasible. This approach can be used to select sensors for prognostics and health management of any system.
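
    The structure of the optimization model (sensor cost as the objective, testability indexes as constraints) can be sketched as below. Brute-force enumeration stands in for the genetic algorithm, and the fault-sensor detection matrix, costs and required fault detection rate are hypothetical.

        # A minimal sketch: minimize sensor cost subject to a fault detection constraint.
        from itertools import combinations
        import numpy as np

        detects = np.array([        # rows: faults, columns: sensors (1 = detectable)
            [1, 0, 1, 0],
            [0, 1, 0, 1],
            [1, 1, 0, 0],
            [0, 0, 1, 1],
        ])
        cost = np.array([4.0, 2.5, 3.0, 1.5])
        required_fdr = 1.0          # required fault detection rate

        best = None
        for k in range(1, detects.shape[1] + 1):
            for subset in combinations(range(detects.shape[1]), k):
                cols = list(subset)
                fdr = detects[:, cols].any(axis=1).mean()
                if fdr >= required_fdr:
                    c = cost[cols].sum()
                    if best is None or c < best[0]:
                        best = (c, cols, fdr)
        print("selected sensors:", best[1], "total cost:", best[0])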

  15. SELECTION MOMENTS AND GENERALIZED METHOD OF MOMENTS FOR HETEROSKEDASTIC MODELS

    Directory of Open Access Journals (Sweden)

    Constantin ANGHELACHE

    2016-06-01

    Full Text Available In this paper, the authors describe the selection methods for moments and the application of the generalized method of moments for heteroskedastic models. The utility of GMM estimators is found in the study of financial market models. The selection criteria for moments are applied for the efficient GMM estimation of univariate time series with martingale difference errors, similar to those studied so far by Kuersteiner.

  16. Modeling Suspicious Email Detection using Enhanced Feature Selection

    OpenAIRE

    2013-01-01

    The paper presents a suspicious email detection model which incorporates enhanced feature selection. In the paper we proposed the use of feature selection strategies along with classification techniques for terrorist email detection. The presented model focuses on the evaluation of machine learning algorithms such as decision tree (ID3), logistic regression, Naïve Bayes (NB), and Support Vector Machine (SVM) for detecting emails containing suspicious content. In the literature, various algo...
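
    A hedged sketch of the general setup (feature selection followed by a classifier), here a chi-squared filter feeding a Naïve Bayes model; the toy corpus, labels and the specific scoring function are illustrative stand-ins, not the paper's data or exact algorithms.

        # A minimal feature-selection + classification pipeline for text.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.feature_selection import SelectKBest, chi2
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        emails = ["meeting agenda attached", "transfer funds to this account now",
                  "lunch tomorrow?", "send the access codes immediately"]
        labels = [0, 1, 0, 1]                  # 1 = suspicious, 0 = benign

        model = make_pipeline(
            TfidfVectorizer(),
            SelectKBest(chi2, k=5),            # keep the 5 most informative terms
            MultinomialNB(),
        )
        model.fit(emails, labels)
        print(model.predict(["wire the funds now"]))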

  17. RUC at TREC 2014: Select Resources Using Topic Models

    Science.gov (United States)

    2014-11-01

    Wang, Qiuyue; Shi, Shaochen; Cao, Wei (School of Information, Renmin University of China, Beijing)

  18. Time Selection of Acupuncture Treatment for Facial Paralysis

    Institute of Scientific and Technical Information of China (English)

    SHE Li-xia; SHAO Ming-hai

    2007-01-01

    Objective: To investigate the optimal time for treating facial paralysis with acupuncture therapy. Methods: Eighty-six patients with facial paralysis of different disease durations were treated with the same needling technique. Patients in the treatment group received moderate stimulation at the developing stage, and strong stimulation at the stationary and recovery stages. Patients in the control group were treated at the developing stage with glucocorticoids and with drugs improving microcirculation and nerve function, and at the stationary and recovery stages with the same methods as the treatment group. Results: The cure rates in the treatment group and the control group were 88.1% and 68.2% respectively, and the former had a shorter treatment course. Conclusion: Acupuncture therapy has a better effect on facial paralysis than routine Western medicine, with a shorter treatment course.

  19. Selection Criteria in Regime Switching Conditional Volatility Models

    Directory of Open Access Journals (Sweden)

    Thomas Chuffart

    2015-05-01

    Full Text Available A large number of nonlinear conditional heteroskedastic models have been proposed in the literature. Model selection is crucial to any statistical data analysis. In this article, we investigate whether the most commonly used selection criteria lead to the choice of the right specification in a regime switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH and the Markov-Switching GARCH models. Simulation experiments reveal that information criteria and loss functions can lead to misspecification; BIC sometimes indicates the wrong regime switching framework. Depending on the Data Generating Process used in the experiments, great care is needed when choosing a criterion.

  1. Anhedonia Predicts Poorer Recovery among Youth with Selective Serotonin Reuptake Inhibitor Treatment-Resistant Depression

    Science.gov (United States)

    McMakin, Dana L.; Olino, Thomas M.; Porta, Giovanna; Dietz, Laura J.; Emslie, Graham; Clarke, Gregory; Wagner, Karen Dineen; Asarnow, Joan R.; Ryan, Neal D.; Birmaher, Boris; Shamseddeen, Wael; Mayes, Taryn; Kennard, Betsy; Spirito, Anthony; Keller, Martin; Lynch, Frances L.; Dickerson, John F.; Brent, David A.

    2012-01-01

    Objective: To identify symptom dimensions of depression that predict recovery among selective serotonin reuptake inhibitor (SSRI) treatment-resistant adolescents undergoing second-step treatment. Method: The Treatment of Resistant Depression in Adolescents (TORDIA) trial included 334 SSRI treatment-resistant youth randomized to a medication…

  2. Colletotrichum gloeosporioides growth-no-growth interface after selected microwave treatments.

    Science.gov (United States)

    Sosa-Morales, M E; Garcia, H S; López-Malo, A

    2009-07-01

    To study microwave heating for potential postharvest treatments against anthracnose disease, the Colletotrichum gloeosporioides growth-no-growth response after selected microwave treatments (2,450 MHz) was fitted using a logistic regression model. The evaluated variables were power level, exposure time, presence or absence of water in the medium during treatment, and incubation-observation time. Depending on the setting, the applied power ranged from 77.2 to 435.6 W. For the experiments on dry medium (mold spores over filter paper), exposure times were 1, 2, 3, or 4 min, whereas spores dispersed in potato dextrose agar, a wet medium, had exposure times of 3, 6, or 9 s. Growth (response = 1) or no growth (response = 0) was observed after two different incubation-observation times (4 or 10 days). As expected, high power levels and long exposure times resulted in complete inhibition of C. gloeosporioides spore germination. In a number of cases (such as low power levels and short treatment times), only a delay in mold growth was observed. Scanning electron micrographs showed signs of mycelial dehydration and structural collapse in the spores of the studied mold. Cell damage was attributed to heating during microwave exposure. Reduced logistic models included variables and interactions that were statistically significant. Microwave treatments (4 min at any of the studied power levels in dry medium, and 9 s at power levels of 30% or more for wet medium) proved effective in the inhibition of C. gloeosporioides in model systems. These no-growth conditions will be tested further on fresh fruits in order to develop feasible postharvest microwave treatments.
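
    A hedged sketch of the growth/no-growth logistic modeling described above; the observations and variable coding are invented for illustration and do not reproduce the paper's fitted model.

        # A minimal logistic growth/no-growth model (hypothetical observations).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # columns: power (W), exposure time, wet medium (1/0), observation time (days)
        X = np.array([
            [ 77.2, 1, 0,  4], [435.6, 4, 0,  4], [435.6, 4, 0, 10],
            [130.0, 3, 1, 10], [435.6, 9, 1,  4], [ 77.2, 3, 1, 10],
            [300.0, 2, 0, 10], [300.0, 6, 1,  4],
        ])
        y = np.array([1, 0, 0, 1, 0, 1, 1, 0])  # 1 = growth, 0 = no growth

        model = LogisticRegression(max_iter=1000).fit(X, y)
        p_growth = model.predict_proba([[435.6, 4, 0, 10]])[0, 1]
        print(f"estimated probability of growth: {p_growth:.2f}")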

  3. The Use of Evolution in a Central Action Selection Model

    Directory of Open Access Journals (Sweden)

    F. Montes-Gonzalez

    2007-01-01

    Full Text Available The use of effective central selection provides flexibility in design by offering modularity and extensibility. In earlier papers we have focused on the development of a simple centralized selection mechanism. Our current goal is to integrate evolutionary methods in the design of non-sequential behaviours and the tuning of specific parameters of the selection model. The foraging behaviour of an animal robot (animat) has been modelled in order to integrate the sensory information from the robot to perform selection that is nearly optimized by the use of genetic algorithms. In this paper we present how selection through optimization finally arranges the pattern of presented behaviours for the foraging task. Hence, the execution of specific parts in a behavioural pattern may be ruled out by the tuning of these parameters. Furthermore, the intensive use of colour segmentation from a colour camera for locating a cylinder sets a burden on the calculations carried out by the genetic algorithm.

  4. Partner Selection Optimization Model of Agricultural Enterprises in Supply Chain

    Directory of Open Access Journals (Sweden)

    Feipeng Guo

    2013-10-01

    Full Text Available With the growing importance of correctly selecting partners in the supply chain of agricultural enterprises, a large number of partner evaluation techniques are widely used in the field of agricultural science research. This study established a partner selection model to optimize the selection of agricultural supply chain partners. Firstly, it constructed a comprehensive evaluation index system after analyzing the real characteristics of the agricultural supply chain. Secondly, a heuristic method for attribute reduction based on rough set theory and principal component analysis was proposed, which can reduce multiple attributes into a few principal components while retaining effective evaluation information. Finally, it used an improved BP neural network, which has a self-learning capability, to select partners. The empirical analysis of an agricultural enterprise shows that this model is effective and feasible for practical partner selection.

  5. A Hybrid Multiple Criteria Decision Making Model for Supplier Selection

    Directory of Open Access Journals (Sweden)

    Chung-Min Wu

    2013-01-01

    Full Text Available Sustainable supplier selection is a vital part of the management of a sustainable supply chain. In this study, a hybrid multiple criteria decision making (MCDM) model is applied to select an optimal supplier. The fuzzy Delphi method, which can lead to better criteria selection, is used to modify the criteria. Considering the interdependence among the selection criteria, the analytic network process (ANP) is then used to obtain their weights. To avoid the calculations and additional pairwise comparisons of ANP, the technique for order preference by similarity to ideal solution (TOPSIS) is used to rank the alternatives. The use of a combination of the fuzzy Delphi method, ANP, and TOPSIS, the proposal of an MCDM model for supplier selection, and the application of these to a real case are the unique features of this study.
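
    To make the final ranking step concrete, a minimal TOPSIS sketch is given below; the decision matrix and criteria weights (which in the paper would come from ANP) are illustrative values only.

```python
# Minimal TOPSIS sketch (illustrative data): rank suppliers given a decision
# matrix and criteria weights, e.g. weights previously obtained from ANP.
import numpy as np

X = np.array([[7, 9, 9, 8],      # rows: suppliers, columns: criteria scores
              [8, 7, 8, 7],
              [9, 6, 8, 9],
              [6, 7, 8, 6]], dtype=float)
w = np.array([0.3, 0.2, 0.3, 0.2])            # criteria weights (sum to 1)
benefit = np.array([True, True, True, True])  # all criteria "larger is better"

R = X / np.linalg.norm(X, axis=0)             # vector-normalise each criterion
V = R * w                                     # weighted normalised matrix
ideal      = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti_ideal = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus  = np.linalg.norm(V - ideal, axis=1)        # distance to ideal solution
d_minus = np.linalg.norm(V - anti_ideal, axis=1)   # distance to anti-ideal solution
closeness = d_minus / (d_plus + d_minus)           # relative closeness coefficient
print(np.argsort(-closeness))                      # supplier ranking, best first
```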

  6. Bayesian model evidence for order selection and correlation testing.

    Science.gov (United States)

    Johnston, Leigh A; Mareels, Iven M Y; Egan, Gary F

    2011-01-01

    Model selection is a critical component of data analysis procedures, and is particularly difficult for small numbers of observations such as is typical of functional MRI datasets. In this paper we derive two Bayesian evidence-based model selection procedures that exploit the existence of an analytic form for the linear Gaussian model class. Firstly, an evidence information criterion is proposed as a model order selection procedure for auto-regressive models, outperforming the commonly employed Akaike and Bayesian information criteria in simulated data. Secondly, an evidence-based method for testing change in linear correlation between datasets is proposed, which is demonstrated to outperform both the traditional statistical test of the null hypothesis of no correlation change and the likelihood ratio test.
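
    The paper's evidence information criterion is not reproduced here; as a generic illustration of criterion-based order selection for autoregressive models, the sketch below scores least-squares AR(p) fits with BIC and keeps the order with the lowest score (simulated data, BIC used purely as a stand-in).

```python
# Generic sketch of information-criterion-based AR order selection:
# fit AR(p) by least squares for several p and keep the lowest BIC.
import numpy as np

def ar_bic(y, p):
    """Least-squares AR(p) fit; returns the BIC of the fit."""
    Y = y[p:]
    X = np.column_stack([y[p - k - 1: len(y) - k - 1] for k in range(p)])  # lagged values
    X = np.column_stack([np.ones(len(Y)), X])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss = np.sum((Y - X @ beta) ** 2)
    n = len(Y)
    return n * np.log(rss / n) + (p + 1) * np.log(n)

rng = np.random.default_rng(1)
y = np.zeros(500)
for t in range(2, 500):                        # simulate an AR(2) process
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

bics = {p: ar_bic(y, p) for p in range(1, 8)}
print("selected order:", min(bics, key=bics.get))   # should be close to 2
```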

  7. Statistical model selection with “Big Data”

    Directory of Open Access Journals (Sweden)

    Jurgen A. Doornik

    2015-12-01

    Full Text Available Big Data offer potential benefits for statistical modelling, but confront problems including an excess of false positives, mistaking correlations for causes, ignoring sampling biases and selecting by inappropriate methods. We consider the many important requirements when searching for a data-based relationship using Big Data, and the possible role of Autometrics in that context. Paramount considerations include embedding relationships in general initial models, possibly restricting the number of variables to be selected over by non-statistical criteria (the formulation problem), using good quality data on all variables, analyzed with tight significance levels by a powerful selection procedure, retaining available theory insights (the selection problem), while testing for relationships being well specified and invariant to shifts in explanatory variables (the evaluation problem), using a viable approach that resolves the computational problem of immense numbers of possible models.

  8. Selection Bias in Educational Transition Models: Theory and Empirical Evidence

    DEFF Research Database (Denmark)

    Holm, Anders; Jæger, Mads

    Most studies using Mare’s (1980, 1981) seminal model of educational transitions find that the effect of family background decreases across transitions. Recently, Cameron and Heckman (1998, 2001) have argued that the “waning coefficients” in the Mare model are driven by selection on unobserved variables. This paper, first, explains theoretically how selection on unobserved variables leads to waning coefficients and, second, illustrates empirically how selection leads to biased estimates of the effect of family background on educational transitions. Our empirical analysis, using data from the United States, United Kingdom, Denmark, and the Netherlands, shows that when we take selection into account the effect of family background variables on educational transitions is largely constant across transitions. We also discuss several difficulties in estimating educational transition models.

  9. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  10. Selecting the Right Patient for Surgical Treatment of Hyperhidrosis.

    Science.gov (United States)

    Cameron, Alan Edmond Parsons

    2016-11-01

    This article presents a personal view of the indications for surgical treatment of patients with hyperhidrosis based on long clinical experience. Endoscopic thoracic sympathectomy is the preferred option for palmar sweating. It is also useful when there is additional axillary sweating but is not the first choice for isolated armpit symptoms. Surgical treatment of craniofacial sweating is much more likely to be followed by undesirable side-effects. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    Science.gov (United States)

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romanach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe the potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method, and there was low overlap in the variable sets. Models from both approaches had high performance metrics (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. The difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques. Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using

  12. Amyloid precursor protein selective gamma-secretase inhibitors for treatment of Alzheimer's disease.

    Science.gov (United States)

    Basi, Guriqbal S; Hemphill, Susanna; Brigham, Elizabeth F; Liao, Anna; Aubele, Danielle L; Baker, Jeanne; Barbour, Robin; Bova, Michael; Chen, Xiao-Hua; Dappen, Michael S; Eichenbaum, Tovah; Goldbach, Erich; Hawkinson, Jon; Lawler-Herbold, Rose; Hu, Kang; Hui, Terence; Jagodzinski, Jacek J; Keim, Pamela S; Kholodenko, Dora; Latimer, Lee H; Lee, Mike; Marugg, Jennifer; Mattson, Matthew N; McCauley, Scott; Miller, James L; Motter, Ruth; Mutter, Linda; Neitzel, Martin L; Ni, Huifang; Nguyen, Lan; Quinn, Kevin; Ruslim, Lany; Semko, Christopher M; Shapiro, Paul; Smith, Jenifer; Soriano, Ferdie; Szoke, Balazs; Tanaka, Kevin; Tang, Pearl; Tucker, John A; Ye, Xiacong Michael; Yu, Mei; Wu, Jing; Xu, Ying-Zi; Garofalo, Albert W; Sauer, John Michael; Konradi, Andrei W; Ness, Daniel; Shopp, George; Pleiss, Michael A; Freedman, Stephen B; Schenk, Dale

    2010-12-29

    Inhibition of gamma-secretase presents a direct target for lowering Aβ production in the brain as a therapy for Alzheimer's disease (AD). However, gamma-secretase is known to process multiple substrates in addition to amyloid precursor protein (APP), most notably Notch, which has limited clinical development of inhibitors targeting this enzyme. It has been postulated that APP substrate selective inhibitors of gamma-secretase would be preferable to non-selective inhibitors from a safety perspective for AD therapy. In vitro assays monitoring inhibitor potencies at APP γ-site cleavage (equivalent to Aβ40), and Notch ε-site cleavage, in conjunction with a single cell assay to simultaneously monitor selectivity for inhibition of Aβ production vs. Notch signaling were developed to discover APP selective gamma-secretase inhibitors. In vivo efficacy for acute reduction of brain Aβ was determined in the PDAPP transgene model of AD, as well as in wild-type FVB strain mice. In vivo selectivity was determined following seven days x twice per day (b.i.d.) treatment with 15 mg/kg/dose to 1,000 mg/kg/dose ELN475516, and monitoring brain Aβ reduction vs. Notch signaling endpoints in periphery. The APP selective gamma-secretase inhibitors ELN318463 and ELN475516 reported here behave as classic gamma-secretase inhibitors, demonstrate 75- to 120-fold selectivity for inhibiting Aβ production compared with Notch signaling in cells, and displace an active site directed inhibitor at very high concentrations only in the presence of substrate. ELN318463 demonstrated discordant efficacy for reduction of brain Aβ in the PDAPP compared with wild-type FVB, not observed with ELN475516. Improved in vivo safety of ELN475516 was demonstrated in the 7d repeat dose study in wild-type mice, where a 33% reduction of brain Aβ was observed in mice terminated three hours post last dose at the lowest dose of inhibitor tested. No overt in-life or post-mortem indications of systemic toxicity, nor

  13. Selective Androgen Receptor Modulator (SARM) treatment prevents bone loss and reduces body fat in ovariectomized rats.

    Science.gov (United States)

    Kearbey, Jeffrey D; Gao, Wenqing; Narayanan, Ramesh; Fisher, Scott J; Wu, Di; Miller, Duane D; Dalton, James T

    2007-02-01

    This study was conducted to examine the bone and body composition effects of S-4, an aryl-propionamide derived Selective Androgen Receptor Modulator (SARM) in an ovariectomy induced model of accelerated bone loss. One hundred twenty female Sprague-Dawley rats aged to twenty-three weeks were randomly assigned to twelve treatment groups. Drug treatment was initiated immediately following ovariectomy and continued for one hundred twenty days. Whole body bone mineral density (BMD), body composition, and lumbar vertebrae BMD were measured by dual energy x-ray absorptiometry. More stringent regional pQCT and biomechanical strength testing was performed on excised femurs. We found that S-4 treatment maintained whole body and trabecular BMD, cortical content, and increased bone strength while decreasing body fat in these animals. The data presented herein show the protective skeletal effects of S-4. Our previous reports have shown the tissue selectivity and muscle anabolic activity of S-4. Together these data suggest that S-4 could reduce the incidence of fracture via two different mechanisms (i.e., via direct effects in bone and reducing the incidence of falls through increased muscle strength). This approach to fracture reduction would be advantageous over current therapies in these patients which are primarily antiresorptive in nature.

  14. Selective Androgen Receptor Modulator (SARM) Treatment Prevents Bone Loss and Reduces Body Fat in Ovariectomized Rats

    Science.gov (United States)

    Kearbey, Jeffrey D.; Gao, Wenqing; Narayanan, Ramesh; Fisher, Scott J.; Wu, Di; Miller, Duane D.; Dalton, James T.

    2007-01-01

    Purpose This study was conducted to examine the bone and body composition effects of S-4, an arylpropionamide derived Selective Androgen Receptor Modulator (SARM) in an ovariectomy induced model of accelerated bone loss. Methods One hundred twenty female Sprague-Dawley rats aged to twenty-three weeks were randomly assigned to twelve treatment groups. Drug treatment was initiated immediately following ovariectomy and continued for one hundred twenty days. Whole body bone mineral density (BMD), body composition, and lumbar vertebrae BMD were measured by dual energy x-ray absorptiometry. More stringent regional pQCT and biomechanical strength testing was performed on excised femurs. Results We found that S-4 treatment maintained whole body and trabecular BMD, cortical content, and increased bone strength while decreasing body fat in these animals. Conclusions The data presented herein show the protective skeletal effects of S-4. Our previous reports have shown the tissue selectivity and muscle anabolic activity of S-4. Together these data suggest that S-4 could reduce the incidence of fracture via two different mechanisms (i.e., via direct effects in bone and reducing the incidence of falls through increased muscle strength). This approach to fracture reduction would be advantageous over current therapies in these patients which are primarily antiresorptive in nature. PMID:17063395

  15. Interloper treatment in dynamical modelling of galaxy clusters

    CERN Document Server

    Wojtak, R; Mamon, G A; Gottlöber, S; Prada, F; Moles, M; Wojtak, Radoslaw; Lokas, Ewa L.; Mamon, Gary A.; Gottloeber, Stefan; Prada, Francisco; Moles, Mariano

    2006-01-01

    The aim of this paper is to study the efficiency of different approaches to interloper treatment in the dynamical modelling of galaxy clusters. Using a cosmological N-body simulation of the standard LCDM model, we select 10 massive dark matter haloes and use their particles to emulate mock kinematic data in terms of projected galaxy positions and velocities as they would be measured by a distant observer. Taking advantage of the full 3D information available from the simulation, we select samples of interlopers defined with different criteria. The interlopers thus selected provide means to assess the efficiency of different interloper removal schemes. We study direct methods of interloper removal based on dynamical or statistical restrictions imposed on ranges of positions and velocities available to cluster members. In determining these ranges we use either the velocity dispersion criterion or a maximum velocity profile. We find that the direct methods exclude on average 60-70 percent of unbound particles producing a sa...
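
    A sketch of the velocity dispersion criterion mentioned above, reduced to its simplest form, is given below: iteratively discard galaxies whose line-of-sight velocity lies more than a chosen multiple (here 3) of the velocity dispersion from the cluster mean. The data and the 3-sigma cut are illustrative assumptions, not the paper's calibrated scheme.

```python
# Illustrative velocity-dispersion interloper cut on toy data: iterate a
# 3-sigma clip on line-of-sight velocities until the member list is stable.
import numpy as np

rng = np.random.default_rng(2)
v_members     = rng.normal(0.0, 800.0, 300)       # km/s, bound cluster members
v_interlopers = rng.uniform(-5000.0, 5000.0, 60)  # km/s, fore/background galaxies
v = np.concatenate([v_members, v_interlopers])

members = np.ones(v.size, dtype=bool)
for _ in range(20):                               # iterate until membership is stable
    mu, sigma = v[members].mean(), v[members].std()
    new = np.abs(v - mu) < 3.0 * sigma
    if np.array_equal(new, members):
        break
    members = new

print(members.sum(), "retained as members,", (~members).sum(), "flagged as interlopers")
```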

  16. Models of microbiome evolution incorporating host and microbial selection.

    Science.gov (United States)

    Zeng, Qinglong; Wu, Steven; Sukumaran, Jeet; Rodrigo, Allen

    2017-09-25

    Numerous empirical studies suggest that hosts and microbes exert reciprocal selective effects on their ecological partners. Nonetheless, we still lack an explicit framework to model the dynamics of both hosts and microbes under selection. In a previous study, we developed an agent-based forward-time computational framework to simulate the neutral evolution of host-associated microbial communities in a constant-sized, unstructured population of hosts. These neutral models allowed offspring to sample microbes randomly from parents and/or from the environment. Additionally, the environmental pool of available microbes was constituted by fixed and persistent microbial OTUs and by contributions from host individuals in the preceding generation. In this paper, we extend our neutral models to allow selection to operate on both hosts and microbes. We do this by constructing a phenome for each microbial OTU consisting of a sample of traits that influence host and microbial fitnesses independently. Microbial traits can influence the fitness of hosts ("host selection") and the fitness of microbes ("trait-mediated microbial selection"). Additionally, the fitness effects of traits on microbes can be modified by their hosts ("host-mediated microbial selection"). We simulate the effects of these three types of selection, individually or in combination, on microbiome diversities and the fitnesses of hosts and microbes over several thousand generations of hosts. We show that microbiome diversity is strongly influenced by selection acting on microbes. Selection acting on hosts only influences microbiome diversity when there is near-complete direct or indirect parental contribution to the microbiomes of offspring. Unsurprisingly, microbial fitness increases under microbial selection. Interestingly, when host selection operates, host fitness only increases under two conditions: (1) when there is a strong parental contribution to microbial communities or (2) in the absence of a strong

  17. Periodic Integration: Further Results on Model Selection and Forecasting

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1996-01-01

    textabstractThis paper considers model selection and forecasting issues in two closely related models for nonstationary periodic autoregressive time series [PAR]. Periodically integrated seasonal time series [PIAR] need a periodic differencing filter to remove the stochastic trend. On the other

  18. Utilizing Antecedent Manipulations and Reinforcement in the Treatment of Food Selectivity by Texture

    Science.gov (United States)

    Najdowski, Adel C.; Tarbox, Jonathan; Wilke, Arthur E.

    2012-01-01

    Food selectivity by texture is relatively common in children. Treatments for food selectivity by texture have included components such as stimulus fading, reinforcement, and escape extinction. The purpose of the current study was to attempt to treat food selectivity by texture utilizing antecedent manipulations and reinforcement in the absence of…

  19. Quantile hydrologic model selection and model structure deficiency assessment: 1. Theory

    NARCIS (Netherlands)

    Pande, S.

    2013-01-01

    A theory for quantile based hydrologic model selection and model structure deficiency assessment is presented. The paper demonstrates that the degree to which a model selection problem is constrained by the model structure (measured by the Lagrange multipliers of the constraints) quantifies

  1. AN EXPERT SYSTEM MODEL FOR THE SELECTION OF TECHNICAL PERSONNEL

    Directory of Open Access Journals (Sweden)

    Emine COŞGUN

    2005-03-01

    Full Text Available In this study, a model has been developed for the selection of technical personnel. In the model, Visual Basic has been used as the user interface, Microsoft Access as the database system, and the CLIPS program as the expert system program. The proposed model has been developed by utilizing expert system technology. In the personnel selection process, only the pre-evaluation of the applicants has been taken into consideration. Instead of replacing the expert, a decision support program has been developed to analyze the data gathered from the job application forms. The proposed system will assist the expert in making faster and more accurate decisions.

  2. Novel web service selection model based on discrete group search.

    Science.gov (United States)

    Zhai, Jie; Shao, Zhiqing; Guo, Yi; Zhang, Haiteng

    2014-01-01

    In our earlier work, we present a novel formal method for the semiautomatic verification of specifications and for describing web service composition components by using abstract concepts. After verification, the instantiations of components were selected to satisfy the complex service performance constraints. However, selecting an optimal instantiation, which comprises different candidate services for each generic service, from a large number of instantiations is difficult. Therefore, we present a new evolutionary approach on the basis of the discrete group search service (D-GSS) model. With regard to obtaining the optimal multiconstraint instantiation of the complex component, the D-GSS model has competitive performance compared with other service selection models in terms of accuracy, efficiency, and ability to solve high-dimensional service composition component problems. We propose the cost function and the discrete group search optimizer (D-GSO) algorithm and study the convergence of the D-GSS model through verification and test cases.

  3. Plasma electrolytic treatment of products after selective laser melting

    Science.gov (United States)

    Kashapov, L. N.; Kashapov, N. F.; Kashapov, R. N.; Denisov, D. G.

    2016-01-01

    The aim of the work was to study the possibilities of plasma electrolytic treatment for cleaning the surfaces of metal products obtained by SLM technology. We found that the most effective removal of large alloy particles occurs in the "hydrodynamic" mode, in which hydrodynamic pulses are observed. Remaining irregularities are then smoothed by a stable discharge burning in the vapor shell. Analysis of the surface morphology of complex specialized products, such as crown conical gears, after plasma hydrodynamic treatment showed the efficiency and advantages of the method in comparison to conventional final cleaning methods such as shot blasting.

  4. Robotic selective postganglionic thoracic sympathectomy for the treatment of hyperhidrosis.

    Science.gov (United States)

    Coveliers, Hans; Meyer, Mark; Gharagozloo, Farid; Wisselink, Willem; Rauwerda, Jan; Margolis, Marc; Tempesta, Barbara; Strother, Eric

    2013-01-01

    The surgical management of hyperhidrosis is controversial. Robotic surgical systems with their high-definition magnified 3-dimensional view and increased maneuverability in a confined space may facilitate the technique of selective sympathectomy (ramicotomy). We present a case series of patients undergoing selective postganglionic thoracic sympathectomy using robotic technology. This study is a case series analysis of patients who underwent selective postganglionic thoracic sympathectomy from July 2006 to November 2011. The operation was performed on a video-assisted thoracoscopic surgery (VATS) platform. The robot was used for pleural dissection and division of the postganglionic sympathetic fibers and the communicating rami. The success of sympathectomy was assessed by intraoperative temperature measurement of the ipsilateral upper extremity, patient interviews, and scoring of the symptomatic nature of hyperhidrosis based on the Hyperhidrosis Disease Severity Scale. There were 110 sympathectomies performed in 55 patients (25 men, 30 women). Simultaneous bilateral sympathectomy was performed in all patients. Median age was 28 years (range, 16 to 65 years). There was no conversion to thoracotomy. Complications were minor and were seen in 5 of 55 patients (9%). There were no deaths. Median hospital stay was 1 day (range, 1 to 4 days). Of the 55 patients, 53 (96%) had sustained relief of their hyperhidrosis at a median follow-up of 24 months (range, 3 to 36 months), and compensatory sweating was seen in 4 patients (7.2%). Robotic thoracoscopic selective sympathectomy is an effective, feasible, and safe procedure with excellent relief of hyperhidrosis and low rates of compensatory sweating and complications. Copyright © 2013 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  5. ECONOMIC ANALYSIS OF SELECTED ADSORPTION BEDS USED FOR WATER TREATMENT

    Directory of Open Access Journals (Sweden)

    Iwona Skoczko

    2016-02-01

    Full Text Available In this paper an economic analysis of sorption beds used for water purification was performed. The carbon media selected were the Organosorb 10, Norit ROW 0.8 Supra, Hydroantracyt N and K110 deposits. The economic comparison of the deposits was based on the purchase cost of the filter media, the environmental fee for water intake, the cost of equipment to aerate the water, the cost of chemicals for deposit regeneration, and the cost of the dosing pump for regeneration reagents, determined for the adsorption process using each of the deposits selected for the analysis. The K110 deposit turned out to be the cheapest to operate and Norit ROW 0.8 Supra the most expensive. The operating costs of the adsorption deposits depend mainly on the purchase cost of the adsorbent and the rate of adsorption. Moreover, the environmental fee for water intake constitutes a significant share of the operating costs of carbon deposits, greater than the cost of the deposits themselves.

  6. A genetic algorithm for variable selection in logistic regression analysis of radiotherapy treatment outcomes.

    Science.gov (United States)

    Gayou, Olivier; Das, Shiva K; Zhou, Su-Min; Marks, Lawrence B; Parda, David S; Miften, Moyed

    2008-12-01

    A given outcome of radiotherapy treatment can be modeled by analyzing its correlation with a combination of dosimetric, physiological, biological, and clinical factors, through a logistic regression fit of a large patient population. The quality of the fit is measured by the combination of the predictive power of this particular set of factors and the statistical significance of the individual factors in the model. We developed a genetic algorithm (GA), in which a small sample of all the possible combinations of variables is fitted to the patient data. New models are derived from the best models through crossover and mutation operations, and are in turn fitted. The process is repeated until the sample converges to the combination of factors that best predicts the outcome. The GA was tested on a data set that investigated the incidence of lung injury in non-small cell lung cancer (NSCLC) patients treated with three-dimensional conformal radiotherapy (3DCRT). The GA identified a model with two variables as the best predictor of radiation pneumonitis: the V30 (p=0.048) and the ongoing use of tobacco at the time of referral (p=0.074). This two-variable model was confirmed as the best model by analyzing all possible combinations of factors. In conclusion, genetic algorithms provide a reliable and fast way to select significant factors in logistic regression analysis of large clinical studies.
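
    A compact sketch of the general scheme described above (bitmask chromosomes, crossover, mutation, fitness derived from a logistic fit) is shown below, run on synthetic data; it is not the authors' implementation, and cross-validated accuracy is used as a simple stand-in for their fit-quality measure.

```python
# GA-for-variable-selection sketch (synthetic data, illustrative only):
# chromosomes are bitmasks over candidate predictors, fitness is the
# cross-validated accuracy of a logistic regression fitted on that subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X, y = make_classification(n_samples=300, n_features=12, n_informative=3,
                           n_redundant=2, random_state=0)

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)   # initial population
for _ in range(30):                                            # generations
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(-scores)[:10]]                    # keep the best half
    children = []
    while len(children) < 10:
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, X.shape[1])                      # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(child.size) < 0.05                   # mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected predictor indices:", np.flatnonzero(best))
```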

  7. Selection of climate change scenario data for impact modelling

    DEFF Research Database (Denmark)

    Sloth Madsen, M; Fox Maule, C; MacKellar, N

    2012-01-01

    Impact models investigating climate change effects on food safety often need detailed climate data. The aim of this study was to select climate change projection data for selected crop phenology and mycotoxin impact models. Using the ENSEMBLES database of climate model output, this study illustrates how the projected climate change signal of important variables such as temperature, precipitation and relative humidity depends on the choice of the climate model. Using climate change projections from at least two different climate models is recommended to account for model uncertainty. To make the climate projections suitable for impact analysis at the local scale, a weather generator approach was adopted. As the weather generator did not treat all the necessary variables, an ad-hoc statistical method was developed to synthesise realistic values of missing variables. The method is presented...

  8. Fuzzy MCDM Model for Risk Factor Selection in Construction Projects

    Directory of Open Access Journals (Sweden)

    Pejman Rezakhani

    2012-11-01

    Full Text Available Risk factor selection is an important step in a successful risk management plan. There are many risk factors in a construction project, and an effective and systematic risk selection process allows the most critical risks to be distinguished and given more attention. In this paper, through a comprehensive literature survey, the most significant risk factors in a construction project are classified in a hierarchical structure. For effective risk factor selection, a modified rational multi-criteria decision making (MCDM) model is developed. This model is a consensus rule based model and has the optimization property of rational models. By applying fuzzy logic to this model, uncertainty factors in group decision making, such as experts' influence weights and their preferences and judgments for the risk selection criteria, are assessed. An intelligent checking process to verify the logical consistency of experts' preferences is also implemented during the decision making process. The solution inferred from this method has the highest degree of acceptance among group members, and the consistency of individual preferences is checked by inference rules. This is an efficient and effective approach to prioritize and select risks based on decisions made by a group of experts in construction projects. The applicability of the presented method is assessed through a case study.

  9. Modeling and Hemofiltration Treatment of Acute Inflammation

    Directory of Open Access Journals (Sweden)

    Robert S. Parker

    2016-10-01

    Full Text Available The body responds to endotoxins by triggering the acute inflammatory response system to eliminate the threat posed by gram-negative bacteria (endotoxin) and restore health. However, an uncontrolled inflammatory response can lead to tissue damage, organ failure, and ultimately death; this is clinically known as sepsis. Mathematical models of acute inflammatory disease have the potential to guide treatment decisions in critically ill patients. In this work, an 8-state (8-D) differential equation model of the acute inflammatory response system to endotoxin challenge was developed. Endotoxin challenges at 3 and 12 mg/kg were administered to rats, and dynamic cytokine data for interleukin (IL)-6, tumor necrosis factor (TNF), and IL-10 were obtained and used to calibrate the model. Evaluation of competing model structures was performed by analyzing model predictions at 3, 6, and 12 mg/kg endotoxin challenges with respect to experimental data from rats. Subsequently, a model predictive control (MPC) algorithm was synthesized to control a hemoadsorption (HA) device, a blood purification treatment for acute inflammation. A particle filter (PF) algorithm was implemented to estimate the full state vector of the endotoxemic rat based on time series cytokine measurements. Treatment simulations show that: (i) the apparent primary mechanism of HA efficacy is white blood cell (WBC) capture, with cytokine capture a secondary benefit; and (ii) differential filtering of cytokines and WBC does not provide substantial improvement in treatment outcomes vs. existing HA devices.

  10. A Hybrid Program Projects Selection Model for Nonprofit TV Stations

    Directory of Open Access Journals (Sweden)

    Kuei-Lun Chang

    2015-01-01

    Full Text Available This study develops a hybrid multiple criteria decision making (MCDM) model to select program projects for nonprofit TV stations on the basis of managers’ perceptions. By the concepts of the balanced scorecard (BSC) and corporate social responsibility (CSR), we collect criteria for selecting the best program project. The fuzzy Delphi method, which can lead to better criteria selection, is used to modify criteria. Next, considering the interdependence among the selection criteria, the analytic network process (ANP) is then used to obtain their weights. To avoid the calculations and additional pairwise comparisons of ANP, the technique for order preference by similarity to ideal solution (TOPSIS) is used to rank the alternatives. A case study is presented to demonstrate the applicability of the proposed model.

  11. A SUPPLIER SELECTION MODEL FOR SOFTWARE DEVELOPMENT OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Hancu Lucian-Viorel

    2010-12-01

    Full Text Available This paper presents a multi-criteria decision making model used for supplier selection for software development outsourcing on e-marketplaces. This model can also be used in auctions. The supplier selection process has become complex and difficult over the last twenty years as the Internet has come to play an important role in business management. Companies have to concentrate their efforts on their core activities, and other activities should be outsourced. They can achieve significant cost reductions by using e-marketplaces in their purchasing process and by using decision support systems for supplier selection. Many approaches for the supplier evaluation and selection process have been proposed in the literature. The performance of potential suppliers is evaluated using multi-criteria decision making methods rather than a single factor such as cost.

  12. Adverse Selection Models with Three States of Nature

    Directory of Open Access Journals (Sweden)

    Daniela MARINESCU

    2011-02-01

    Full Text Available In the paper we analyze an adverse selection model with three states of nature, where both the Principal and the Agent are risk neutral. When solving the model, we use the informational rents and the efforts as variables. We derive the optimal contract in the situation of asymmetric information. The paper ends with the characteristics of the optimal contract and the main conclusions of the model.

  13. Bayesian model selection for constrained multivariate normal linear models

    NARCIS (Netherlands)

    Mulder, J.

    2010-01-01

    The expectations that researchers have about the structure in the data can often be formulated in terms of equality constraints and/or inequality constraints on the parameters in the model that is used. In a (M)AN(C)OVA model, researchers have expectations about the differences between the

  14. Genetic signatures of natural selection in a model invasive ascidian

    Science.gov (United States)

    Lin, Yaping; Chen, Yiyong; Yi, Changho; Fong, Jonathan J.; Kim, Won; Rius, Marc; Zhan, Aibin

    2017-01-01

    Invasive species represent promising models to study species’ responses to rapidly changing environments. Although local adaptation frequently occurs during contemporary range expansion, the associated genetic signatures at both population and genomic levels remain largely unknown. Here, we use genome-wide gene-associated microsatellites to investigate genetic signatures of natural selection in a model invasive ascidian, Ciona robusta. Population genetic analyses of 150 individuals sampled in Korea, New Zealand, South Africa and Spain showed significant genetic differentiation among populations. Based on outlier tests, we found a high incidence of signatures of directional selection at 19 loci. Hitchhiking mapping analyses identified 12 directional selective sweep regions, and all selective sweep windows on chromosomes were narrow (~8.9 kb). Further analyses identified 132 candidate genes under selection. When we compared our genetic data and six crucial environmental variables, 16 putatively selected loci showed significant correlation with these environmental variables. This suggests that the local environmental conditions have left significant signatures of selection at both population and genomic levels. Finally, we identified “plastic” genomic regions and genes that are promising regions to investigate evolutionary responses to rapid environmental change in C. robusta. PMID:28266616

  15. IT vendor selection model by using structural equation model & analytical hierarchy process

    Science.gov (United States)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace. Improper selection and evaluation of potential vendors can dwarf an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research develops a new hybrid model for the vendor selection process to support better decision making. The proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model is applied to a real-life case study to assess its effectiveness. In addition, what-if analysis is used for model validation purposes.
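
    To make the AHP step concrete, the sketch below derives criteria weights from a pairwise-comparison matrix via the principal eigenvector and checks Saaty's consistency ratio; the judgement values are invented for illustration and are not taken from the case study.

```python
# AHP sketch (illustrative pairwise comparisons): criteria weights come from
# the principal eigenvector, and a consistency ratio is computed as a check.
import numpy as np

A = np.array([[1,   3,   5],       # pairwise judgements on Saaty's 1-9 scale
              [1/3, 1,   2],
              [1/5, 1/2, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalised criteria weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))   # CR < 0.1 is acceptable
```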

  16. Effect of solvent selection on froth treatment for mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Hamza, H.A. [Natural Resources Canada, Devon, AB (Canada). CANMET Energy Technology Centre

    2003-07-01

    The heavy oil (bitumen) industry is important for the Canadian economy. The oil sands are unique to Canada and government support is crucial for continued research into the development of technologies and products which make less of an impact on the environment. This paper describes a successful collaboration between the Canada Centre for Mineral and Energy Technology (CANMET) and Shell Canada which led to the 2003 award for Advanced Separation Technologies (AST). This PowerPoint presentation reviews feeds for a naphtha solvent treatment versus a paraffinic solvent treatment. The feeds include bitumen froth, low quality bitumen emulsions, in-situ heavy oil emulsions, and bitumen. Microscope images for each of these feeds were illustrated. The typical elements of the solvent treatment process were also outlined in a flow chart. The main feature of the naphtha process is that it is well established plus it renders a product with 1.5 to 4 per cent water and 0.5 per cent solids. It needs chemical demulsifiers and no asphaltene is rejected. The main feature of the paraffinic process is that it produces a product with low water/solids content and asphaltene levels can be controlled. The process requires less solvent for pipelining because of its reduced viscosity and it offers flexibility in terms of upgrading options. The presentation listed achievements in AST by Albian Sands, TrueNorth, Canadian Natural Resources Ltd., Suncor, Syncrude, and AEC Pipelines Ltd. tabs., figs.

  17. Robust model selection and the statistical classification of languages

    Science.gov (United States)

    García, J. E.; González-López, V. A.; Viola, M. L. L.

    2012-10-01

    In this paper we address the problem of model selection for the set of finite memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite order Markov models, we will focus on the family of variable length Markov chain models, which include the fixed order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we show the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample conformed by the concatenation of sub-samples of two or more stochastic processes, with most of the subsamples having law Q. We conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty on this problem is that the speech samples correspond to several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure to deal with this problem has been to choose a subset of the original sample which seems to best represent each language. The selection is made by listening to the samples. In our application we use the full dataset without any preselection of samples. We apply our robust methodology estimating

  18. Selecting Optimal Subset of Features for Student Performance Model

    Directory of Open Access Journals (Sweden)

    Hany M. Harb

    2012-09-01

    Full Text Available Educational data mining (EDM) is a new and growing research area in which data mining concepts are used in the educational field for the purpose of extracting useful information on student behavior in the learning process. Classification methods like decision trees, rule mining, and Bayesian networks can be applied to educational data for predicting student behavior such as performance in an examination. This prediction may help in student evaluation. As feature selection influences the predictive accuracy of any performance model, it is essential to study in detail the effectiveness of the student performance model in connection with feature selection techniques. The main objective of this work is to achieve high predictive performance by adopting various feature selection techniques to increase the predictive accuracy with the least number of features. The outcomes show a reduction in computational time and construction cost in both the training and classification phases of the student performance model.
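
    A minimal sketch of the kind of comparison described is shown below, using scikit-learn on synthetic data: a decision tree evaluated with all features versus with the top-k features retained by univariate selection. The dataset, the value of k, and the chi-squared score are illustrative choices, not the paper's setup.

```python
# Illustrative sketch: compare predictive accuracy of a decision tree with and
# without univariate feature selection (synthetic data stands in for the
# educational dataset used in the paper).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=20, n_informative=5,
                           random_state=0)

full = DecisionTreeClassifier(random_state=0)
reduced = make_pipeline(MinMaxScaler(),              # chi2 needs non-negative inputs
                        SelectKBest(chi2, k=5),
                        DecisionTreeClassifier(random_state=0))

print("all features  :", cross_val_score(full, X, y, cv=5).mean().round(3))
print("top-5 features:", cross_val_score(reduced, X, y, cv=5).mean().round(3))
```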

  19. Short-Run Asset Selection using a Logistic Model

    Directory of Open Access Journals (Sweden)

    Walter Gonçalves Junior

    2011-06-01

    Full Text Available Investors constantly look for significant predictors and accurate models to forecast future results, whose occasional efficacy ends up being neutralized by market efficiency. Regardless, such predictors are widely used in the search for better (and more unique) perceptions. This paper aims to investigate to what extent some of the most notorious indicators have discriminatory power to select stocks, and whether it is feasible to build models with such variables that could anticipate the stocks with good performance. To that end, logistic regressions were conducted on stocks traded at Bovespa, using the selected indicators as explanatory variables. Among the indicators investigated, the Bovespa Index, liquidity, the Sharpe ratio, ROE, MB, size and age proved to be significant predictors. Half-year logistic models were also examined and adjusted in order to check whether they had acceptable discriminatory power for asset selection.

  20. Sample selection and taste correlation in discrete choice transport modelling

    DEFF Research Database (Denmark)

    Mabit, Stefan Lindhard

    2008-01-01

    …the question for a broader class of models. It is shown that the original result may be somewhat generalised. Another question investigated is whether mode choice operates as a self-selection mechanism in the estimation of the value of travel time. The results show that self-selection can at least partly explain counterintuitive results in value of travel time estimation. However, the results also point at the difficulty of finding suitable instruments for the selection mechanism. Taste heterogeneity is another important aspect of discrete choice modelling. Mixed logit models are designed to capture such heterogeneity. Contributions on the role of taste correlation in willingness-to-pay estimation are presented. The first contribution addresses how to incorporate taste correlation in the estimation of the value of travel time for public transport. Given a limited dataset, the approach taken is to use theory on the value of travel time as guidance...

  1. Financial applications of a Tabu search variable selection model

    Directory of Open Access Journals (Sweden)

    Zvi Drezner

    2001-01-01

    Full Text Available We illustrate how a comparatively new technique, a Tabu search variable selection model [Drezner, Marcoulides and Salhi (1999)], can be applied efficiently within finance when the researcher must select a subset of variables from among the whole set of explanatory variables under consideration. Several types of problems in finance, including corporate and personal bankruptcy prediction, mortgage and credit scoring, and the selection of variables for the Arbitrage Pricing Model, require the researcher to select a subset of variables from a larger set. In order to demonstrate the usefulness of the Tabu search variable selection model, we: (1) illustrate its efficiency in comparison to the main alternative search procedures, such as stepwise regression and the Maximum R2 procedure, and (2) show how a version of the Tabu search procedure may be implemented when attempting to predict corporate bankruptcy. We accomplish (2) by indicating that a Tabu Search procedure increases the predictability of corporate bankruptcy by up to 10 percentage points in comparison to Altman's (1968) Z-Score model.
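
    The sketch below illustrates the tabu-search idea for subset selection (flip one variable at a time, take the best non-tabu move, keep a short tabu list of recent flips); it is not Drezner et al.'s implementation, and adjusted R-squared on synthetic regression data stands in for the bankruptcy-prediction fit.

```python
# Compact tabu-search sketch for variable selection (illustrative): neighbours
# differ by flipping one variable; the best non-tabu flip is taken each step,
# and recently flipped variables stay on a short tabu list.
import numpy as np

rng = np.random.default_rng(4)
n, p = 200, 10
X = rng.standard_normal((n, p))
y = X[:, 0] - 2 * X[:, 3] + 0.5 * X[:, 7] + rng.standard_normal(n)  # 3 true predictors

def adj_r2(mask):
    if not mask.any():
        return -np.inf
    Z = np.column_stack([np.ones(n), X[:, mask]])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    rss = np.sum((y - Z @ beta) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    k = mask.sum()
    return 1 - (rss / (n - k - 1)) / (tss / (n - 1))

current = rng.integers(0, 2, p).astype(bool)
best, best_score = current.copy(), adj_r2(current)
tabu = []                                   # recently flipped variable indices
for _ in range(50):
    moves = [j for j in range(p) if j not in tabu]
    cand = []
    for j in moves:
        trial = current.copy()
        trial[j] = ~trial[j]
        cand.append((adj_r2(trial), j, trial))
    score, j, current = max(cand, key=lambda t: t[0])
    tabu = (tabu + [j])[-3:]                # tabu tenure of 3
    if score > best_score:
        best, best_score = current.copy(), score
print("selected variables:", np.flatnonzero(best))
```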

  2. The Properties of Model Selection when Retaining Theory Variables

    DEFF Research Database (Denmark)

    Hendry, David F.; Johansen, Søren

    Economic theories are often fitted directly to data to avoid possible model selection biases. We show that when a theory model specifying the correct set of m relevant exogenous variables, x_t, is embedded within the larger set of m+k candidate variables, (x_t, w_t), selection over the second set by statistical significance can be undertaken without affecting the estimator distribution of the theory parameters. This strategy returns the theory-parameter estimates when the theory is correct, yet protects against the theory being under-specified because some w_t are relevant.

  3. Fretting of AISI 9310 and selected fretting resistant surface treatments

    Science.gov (United States)

    Bill, R. C.

    1977-01-01

    Fretting wear experiments were conducted with uncoated AISI 9310 mating surfaces, and with combinations incorporating a selected coating on one of the mating surfaces. Wear measurements and SEM observations indicated that surface fatigue, as made evident by spallation and surface crack formation, is an important mechanism in promoting fretting wear of uncoated 9310. Increasing humidity resulted in accelerated fretting and a very noticeable difference in the nature of the fretting debris. Of the coatings evaluated, aluminum bronze with a polyester additive was most effective at reducing wear and minimizing fretting damage to the mating uncoated surface, by means of a self-lubricating film that developed on the fretting surfaces. Chromium plate performed as an effective protective coating, itself resisting fretting and not accelerating damage to the uncoated surface.

  4. TIME SERIES FORECASTING WITH MULTIPLE CANDIDATE MODELS: SELECTING OR COMBINING?

    Institute of Scientific and Technical Information of China (English)

    YU Lean; WANG Shouyang; K. K. Lai; Y.Nakamori

    2005-01-01

    Various mathematical models have been commonly used in time series analysis and forecasting. In practice, academic researchers and business practitioners often come up against two important problems. One is whether, for different or dissimilar modeling approaches, to select a single appropriate approach for prediction purposes or to combine the individual approaches into one forecast. The other is whether, for the same or similar modeling approaches, to select the best candidate model for forecasting or to mix the various candidate models with different parameters into a new forecast. In this study, we propose a set of computational procedures to resolve these two issues via two judgmental criteria. In view of the problems reported in the literature, a novel modeling technique is also proposed to overcome the drawbacks of existing combined forecasting methods. To verify the efficiency and reliability of the proposed procedures and modeling technique, simulations and real data examples are conducted in this study. The results reveal that the proposed procedures and modeling technique can be used as a feasible solution for time series forecasting with multiple candidate models.
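
    A toy illustration of the select-versus-combine choice is given below, under the assumption that candidate forecasts and an accuracy measure (here MSE on a validation series) are already available; the data and the inverse-MSE weighting rule are illustrative, not the procedures proposed in the paper.

```python
# Toy illustration of "select or combine": given forecasts from several
# candidate models, either keep the one with the lowest validation MSE or
# blend them with inverse-MSE weights (all data here is synthetic).
import numpy as np

rng = np.random.default_rng(5)
actual = rng.standard_normal(100).cumsum()                  # a synthetic series
forecasts = {                                               # three imperfect candidates
    "A": actual + rng.normal(0, 0.5, 100),
    "B": actual + rng.normal(0, 1.0, 100),
    "C": actual + rng.normal(0, 2.0, 100),
}

mse = {k: np.mean((f - actual) ** 2) for k, f in forecasts.items()}
selected = min(mse, key=mse.get)                            # selection strategy

w = np.array([1 / mse[k] for k in forecasts])               # combination strategy
w /= w.sum()
combined = sum(wi * forecasts[k] for wi, k in zip(w, forecasts))
print("selected:", selected, "| combined MSE:", np.mean((combined - actual) ** 2).round(3))
```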

  5. Bayesian selection of nucleotide substitution models and their site assignments.

    Science.gov (United States)

    Wu, Chieh-Hsi; Suchard, Marc A; Drummond, Alexei J

    2013-03-01

    Probabilistic inference of a phylogenetic tree from molecular sequence data is predicated on a substitution model describing the relative rates of change between character states along the tree for each site in the multiple sequence alignment. Commonly, one assumes that the substitution model is homogeneous across sites within large partitions of the alignment, assigns these partitions a priori, and then fixes their underlying substitution model to the best-fitting model from a hierarchy of named models. Here, we introduce an automatic model selection and model averaging approach within a Bayesian framework that simultaneously estimates the number of partitions, the assignment of sites to partitions, the substitution model for each partition, and the uncertainty in these selections. This new approach is implemented as an add-on to the BEAST 2 software platform. We find that this approach dramatically improves the fit of the nucleotide substitution model compared with existing approaches, and we show, using a number of example data sets, that as many as nine partitions are required to explain the heterogeneity in nucleotide substitution process across sites in a single gene analysis. In some instances, this improved modeling of the substitution process can have a measurable effect on downstream inference, including the estimated phylogeny, relative divergence times, and effective population size histories.

  6. An Integrated Model For Online shopping, Using Selective Models

    Directory of Open Access Journals (Sweden)

    Fereshteh Rabiei Dastjerdi

    Full Text Available As in traditional shopping, customer acquisition and retention are critical issues in the success of an online store. Many factors impact how, and if, customers accept online shopping. Models presented in recent years, only focus on behavioral or technolo ...

  7. Selecting global climate models for regional climate change studies

    OpenAIRE

    Pierce, David W.; Barnett, Tim P.; Santer, Benjamin D.; Gleckler, Peter J.

    2009-01-01

    Regional or local climate change modeling studies currently require starting with a global climate model, then downscaling to the region of interest. How should global models be chosen for such studies, and what effect do such choices have? This question is addressed in the context of a regional climate detection and attribution (D&A) study of January-February-March (JFM) temperature over the western U.S. Models are often selected for a regional D&A analysis based on the quality of the simula...

  8. Treatment of acne with oral contraceptives: criteria for pill selection.

    Science.gov (United States)

    Koulianos, G T

    2000-10-01

    Combination oral contraceptives (OCs) (those that contain estrogen and progestin) are widely used in the treatment of acne because they modify an excessively androgenic hormonal environment and can decrease lesions. Dermatologists' knowledge of the most appropriate OC may be hampered by an incomplete understanding of these agents, misleading promotion, and confusion surrounding the new generation of OCs. Despite reports attributing significance to the degree of androgenicity of the progestin components of OCs, in vitro and animal bioassays of androgenicity have little clinical relevance. Because all of today's low-dose combination OCs are estrogen dominant, they are equally beneficial in women with androgenic conditions such as acne. Use of the OC containing the lowest dose of each hormone, consistent with the patient's needs, can enhance compliance by preventing or limiting common early-cycle side effects (e.g., nausea/vomiting, breast tenderness, weight gain, headache), while providing acne improvement.

  9. Selection of approach and fixation in the treatment of

    Directory of Open Access Journals (Sweden)

    QI Xin

    2010-06-01

    Full Text Available Unlike the common distal humeral fracture in children, the prevalence of fractures at this site is not high in adults, accounting for only 2% of all fractures. Moreover, intra-articular fractures, especially the type C fracture, are rare and usually caused by high energy injury. After these fractures occur, there are problems of disconnection between the condyles and diaphysis, and of disruption between the medial and lateral condyles. The two condyle fragments rotate along their own axes and the trochlea is usually intact. All these factors create difficulties for reduction and fixation. The currently employed surgical approaches are usually not able to provide enough exposure, and sufficient stability cannot be achieved by internal fixation to allow early functional exercises. There are still difficulties in the management of intra-articular fractures of the distal humerus in adults [1, 2]. Our objective is to study the surgical treatment of distal humeral fractures in adults through patient follow-up.

  10. Spatial Fleming-Viot models with selection and mutation

    CERN Document Server

    Dawson, Donald A

    2014-01-01

    This book constructs a rigorous framework for analysing selected phenomena in evolutionary theory of populations arising due to the combined effects of migration, selection and mutation in a spatial stochastic population model, namely the evolution towards fitter and fitter types through punctuated equilibria. The discussion is based on a number of new methods, in particular multiple scale analysis, nonlinear Markov processes and their entrance laws, atomic measure-valued evolutions and new forms of duality (for state-dependent mutation and multitype selection) which are used to prove ergodic theorems in this context and are applicable for many other questions and renormalization analysis for a variety of phenomena (stasis, punctuated equilibrium, failure of naive branching approximations, biodiversity) which occur due to the combination of rare mutation, mutation, resampling, migration and selection and make it necessary to mathematically bridge the gap (in the limit) between time and space scales.

  11. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions; these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
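    The quantities at the heart of this approach are easy to compute once a model's maximized log-likelihood is known. Below is a minimal sketch (not code from the book) of AIC and its small-sample correction AICc used to compare two candidate distributions; the data and candidate models are invented for illustration.

```python
import numpy as np
from scipy import stats

def aic(log_lik, k):
    """Akaike Information Criterion: -2 ln(L) + 2k."""
    return -2.0 * log_lik + 2.0 * k

def aicc(log_lik, k, n):
    """Small-sample corrected AIC."""
    return aic(log_lik, k) + 2.0 * k * (k + 1) / (n - k - 1)

# Hypothetical data: choose between a normal and a gamma model.
rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=3.0, size=80)

candidates = {}
mu, sigma = stats.norm.fit(x)
candidates["normal"] = (stats.norm.logpdf(x, mu, sigma).sum(), 2)
a, loc, scale = stats.gamma.fit(x, floc=0)
candidates["gamma"] = (stats.gamma.logpdf(x, a, loc, scale).sum(), 2)

n = len(x)
scores = {name: aicc(ll, k, n) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
for name, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name:7s}  AICc = {s:8.2f}  delta = {s - scores[best]:6.2f}")
```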

  12. Selecting an optimal mixed products using grey relationship model

    Directory of Open Access Journals (Sweden)

    Farshad Faezy Razi

    2013-06-01

    Full Text Available This paper presents an integrated supplier selection and inventory management using grey relationship model (GRM) as well as multi-objective decision making process. The proposed model of this paper first ranks different suppliers based on the GRM technique and then determines the optimum level of inventory by considering different objectives. To show the implementation of the proposed model, we use some benchmark data presented by Talluri and Baker [Talluri, S., & Baker, R. C. (2002). A multi-phase mathematical programming approach for effective supply chain design. European Journal of Operational Research, 141(3), 544-558]. The preliminary results indicate that the proposed model of this paper is capable of handling different criteria for supplier selection.

  13. Replication-selective oncolytic viruses in the treatment of cancer.

    Science.gov (United States)

    Everts, Bart; van der Poel, Henk G

    2005-02-01

    In the search for novel strategies, oncolytic virotherapy has recently emerged as a viable approach to specifically kill tumor cells. Unlike conventional gene therapy, it uses replication competent viruses that are able to spread through tumor tissue by virtue of viral replication and concomitant cell lysis. Recent advances in molecular biology have allowed the design of several genetically modified viruses, such as adenovirus and herpes simplex virus, that specifically replicate in, and kill, tumor cells. On the other hand, viruses with intrinsic oncolytic capacity are also being evaluated for therapeutic purposes. In this review, an overview is given of the general mechanisms and genetic modifications by which these viruses achieve tumor cell-specific replication and antitumor efficacy. However, although the oncolytic efficacy of these approaches has generally been demonstrated in preclinical studies, the therapeutic efficacy in clinical trials is still not optimal. Therefore, strategies are evaluated that could further enhance the oncolytic potential of conditionally replicating viruses. In this respect, the use of tumor-selective viruses in conjunction with other standard therapies seems most promising. However, several hurdles regarding clinical limitations and safety issues still need to be overcome before this mode of therapy can become clinically relevant.

  14. A topic evolution model with sentiment and selective attention

    Science.gov (United States)

    Si, Xia-Meng; Wang, Wen-Dong; Zhai, Chun-Qing; Ma, Yan

    2017-04-01

    Topic evolution is a hybrid dynamics of information propagation and opinion interaction. The dynamics of opinion interaction is inherently interwoven with the dynamics of information propagation in the network, owing to the bidirectional influences between interaction and diffusion. The degree of sentiment determines whether the topic can continue to spread from a node, and selective attention determines the direction of information flow and the selection of communicatees. To this end, we put forward a sentiment-based mixed dynamics model with selective attention and apply Bayesian updating rules to it. Our model can also indirectly describe isolated users, who for various reasons remain isolated from a topic even though everybody around them has heard about it. Numerical simulations show that more insiders initially and fewer simultaneous spreaders can lessen extremism. To promote topic diffusion or restrain the prevalence of extremism, fewer agents with constructive motivation and more agents with no involving motivation are encouraged.

  15. Evidence accumulation as a model for lexical selection.

    Science.gov (United States)

    Anders, R; Riès, S; van Maanen, L; Alario, F X

    2015-11-01

    We propose and demonstrate evidence accumulation as a plausible theoretical and/or empirical model for the lexical selection process of lexical retrieval. A number of current psycholinguistic theories consider lexical selection as a process related to selecting a lexical target from a number of alternatives, which each have varying activations (or signal supports), that are largely resultant of an initial stimulus recognition. We thoroughly present a case for how such a process may be theoretically explained by the evidence accumulation paradigm, and we demonstrate how this paradigm can be directly related or combined with conventional psycholinguistic theory and their simulatory instantiations (generally, neural network models). Then with a demonstrative application on a large new real data set, we establish how the empirical evidence accumulation approach is able to provide parameter results that are informative to leading psycholinguistic theory, and that motivate future theoretical development. Copyright © 2015 Elsevier Inc. All rights reserved.
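    As an illustration of the general idea (not the specific model fitted in the paper), the sketch below races a few independent noisy accumulators, one per lexical candidate, and selects whichever reaches threshold first; drift rates, threshold and noise level are invented values.

```python
import numpy as np

def race_trial(drifts, threshold=1.0, noise_sd=0.35, dt=0.01, rng=None):
    """Race of independent noisy accumulators: each candidate word gathers
    evidence at its own drift rate; the first to reach threshold is selected.
    Returns (winning index, response time)."""
    rng = rng or np.random.default_rng()
    evidence = np.zeros(len(drifts))
    t = 0.0
    while evidence.max() < threshold:
        evidence += (np.asarray(drifts) * dt
                     + noise_sd * np.sqrt(dt) * rng.standard_normal(len(drifts)))
        evidence = np.maximum(evidence, 0.0)   # no negative evidence
        t += dt
    return int(np.argmax(evidence)), t

# Hypothetical lexical candidates: the target has the highest activation/drift.
drifts = [0.9, 0.5, 0.4]          # target, close competitor, distractor
rng = np.random.default_rng(1)
outcomes = [race_trial(drifts, rng=rng) for _ in range(2000)]
winners = np.array([w for w, _ in outcomes])
rts = np.array([t for _, t in outcomes])
print("P(correct selection) =", np.mean(winners == 0).round(3))
print("mean RT correct =", rts[winners == 0].mean().round(3), "s")
print("mean RT errors  =", rts[winners != 0].mean().round(3), "s")
```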

  16. Second-order model selection in mixture experiments

    Energy Technology Data Exchange (ETDEWEB)

    Redgate, P.E.; Piepel, G.F.; Hrma, P.R.

    1992-07-01

    Full second-order models for q-component mixture experiments contain q(q+1)/2 terms, which increases rapidly as q increases. Fitting full second-order models for larger q may involve problems with ill-conditioning and overfitting. These problems can be remedied by transforming the mixture components and/or fitting reduced forms of the full second-order mixture model. Various component transformation and model reduction approaches are discussed. Data from a 10-component nuclear waste glass study are used to illustrate ill-conditioning and overfitting problems that can be encountered when fitting a full second-order mixture model. Component transformation, model term selection, and model evaluation/validation techniques are discussed and illustrated for the waste glass example.
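    The q(q+1)/2 term count comes from q linear blending terms plus q(q-1)/2 cross-products in the Scheffé quadratic mixture model. A small sketch of that construction (generic, not the waste-glass analysis) follows.

```python
import itertools
import numpy as np

def scheffe_quadratic_terms(X):
    """Design matrix for the full second-order (Scheffe) mixture model:
    q linear blending terms x_i plus q(q-1)/2 cross products x_i*x_j,
    i.e. q(q+1)/2 terms in total (no intercept, since sum(x) = 1)."""
    X = np.asarray(X, dtype=float)
    q = X.shape[1]
    cols = [X[:, i] for i in range(q)]
    names = [f"x{i+1}" for i in range(q)]
    for i, j in itertools.combinations(range(q), 2):
        cols.append(X[:, i] * X[:, j])
        names.append(f"x{i+1}*x{j+1}")
    return np.column_stack(cols), names

# Term count grows quadratically with the number of components q.
for q in (3, 5, 10):
    print(q, "components ->", q * (q + 1) // 2, "second-order terms")

# Hypothetical 3-component mixture design (rows sum to 1).
design = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                   [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
M, names = scheffe_quadratic_terms(design)
print(names)          # ['x1', 'x2', 'x3', 'x1*x2', 'x1*x3', 'x2*x3']
print(M.shape)        # (6, 6)
```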

  17. Selecting crop models for decision making in wheat insurance

    NARCIS (Netherlands)

    Castaneda Vera, A.; Leffelaar, P.A.; Alvaro-Fuentes, J.; Cantero-Martinez, C.; Minguez, M.I.

    2015-01-01

    In crop insurance, the accuracy with which the insurer quantifies the actual risk is highly dependent on the availability of actual yield data. Crop models might be valuable tools to generate data on expected yields for risk assessment when no historical records are available. However, selecting a c

  18. Cross-validation criteria for SETAR model selection

    NARCIS (Netherlands)

    de Gooijer, J.G.

    2001-01-01

    Three cross-validation criteria, denoted C, C_c, and C_u are proposed for selecting the orders of a self-exciting threshold autoregressive (SETAR) model when both the delay and the threshold value are unknown. The derivation of C is within a natural cross-validation framework. The criterion C_c is si

  19. Lightweight Graphical Models for Selectivity Estimation Without Independence Assumptions

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2011-01-01

    ’s optimizers are frequently caused by missed correlations between attributes. We present a selectivity estimation approach that does not make the independence assumptions. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution of all...

  1. Treatment of pathological gambling - integrative systemic model.

    Science.gov (United States)

    Mladenović, Ivica; Lažetić, Goran; Lečić-Toševski, Dušica; Dimitrijević, Ivan

    2015-03-01

    Pathological gambling was classified under impulse control disorders within the International Classification of Diseases (ICD-10) (WHO 1992), but the most recent Diagnostic and Statistical Manual, 5th edition (DSM-V) (APA 2013), has recognized pathological gambling as the first disorder within a new diagnostic category of behavioral addictions - Gambling disorder. Pathological gambling is a disorder in progression, and we hope that our experience in the treatment of pathological gambling in the Daily Hospital for Addictions at the Institute of Mental Health, through the original "Integrative-systemic model", will be of use to colleagues dealing with this pathology. This model of treatment of pathological gambling is based on a multi-systemic approach and primarily represents an integration of family and cognitive-behavioral therapy, with elements of psychodynamic and existential therapy and pharmacotherapy. The model is based on the book "Pathological gambling - with self-help manual" by Dr Mladenovic and Dr Lazetic, and has been designed in the form of a program that lasts 10 weeks in the intensive phase and then continues for two years in the form of "extended treatment" ("after care"). The intensive phase is divided into three segments: educational; insight with initial changes; and analysis of the achieved changes, with the definition of plans and areas that need to be addressed in the extended treatment. The "extended treatment" lasts for two years in the form of group therapy, during which there is a second-order change in the identified patient, but also in other family members. Pathological gambling has been treated in the form of systemic family therapy for more than 10 years at the Institute of Mental Health (IMH) in Belgrade. For the second year in a row the treatment has been carried out using the modern "Integrative-systemic model". If abstinence from gambling within the period of one year after completion of the intensive phase of treatment is taken as the main criterion of

  2. Accurate model selection of relaxed molecular clocks in bayesian phylogenetics.

    Science.gov (United States)

    Baele, Guy; Li, Wai Lok Sibon; Drummond, Alexei J; Suchard, Marc A; Lemey, Philippe

    2013-02-01

    Recent implementations of path sampling (PS) and stepping-stone sampling (SS) have been shown to outperform the harmonic mean estimator (HME) and a posterior simulation-based analog of Akaike's information criterion through Markov chain Monte Carlo (AICM) in Bayesian model selection of demographic and molecular clock models. Almost simultaneously, a Bayesian model averaging approach was developed that avoids conditioning on a single model but averages over a set of relaxed clock models. This approach returns estimates of the posterior probability of each clock model through which one can estimate the Bayes factor in favor of the maximum a posteriori (MAP) clock model; however, this Bayes factor estimate may suffer when the posterior probability of the MAP model approaches 1. Here, we compare these two recent developments with the HME, stabilized/smoothed HME (sHME), and AICM, using both synthetic and empirical data. Our comparison shows reassuringly that MAP identification and its Bayes factor provide similar performance to PS and SS and that these approaches considerably outperform HME, sHME, and AICM in selecting the correct underlying clock model. We also illustrate the importance of using proper priors on a large set of empirical data sets.

  3. Rank-based model selection for multiple ions quantum tomography

    Science.gov (United States)

    Guţă, Mădălin; Kypraios, Theodore; Dryden, Ian

    2012-10-01

    The statistical analysis of measurement data has become a key component of many quantum engineering experiments. As standard full state tomography becomes unfeasible for large dimensional quantum systems, one needs to exploit prior information and the ‘sparsity’ properties of the experimental state in order to reduce the dimensionality of the estimation problem. In this paper we propose model selection as a general principle for finding the simplest, or most parsimonious, explanation of the data, by fitting different models and choosing the estimator with the best trade-off between likelihood fit and model complexity. We apply two well-established model selection methods—the Akaike information criterion (AIC) and the Bayesian information criterion (BIC)—to models consisting of states of fixed rank and to datasets such as those currently produced in multiple-ion experiments. We test the performance of AIC and BIC on randomly chosen low-rank states of four ions, and study the dependence of the selected rank on the number of measurement repetitions for one-ion states. We then apply the methods to real data from a four-ion experiment aimed at creating a Smolin state of rank 4. By applying the two methods together with the Pearson χ2 test we conclude that the data can be suitably described with a model whose rank is between 7 and 9. Additionally we find that the mean square error of the maximum likelihood estimator for pure states is close to that of the optimal over all possible measurements.

  4. Treatment of severe chronic hypotonic hyponatremia: a new treatment model

    Directory of Open Access Journals (Sweden)

    Antonio Burgio

    2013-03-01

    Full Text Available Recommended treatment of severe hypotonic hyponatremia is based on the infusion of 3% sodium chloride solution, with a daily correction rate below 10 mEq/L of sodium concentration, according to the Adrogué and Madias formula that includes the current desired change in sodium concentrations. However, such treatment needs close monitoring of the rate of infusion and does not take into account the body weight or age of the patient. This may result in hypercorrection and neurological damage. We made an inverse calculation using the same algorithms of the Adrogué and Madias formula to estimate the number of vials of sodium chloride needed to reach a correction rate of the serum sodium concentration below 0.4 mEq/h, taking into account the body weight and age of the patient. Three tables have been produced, each containing the number of vials to be infused, according to the patient’s age and body weight, the serum sodium concentration, and the rate of correction over 24 h to avoid the risk of brain damage. We propose a new practical model to calculate the need of sodium chloride infusate to safely correct the hyponatremia. The tables make treatment easier to manage in daily clinical practice in a wide range of patient ages and body weights.
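    A rough sketch of the kind of inverse calculation described is given below, using the standard Adrogué-Madias relation (rise in serum sodium per litre of infusate = (infusate Na − serum Na)/(total body water + 1)). The body-water coefficients, the 513 mEq/L concentration assumed for 3% saline and the assumed 10 mL vial size are illustrative assumptions, not the values tabulated in the paper, and this is not clinical guidance.

```python
def total_body_water(weight_kg, age, sex="male"):
    """Rough total body water estimate (fraction of body weight).
    Coefficients are the commonly quoted ones; the paper's tables may differ."""
    if sex == "male":
        frac = 0.6 if age < 65 else 0.5
    else:
        frac = 0.5 if age < 65 else 0.45
    return frac * weight_kg

def adrogue_madias_delta_na(serum_na, infusate_na, tbw_l):
    """Expected rise in serum sodium (mEq/L) per litre of infusate."""
    return (infusate_na - serum_na) / (tbw_l + 1.0)

def vials_for_target_rate(serum_na, weight_kg, age, sex="male",
                          target_rate=0.4,      # mEq/L per hour, as in the abstract
                          hours=24.0,
                          infusate_na=513.0,    # assumed 3% NaCl, mEq/L
                          vial_volume_l=0.01):  # assumed 10 mL vial size
    tbw = total_body_water(weight_kg, age, sex)
    rise_per_litre = adrogue_madias_delta_na(serum_na, infusate_na, tbw)
    target_rise = target_rate * hours            # total desired rise over the period
    litres_needed = target_rise / rise_per_litre
    return litres_needed / vial_volume_l

# Illustrative only: 70 kg, 50-year-old man with serum Na 112 mEq/L.
print(round(vials_for_target_rate(112, 70, 50), 1), "vials over 24 h")
```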

  5. Analysis on the model selection and influence factors of manure treatment in dairy farms: a case study of Beijing

    Institute of Scientific and Technical Information of China (English)

    丁凡琳; 董晓霞; 王建芬; 姜小平

    2015-01-01

    Based on survey data from 107 dairy farms in five districts (counties) of Beijing (Yanqing, Changping, Daxing, Fangshan and Miyun) and in the Beijing Capital Agribusiness Group, the factors affecting the choice of manure treatment model on Beijing dairy farms were analyzed with a multivariate Logit model. The results suggest that breeding scale is the main factor affecting the choice of manure treatment mode. Taking the return-to-field pattern as the reference, the completeness of related equipment significantly affected the choice of the industrial processing mode, while the comprehensive treatment model was more influenced by the quality of the legal person and by subsidies. Based on the model conclusions, an appropriate manure treatment mode should be chosen according to the actual situation of farms of different scales, the environmental protection awareness of legal persons should be strengthened, policy support should be optimized, and financial support should be provided to enterprises as appropriate.

  6. Selective refinement and selection of near-native models in protein structure prediction.

    Science.gov (United States)

    Zhang, Jiong; Barz, Bogdan; Zhang, Jingfen; Xu, Dong; Kosztin, Ioan

    2015-10-01

    In recent years in silico protein structure prediction has reached a level where fully automated servers can generate large pools of near-native structures. However, the identification and further refinement of the best structures from the pool of models remain problematic. To address these issues, we have developed (i) a target-specific selective refinement (SR) protocol and (ii) a molecular dynamics (MD) simulation-based ranking (SMDR) method. In SR, the all-atom refinement of structures is accomplished via the Rosetta Relax protocol, subject to specific constraints determined by the size and complexity of the target. The best-refined models are selected with SMDR by testing their relative stability against gradual heating through all-atom MD simulations. Through extensive testing we have found that Mufold-MD, our fully automated protein structure prediction server updated with the SR and SMDR modules, consistently outperformed its previous versions.

  7. Model selection for the extraction of movement primitives.

    Science.gov (United States)

    Endres, Dominik M; Chiovetto, Enrico; Giese, Martin A

    2013-01-01

    A wide range of blind source separation methods have been used in motor control research for the extraction of movement primitives from EMG and kinematic data. Popular examples are principal component analysis (PCA), independent component analysis (ICA), anechoic demixing, and the time-varying synergy model (d'Avella and Tresch, 2002). However, choosing the parameters of these models, or indeed choosing the type of model, is often done in a heuristic fashion, driven by result expectations as much as by the data. We propose an objective criterion which allows selection of the model type, the number of primitives and the temporal smoothness prior. Our approach is based on a Laplace approximation to the posterior distribution of the parameters of a given blind source separation model, re-formulated as a Bayesian generative model. We first validate our criterion on ground truth data, showing that it performs at least as well as traditional model selection criteria [the Bayesian information criterion, BIC (Schwarz, 1978), and the Akaike Information Criterion, AIC (Akaike, 1974)]. Then, we analyze human gait data, finding that an anechoic mixture model with a temporal smoothness constraint on the sources can best account for the data.

  8. Model selection for the extraction of movement primitives

    Directory of Open Access Journals (Sweden)

    Dominik M Endres

    2013-12-01

    Full Text Available A wide range of blind source separation methods have been used in motor control research for the extraction of movement primitives from EMG and kinematic data. Popular examples are principal component analysis (PCA), independent component analysis (ICA), anechoic demixing, and the time-varying synergy model. However, choosing the parameters of these models, or indeed choosing the type of model, is often done in a heuristic fashion, driven by result expectations as much as by the data. We propose an objective criterion which allows selection of the model type, the number of primitives and the temporal smoothness prior. Our approach is based on a Laplace approximation to the posterior distribution of the parameters of a given blind source separation model, re-formulated as a Bayesian generative model. We first validate our criterion on ground truth data, showing that it performs at least as well as traditional model selection criteria (the Bayesian information criterion, BIC, and the Akaike Information Criterion, AIC). Then, we analyze human gait data, finding that an anechoic mixture model with a temporal smoothness constraint on the sources can best account for the data.

  9. Sequential Salinomycin Treatment Results in Resistance Formation through Clonal Selection of Epithelial-Like Tumor Cells

    Directory of Open Access Journals (Sweden)

    Florian Kopp

    2014-12-01

    Full Text Available Acquiring therapy resistance is one of the major obstacles in the treatment of patients with cancer. The discovery of the cancer stem cell (CSC)-specific drug salinomycin raised hope for improved treatment options by targeting therapy-refractory CSCs and mesenchymal cancer cells. However, the occurrence of an acquired salinomycin resistance in tumor cells remains elusive. To study the formation of salinomycin resistance, mesenchymal breast cancer cells were sequentially treated with salinomycin in an in vitro cell culture assay, and the resulting differences in gene expression and salinomycin susceptibility were analyzed. We demonstrated that long-term salinomycin treatment of mesenchymal cancer cells resulted in salinomycin-resistant cells with elevated levels of epithelial markers, such as E-cadherin and miR-200c, a decreased migratory capability, and a higher susceptibility to the classic chemotherapeutic drug doxorubicin. The formation of salinomycin resistance through the acquisition of epithelial traits was further validated by inducing mesenchymal-epithelial transition through an overexpression of miR-200c. The transition from a mesenchymal to a more epithelial-like phenotype of salinomycin-treated tumor cells was moreover confirmed in vivo, using syngeneic and, for the first time, transgenic mouse tumor models. These results suggest that the acquisition of salinomycin resistance through the clonal selection of epithelial-like cancer cells could become exploited for improved cancer therapies by antagonizing the tumor-progressive effects of epithelial-mesenchymal transition.

  10. Sequential Salinomycin Treatment Results in Resistance Formation through Clonal Selection of Epithelial-Like Tumor Cells.

    Science.gov (United States)

    Kopp, Florian; Hermawan, Adam; Oak, Prajakta Shirish; Ulaganathan, Vijay Kumar; Herrmann, Annika; Elnikhely, Nefertiti; Thakur, Chitra; Xiao, Zhiguang; Knyazev, Pjotr; Ataseven, Beyhan; Savai, Rajkumar; Wagner, Ernst; Roidl, Andreas

    2014-12-01

    Acquiring therapy resistance is one of the major obstacles in the treatment of patients with cancer. The discovery of the cancer stem cell (CSC)-specific drug salinomycin raised hope for improved treatment options by targeting therapy-refractory CSCs and mesenchymal cancer cells. However, the occurrence of an acquired salinomycin resistance in tumor cells remains elusive. To study the formation of salinomycin resistance, mesenchymal breast cancer cells were sequentially treated with salinomycin in an in vitro cell culture assay, and the resulting differences in gene expression and salinomycin susceptibility were analyzed. We demonstrated that long-term salinomycin treatment of mesenchymal cancer cells resulted in salinomycin-resistant cells with elevated levels of epithelial markers, such as E-cadherin and miR-200c, a decreased migratory capability, and a higher susceptibility to the classic chemotherapeutic drug doxorubicin. The formation of salinomycin resistance through the acquisition of epithelial traits was further validated by inducing mesenchymal-epithelial transition through an overexpression of miR-200c. The transition from a mesenchymal to a more epithelial-like phenotype of salinomycin-treated tumor cells was moreover confirmed in vivo, using syngeneic and, for the first time, transgenic mouse tumor models. These results suggest that the acquisition of salinomycin resistance through the clonal selection of epithelial-like cancer cells could become exploited for improved cancer therapies by antagonizing the tumor-progressive effects of epithelial-mesenchymal transition.

  11. How many separable sources? Model selection in independent components analysis.

    Science.gov (United States)

    Woods, Roger P; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.
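    The cross-validation idea mentioned above can be illustrated compactly outside the mixed ICA/PCA setting; the sketch below (not the authors' code) chooses the number of retained components by cross-validated log-likelihood under scikit-learn's probabilistic PCA, on synthetic data with three true latent directions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score

# Hypothetical data: 3 strong latent directions buried in 10-dimensional noise.
rng = np.random.default_rng(0)
latent = rng.standard_normal((500, 3))
loading = rng.standard_normal((3, 10))
X = latent @ loading + 0.3 * rng.standard_normal((500, 10))

# PCA.score() returns the average log-likelihood under the probabilistic
# PCA model, so cross_val_score gives a held-out likelihood per fold.
for k in range(1, 7):
    ll = cross_val_score(PCA(n_components=k), X, cv=5).mean()
    print(f"{k} components: mean held-out log-likelihood = {ll:8.3f}")
# The curve typically flattens (or peaks) at the true dimensionality, here 3.
```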

  12. Statistical modelling in biostatistics and bioinformatics selected papers

    CERN Document Server

    Peng, Defen

    2014-01-01

    This book presents selected papers on statistical model development related mainly to the fields of Biostatistics and Bioinformatics. The coverage of the material falls squarely into the following categories: (a) Survival analysis and multivariate survival analysis, (b) Time series and longitudinal data analysis, (c) Statistical model development and (d) Applied statistical modelling. Innovations in statistical modelling are presented throughout each of the four areas, with some intriguing new ideas on hierarchical generalized non-linear models and on frailty models with structural dispersion, just to mention two examples. The contributors include distinguished international statisticians such as Philip Hougaard, John Hinde, Il Do Ha, Roger Payne and Alessandra Durio, among others, as well as promising newcomers. Some of the contributions have come from researchers working in the BIO-SI research programme on Biostatistics and Bioinformatics, centred on the Universities of Limerick and Galway in Ireland and fu...

  13. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. ... Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.

  14. Culture, Personality, Health, and Family Dynamics: Cultural Competence in the Selection of Culturally Sensitive Treatments

    Science.gov (United States)

    Sperry, Len

    2010-01-01

    Cultural sensitivity and cultural competence in the selection of culturally sensitive treatments is a requisite for effective counseling practice in working with diverse clients and their families, particularly when clients present with health issues or medical problems. Described here is a strategy for selecting culturally sensitive treatments…

  15. PROPOSAL OF AN EMPIRICAL MODEL FOR SUPPLIERS SELECTION

    Directory of Open Access Journals (Sweden)

    Paulo Ávila

    2015-03-01

    Full Text Available The problem of selecting suppliers/partners is a crucial and important part of the decision-making process for companies that intend to perform competitively in their area of activity. The selection of a supplier/partner is a time- and resource-consuming task that involves data collection and a careful analysis of the factors that can positively or negatively influence the choice. Nevertheless, it is a critical process that significantly affects the operational performance of each company. In this work, through a literature review, five broad supplier selection criteria were identified: Quality, Financial, Synergies, Cost, and Production System. Within these criteria, five sub-criteria were also included. Thereafter, a survey was prepared and companies were contacted in order to indicate which factors have more relevance in their decisions to choose suppliers. After interpreting the results and processing the data, a linear weighting model was adopted to reflect the importance of each factor. The model has a hierarchical structure and can be applied with the Analytic Hierarchy Process (AHP) method or the Simple Multi-Attribute Rating Technique (SMART). The result of the research undertaken by the authors is a reference model that provides decision-making support for the supplier/partner selection process.
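    A minimal sketch of the linear weighting step (SMART-style aggregation) follows; the criteria match those listed above, but the weights and supplier ratings are hypothetical stand-ins for the survey-derived values.

```python
import numpy as np

# Criteria identified in the paper; the weights below are hypothetical,
# standing in for the survey-derived importance values.
criteria = ["Quality", "Financial", "Synergies", "Cost", "Production System"]
weights = np.array([0.30, 0.15, 0.15, 0.25, 0.15])   # must sum to 1

# Hypothetical supplier ratings on a 0-10 scale (rows: suppliers, cols: criteria).
suppliers = ["Supplier A", "Supplier B", "Supplier C"]
ratings = np.array([
    [8, 6, 5, 7, 6],
    [6, 8, 7, 9, 5],
    [9, 5, 6, 4, 8],
])

# SMART-style aggregation: weighted sum of normalised ratings.
scores = (ratings / 10.0) @ weights
for name, s in sorted(zip(suppliers, scores), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.3f}")
```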

  16. Supplier Selection in Virtual Enterprise Model of Manufacturing Supply Network

    Science.gov (United States)

    Kaihara, Toshiya; Opadiji, Jayeola F.

    The market-based approach to manufacturing supply network planning focuses on the competitive attitudes of various enterprises in the network to generate plans that seek to maximize the throughput of the network. It is this competitive behaviour of the member units that we explore in proposing a solution model for a supplier selection problem in convergent manufacturing supply networks. We present a formulation of autonomous units of the network as trading agents in a virtual enterprise network interacting to deliver value to market consumers and discuss the effect of internal and external trading parameters on the selection of suppliers by enterprise units.

  17. Efficiency of model selection criteria in flood frequency analysis

    Science.gov (United States)

    Calenda, G.; Volpi, E.

    2009-04-01

    The estimation of high flood quantiles requires the extrapolation of the probability distributions far beyond the usual sample length, involving high estimation uncertainties. The choice of the probability law, traditionally based on hypothesis testing, is critical in this respect. In this study the efficiency of different model selection criteria, seldom applied in flood frequency analysis, is investigated. The efficiency of each criterion in identifying the probability distribution of the hydrological extremes is evaluated by numerical simulations for different parent distributions, coefficients of variation and skewness, and sample sizes. The compared model selection procedures are the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), the Anderson-Darling Criterion (ADC) recently discussed by Di Baldassarre et al. (2008), and the Sample Quantile Criterion (SQC), recently proposed by the authors (Calenda et al., 2009). The SQC is based on the principle of maximising the probability density of the elements of the sample that are considered relevant to the problem, and takes into account both the accuracy and the uncertainty of the estimate. Since the stress is mainly on extreme events, the SQC involves upper-tail probabilities, where the effect of the model assumption is more critical. The proposed index is equal to the sum of the logarithms of the inverse of the sample probability density of the observed quantiles. The definition of this index is based on the principle that the more centred the sample value is with respect to its density distribution (accuracy of the estimate) and the less spread this distribution is (uncertainty of the estimate), the greater the probability density of the sample quantile. Thus, lower values of the index indicate a better performance of the distribution law. This criterion can perform the selection of the optimum distribution among competing probability models that are estimated using different samples. The
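    One plausible reading of such an index, as a sum of log inverse sampling densities of selected upper order statistics under each candidate law, can be sketched as below; the estimation details in the cited papers may differ, and the sample and candidate distributions here are purely illustrative.

```python
import numpy as np
from scipy import stats
from scipy.special import gammaln

def log_order_stat_density(x, r, n, dist):
    """Log density of the r-th order statistic (1-indexed) out of n
    under the fitted distribution `dist` (a frozen scipy.stats object)."""
    logF, logS = dist.logcdf(x), dist.logsf(x)
    logc = gammaln(n + 1) - gammaln(r) - gammaln(n - r + 1)
    return logc + (r - 1) * logF + (n - r) * logS + dist.logpdf(x)

def sqc_like_index(sample, dist, top_k=5):
    """Sum of log inverse sampling densities of the top_k largest observations:
    lower values indicate a better-performing distribution in the upper tail."""
    x = np.sort(sample)
    n = len(x)
    ranks = range(n - top_k + 1, n + 1)
    return -sum(log_order_stat_density(x[r - 1], r, n, dist) for r in ranks)

# Hypothetical annual maxima; compare candidate flood-frequency laws.
q = stats.gumbel_r.rvs(loc=300, scale=120, size=60, random_state=42)
candidates = {
    "Gumbel":    stats.gumbel_r(*stats.gumbel_r.fit(q)),
    "Lognormal": stats.lognorm(*stats.lognorm.fit(q, floc=0)),
    "GEV":       stats.genextreme(*stats.genextreme.fit(q)),
}
for name, dist in candidates.items():
    print(f"{name:9s} index = {sqc_like_index(q, dist):8.2f}")
```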

  18. A model-based approach to selection of tag SNPs

    Directory of Open Access Journals (Sweden)

    Sun Fengzhu

    2006-06-01

    Full Text Available Abstract Background Single Nucleotide Polymorphisms (SNPs) are the most common type of polymorphisms found in the human genome. Effective genetic association studies require the identification of sets of tag SNPs that capture as much haplotype information as possible. Tag SNP selection is analogous to the problem of data compression in information theory. According to Shannon's framework, the optimal tag set maximizes the entropy of the tag SNPs subject to constraints on the number of SNPs. This approach requires an appropriate probabilistic model. Compared to simple measures of Linkage Disequilibrium (LD), a good model of haplotype sequences can more accurately account for LD structure. It also provides machinery for the prediction of tagged SNPs and thereby to assess the performance of tag sets through their ability to predict larger SNP sets. Results Here, we compute the description code-lengths of SNP data for an array of models and we develop tag SNP selection methods based on these models and the strategy of entropy maximization. Using data sets from the HapMap and ENCODE projects, we show that the hidden Markov model introduced by Li and Stephens outperforms the other models in several aspects: description code-length of SNP data, information content of tag sets, and prediction of tagged SNPs. This is the first use of this model in the context of tag SNP selection. Conclusion Our study provides strong evidence that the tag sets selected by our best method, based on the Li and Stephens model, outperform those chosen by several existing methods. The results also suggest that information content evaluated with a good model is more sensitive for assessing the quality of a tagging set than the correct prediction rate of tagged SNPs. Besides, we show that haplotype phase uncertainty has an almost negligible impact on the ability of good tag sets to predict tagged SNPs. This justifies the selection of tag SNPs on the basis of haplotype
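    The entropy-maximization strategy can be illustrated with a toy greedy selector that repeatedly adds the SNP giving the largest increase in the joint empirical entropy of the tag set; the paper instead scores sets through model-based description code-lengths (e.g. the Li and Stephens model), and the haplotype data below are synthetic.

```python
import numpy as np
from collections import Counter

def joint_entropy(haplotypes, cols):
    """Empirical Shannon entropy (bits) of the multilocus patterns formed
    by the SNP columns `cols` across all haplotypes."""
    patterns = Counter(tuple(row) for row in haplotypes[:, cols])
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def greedy_tag_snps(haplotypes, n_tags):
    """Greedily add the SNP that most increases the joint entropy of the tag set."""
    selected = []
    remaining = list(range(haplotypes.shape[1]))
    for _ in range(n_tags):
        gains = [(joint_entropy(haplotypes, selected + [j]), j) for j in remaining]
        best_h, best_j = max(gains)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Hypothetical phased haplotypes: 200 haplotypes x 12 SNPs; the last 8 SNPs are
# near-copies of the first 4 (strong LD), so they add little once one is chosen.
rng = np.random.default_rng(3)
base = rng.integers(0, 2, size=(200, 4))
noise = (rng.random((200, 8)) < 0.05).astype(int)
ld_copies = base[:, [0, 0, 1, 1, 2, 2, 3, 3]] ^ noise
hap = np.hstack([base, ld_copies])

tags = greedy_tag_snps(hap, n_tags=4)
print("selected tag SNPs:", tags)
print("joint entropy of tag set: %.2f bits" % joint_entropy(hap, tags))
```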

  19. Models of cultural niche construction with selection and assortative mating.

    Science.gov (United States)

    Creanza, Nicole; Fogarty, Laurel; Feldman, Marcus W

    2012-01-01

    Niche construction is a process through which organisms modify their environment and, as a result, alter the selection pressures on themselves and other species. In cultural niche construction, one or more cultural traits can influence the evolution of other cultural or biological traits by affecting the social environment in which the latter traits may evolve. Cultural niche construction may include either gene-culture or culture-culture interactions. Here we develop a model of this process and suggest some applications of this model. We examine the interactions between cultural transmission, selection, and assorting, paying particular attention to the complexities that arise when selection and assorting are both present, in which case stable polymorphisms of all cultural phenotypes are possible. We compare our model to a recent model for the joint evolution of religion and fertility and discuss other potential applications of cultural niche construction theory, including the evolution and maintenance of large-scale human conflict and the relationship between sex ratio bias and marriage customs. The evolutionary framework we introduce begins to address complexities that arise in the quantitative analysis of multiple interacting cultural traits.

  20. Models of cultural niche construction with selection and assortative mating.

    Directory of Open Access Journals (Sweden)

    Nicole Creanza

    Full Text Available Niche construction is a process through which organisms modify their environment and, as a result, alter the selection pressures on themselves and other species. In cultural niche construction, one or more cultural traits can influence the evolution of other cultural or biological traits by affecting the social environment in which the latter traits may evolve. Cultural niche construction may include either gene-culture or culture-culture interactions. Here we develop a model of this process and suggest some applications of this model. We examine the interactions between cultural transmission, selection, and assorting, paying particular attention to the complexities that arise when selection and assorting are both present, in which case stable polymorphisms of all cultural phenotypes are possible. We compare our model to a recent model for the joint evolution of religion and fertility and discuss other potential applications of cultural niche construction theory, including the evolution and maintenance of large-scale human conflict and the relationship between sex ratio bias and marriage customs. The evolutionary framework we introduce begins to address complexities that arise in the quantitative analysis of multiple interacting cultural traits.

  1. Bayesian nonparametric centered random effects models with variable selection.

    Science.gov (United States)

    Yang, Mingan

    2013-03-01

    In a linear mixed effects model, it is common practice to assume that the random effects follow a parametric distribution such as a normal distribution with mean zero. However, in the case of variable selection, substantial violation of the normality assumption can potentially impact the subset selection and result in poor interpretation and even incorrect results. In nonparametric random effects models, the random effects generally have a nonzero mean, which causes an identifiability problem for the fixed effects that are paired with the random effects. In this article, we focus on a Bayesian method for variable selection. We characterize the subject-specific random effects nonparametrically with a Dirichlet process and resolve the bias simultaneously. In particular, we propose flexible modeling of the conditional distribution of the random effects with changes across the predictor space. The approach is implemented using a stochastic search Gibbs sampler to identify subsets of fixed effects and random effects to be included in the model. Simulations are provided to evaluate and compare the performance of our approach to the existing ones. We then apply the new approach to a real data example, cross-country and interlaboratory rodent uterotrophic bioassay.

  2. QOS Aware Formalized Model for Semantic Web Service Selection

    Directory of Open Access Journals (Sweden)

    Divya Sachan

    2014-10-01

    Full Text Available Selecting the most relevant Web Service according to a client requirement is an onerous task, as innumerable functionally similar Web Services (WS) are listed in the UDDI registry. WS are functionally similar, but their quality and performance vary from one service provider to another. A Web Service selection process involves two major points: recommending the pertinent Web Service and avoiding unjustifiable Web Services. The deficiency of keyword-based searching is that it does not handle the client request accurately, as a keyword may have ambiguous meanings in different scenarios. UDDI and search engines are all based on keyword search, which lags behind in pertinent Web Service selection. So the search mechanism must incorporate the semantic behavior of Web Services. In order to strengthen this approach, the proposed model incorporates Quality of Service (QoS)-based ranking of semantic Web Services.

  3. Modelling autophagy selectivity by receptor clustering on peroxisomes

    CERN Document Server

    Brown, Aidan I

    2016-01-01

    When subcellular organelles are degraded by autophagy, typically some, but not all, of each targeted organelle type are degraded. Autophagy selectivity must not only select the correct type of organelle, but must discriminate between individual organelles of the same kind. In the context of peroxisomes, we use computational models to explore the hypothesis that physical clustering of autophagy receptor proteins on the surface of each organelle provides an appropriate all-or-none signal for degradation. The pexophagy receptor proteins NBR1 and p62 are well characterized, though only NBR1 is essential for pexophagy (Deosaran et al., 2013). Extending earlier work by addressing the initial nucleation of NBR1 clusters on individual peroxisomes, we find that larger peroxisomes nucleate NBR1 clusters first and lose them due to competitive coarsening last, resulting in significant size-selectivity favouring large peroxisomes. This effect can explain the increased catalase signal that results from experimental s...

  4. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While ... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established.

  5. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC and DIC were superior to AIC when the true data generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data generating process than for a standard asymmetric data generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
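    The recovery-rate methodology can be sketched on a scaled-down problem (not the authors' asymmetric error correction models): simulate from a known simple model, fit a simple and a complex candidate, and count how often AIC and BIC pick the truth as the sample size grows.

```python
import numpy as np

def fit_ols_loglik(X, y):
    """OLS fit; return the Gaussian log-likelihood and number of parameters."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n = len(y)
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return loglik, X.shape[1] + 1          # +1 for the error variance

def recovery_rates(n, n_rep=500, seed=0):
    """Fraction of replications in which each criterion recovers the true model."""
    rng = np.random.default_rng(seed)
    hits = {"AIC": 0, "BIC": 0}
    for _ in range(n_rep):
        x = rng.uniform(-1, 1, n)
        y = 1.0 + 2.0 * x + rng.normal(0, 1, n)       # true model: simple linear
        designs = {"simple": np.column_stack([np.ones(n), x]),
                   "complex": np.column_stack([np.ones(n), x, x**2, x**3])}
        ic = {}
        for name, X in designs.items():
            ll, k = fit_ols_loglik(X, y)
            ic[name] = {"AIC": -2 * ll + 2 * k, "BIC": -2 * ll + np.log(n) * k}
        for crit in hits:
            if min(designs, key=lambda m: ic[m][crit]) == "simple":
                hits[crit] += 1
    return {crit: hits[crit] / n_rep for crit in hits}

for n in (50, 200, 800):
    print(n, recovery_rates(n))
```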

  6. Exploratory Bayesian model selection for serial genetics data.

    Science.gov (United States)

    Zhao, Jing X; Foulkes, Andrea S; George, Edward I

    2005-06-01

    Characterizing the process by which molecular and cellular level changes occur over time will have broad implications for clinical decision making and help further our knowledge of disease etiology across many complex diseases. However, this presents an analytic challenge due to the large number of potentially relevant biomarkers and the complex, uncharacterized relationships among them. We propose an exploratory Bayesian model selection procedure that searches for model simplicity through independence testing of multiple discrete biomarkers measured over time. Bayes factor calculations are used to identify and compare models that are best supported by the data. For large model spaces, i.e., a large number of multi-leveled biomarkers, we propose a Markov chain Monte Carlo (MCMC) stochastic search algorithm for finding promising models. We apply our procedure to explore the extent to which HIV-1 genetic changes occur independently over time.

  7. Stationary solutions for metapopulation Moran models with mutation and selection

    Science.gov (United States)

    Constable, George W. A.; McKane, Alan J.

    2015-03-01

    We construct an individual-based metapopulation model of population genetics featuring migration, mutation, selection, and genetic drift. In the case of a single "island," the model reduces to the Moran model. Using the diffusion approximation and time-scale separation arguments, an effective one-variable description of the model is developed. The effective description bears similarities to the well-mixed Moran model with effective parameters that depend on the network structure and island sizes, and it is amenable to analysis. Predictions from the reduced theory match the results from stochastic simulations across a range of parameters. The nature of the fast-variable elimination technique we adopt is further studied by applying it to a linear system, where it provides a precise description of the slow dynamics in the limit of large time-scale separation.
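    The single-island special case mentioned above is straightforward to simulate directly; the sketch below implements a two-allele Moran model with selection and mutation (migration and the metapopulation structure are omitted), with illustrative parameter values.

```python
import numpy as np

def moran_step(i, N, s, u, v, rng):
    """One Moran event for a two-allele population of fixed size N.
    i: copies of allele A; s: selective advantage of A;
    u: mutation A->B at birth; v: mutation B->A at birth."""
    x = i / N
    # Selection acts on reproduction: A is chosen to reproduce with weighted probability.
    p_repro_A = (1 + s) * x / ((1 + s) * x + (1 - x))
    offspring_is_A = rng.random() < p_repro_A
    # Mutation at birth.
    if offspring_is_A and rng.random() < u:
        offspring_is_A = False
    elif not offspring_is_A and rng.random() < v:
        offspring_is_A = True
    # A uniformly chosen individual dies.
    dies_A = rng.random() < x
    return i + int(offspring_is_A) - int(dies_A)

def simulate(N=100, s=0.02, u=0.001, v=0.001, steps=200_000, seed=4):
    rng = np.random.default_rng(seed)
    i = N // 2
    traj = np.empty(steps)
    for t in range(steps):
        i = moran_step(i, N, s, u, v, rng)
        traj[t] = i / N
    return traj

traj = simulate()
print("mean frequency of allele A over the second half of the run: %.3f"
      % traj[len(traj) // 2:].mean())
```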

  8. Predicting artificially drained areas by means of selective model ensemble

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Beucher, Amélie; Iversen, Bo Vangsø

    ... out since the mid-19th century, and it has been estimated that half of the cultivated area is artificially drained (Olesen, 2009). A number of machine learning approaches can be used to predict artificially drained areas in geographic space. However, instead of choosing the most accurate model ... The study aims firstly to train a large number of models to predict the extent of artificially drained areas using various machine learning approaches. Secondly, the study will develop a method for selecting the models which give a good prediction of artificially drained areas, when used in conjunction ... The approaches employed include decision trees, discriminant analysis, regression models, neural networks and support vector machines amongst others. Several models are trained with each method, using variously the original soil covariates and principal components of the covariates. With a large ensemble ...

  9. Model Selection Framework for Graph-based data

    CERN Document Server

    Caceres, Rajmonda S; Schmidt, Matthew C; Miller, Benjamin A; Campbell, William M

    2016-01-01

    Graphs are powerful abstractions for capturing complex relationships in diverse application settings. An active area of research focuses on theoretical models that define the generative mechanism of a graph. Yet given the complexity and inherent noise in real datasets, it is still very challenging to identify the best model for a given observed graph. We discuss a framework for graph model selection that leverages a long list of graph topological properties and a random forest classifier to learn and classify different graph instances. We fully characterize the discriminative power of our approach as we sweep through the parameter space of two generative models, the Erdos-Renyi and the stochastic block model. We show that our approach gets very close to known theoretical bounds and we provide insight on which topological features play a critical discriminating role.
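    A compact version of this pipeline is sketched below: sample graphs from an Erdős–Rényi and a stochastic block model, summarize each by a few topological features, and train a random forest to tell them apart; the feature list and generator parameters are illustrative choices, not those of the paper.

```python
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def features(G):
    """A few cheap topological summaries of a graph."""
    degs = [d for _, d in G.degree()]
    return [nx.density(G),
            nx.transitivity(G),
            nx.average_clustering(G),
            nx.degree_assortativity_coefficient(G),
            np.var(degs)]

def sample_graphs(n_graphs=100, n=60, seed=0):
    rng = np.random.default_rng(seed)
    X, y = [], []
    for k in range(n_graphs):
        if k % 2 == 0:                                    # Erdos-Renyi, class 0
            G = nx.erdos_renyi_graph(n, p=0.15, seed=int(rng.integers(1_000_000)))
            label = 0
        else:                                             # two-block SBM, class 1
            G = nx.stochastic_block_model([n // 2, n // 2],
                                          [[0.25, 0.05], [0.05, 0.25]],
                                          seed=int(rng.integers(1_000_000)))
            label = 1
        X.append(features(G))
        y.append(label)
    return np.array(X), np.array(y)

X, y = sample_graphs()
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```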

  10. The Current State of Empirical Support for the Pharmacological Treatment of Selective Mutism

    Science.gov (United States)

    Carlson, John S.; Mitchell, Angela D.; Segool, Natasha

    2008-01-01

    This article reviews the current state of evidence for the psychopharmacological treatment of children diagnosed with selective mutism within the context of its link to social anxiety disorder. An increased focus on potential medication treatment for this disorder has resulted from significant monetary and resource limitations in typical practice,…

  11. Muscle selection for treatment of cervical dystonia with botulinum toxin : A systematic review

    NARCIS (Netherlands)

    Nijmeijer, S. W. R.; Koelman, J. H. T. M.; Kamphuis, D. J.; Tijssen, M. A. J.

    2012-01-01

    Rationale: Cervical dystonia, also called spasmodic torticollis, is the most common form of (primary) dystonia. Intramuscular injections with botulinum toxin are the first line of treatment for cervical dystonia. To optimise the treatment response to botulinum toxin correct muscles should be selecte

  12. Modelling of Activated Sludge Wastewater Treatment

    Directory of Open Access Journals (Sweden)

    Kurtanjeka, Ž.

    2008-02-01

    Full Text Available Activated sludge wastewater treatment is a highly complex physical, chemical and biological process, and variations in wastewater flow rate and its composition, combined with time-varying reactions in a mixed culture of microorganisms, make this process non-linear and unsteady. The efficiency of the process is established by measuring the quantities that indicate the quality of the treated wastewater, but they can only be determined at the end of the process, when the water has already been processed, is at the outlet of the plant and is released into the environment. If the water quality is not acceptable, it is already too late for its improvement, which indicates the need for feed-forward process control based on a mathematical model. Since there is no possibility of retracing the process steps, any mistake in the control of the process could induce an ecological disaster of smaller or larger extent. Therefore, models that describe this process well may be used as a basis for monitoring and optimal control of the process development. This work analyzes the process of biological treatment of wastewater in the Velika Gorica plant. Two empirical models for the description of the process were established: a multiple linear regression model (MLR) with 16 predictor variables and a piecewise linear regression model (PLR) with 17 predictor variables. These models were developed with the aim of predicting the COD value of the effluent wastewater at the outlet, after treatment. The development of the models is based on the statistical analysis of experimental data, which are used to determine the relations among individual variables. In this work, linear models based on multiple linear regression (MLR) and partial least squares (PLR) methods are applied. The data used were obtained from everyday measurements of the quantities that indicate the quality of the input and output water, the working conditions of the plant and the quality of the activated sludge.
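    Without the plant's data, the modelling step can still be illustrated: the sketch below fits a multiple linear regression and a partial least squares model to synthetic "plant measurements" and compares them by cross-validated R²; all variables and coefficients are invented and do not correspond to the paper's predictors.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Hypothetical plant data: 16 measured predictors (influent COD, flow, MLSS, ...).
rng = np.random.default_rng(5)
n, p = 300, 16
X = rng.standard_normal((n, p))
true_coefs = np.zeros(p)
true_coefs[:5] = [8, 5, -4, 3, 2]
y = 120 + X @ true_coefs + rng.normal(0, 10, n)   # effluent COD, mg O2/L

models = {
    "MLR (16 predictors)": LinearRegression(),
    "PLS (4 components)":  PLSRegression(n_components=4),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5).mean()   # default score is R^2
    print(f"{name}: cross-validated R^2 = {r2:.3f}")
```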

  13. Cryogen spray cooling for spatially selective photocoagulation: a feasibility study with potential application for treatment of hemangiomas

    Science.gov (United States)

    Anvari, Bahman; Tanenbaum, B. S.; Milner, Thomas E.; Hoffman, Wendy; Said, Samireh; Chang, Cheng-Jen; Liaw, Lih-Huei L.; Kimel, Sol; Nelson, J. Stuart

    1996-12-01

    The clinical objective in laser treatment of hemangiomas is to photocoagulate the dilated cutaneous blood vessels, while at the same time minimizing nonspecific thermal injury to the overlying epidermis. We present an in-vivo experimental procedure, using a chicken comb animal model, and an infrared feedback system to deliver repetitive cryogen spurts during continuous Nd:YAG laser irradiation. Gross and histologic observations are consistent with calculated thicknesses of protected and damaged tissues, and demonstrate the feasibility of inducing spatially selective photocoagulation when using cryogen spray cooling in conjunction with laser irradiation. Experimental observation of epidermal protection in the chicken comb model suggests selective photocoagulation of subsurface targeted blood vessels for successful treatment of hemangiomas can be achieved by repetitive applications of a cryogen spurt during continuous Nd:YAG laser irradiation.

  14. Ensemble feature selection integrating elitist roles and quantum game model

    Institute of Scientific and Technical Information of China (English)

    Weiping Ding; Jiandong Wang; Zhijin Guan; Quan Shi

    2015-01-01

    To accelerate the selection process of feature subsets in the rough set theory (RST), an ensemble elitist roles based quantum game (EERQG) algorithm is proposed for feature selection. Firstly, the multilevel elitist roles based dynamics equilibrium strategy is established, and both immigration and emigration of elitists are able to be self-adaptive to balance between exploration and exploitation for feature selection. Secondly, the utility matrix of trust margins is introduced to the model of multilevel elitist roles to enhance various elitist roles' performance of searching the optimal feature subsets, and the win-win utility solutions for feature selection can be attained. Meanwhile, a novel ensemble quantum game strategy is designed as an intriguing exhibiting structure to perfect the dynamics equilibrium of multilevel elitist roles. Finally, the ensemble manner of multilevel elitist roles is employed to achieve the global minimal feature subset, which will greatly improve the feasibility and effectiveness. Experiment results show the proposed EERQG algorithm has superiority compared to the existing feature selection algorithms.

  15. Transitions in a genotype selection model driven by coloured noises

    Institute of Scientific and Technical Information of China (English)

    Wang Can-Jun; Mei Dong-Cheng

    2008-01-01

    This paper investigates a genotype selection model subjected to both a multiplicative coloured noise and an additive coloured noise with different correlation times T1 and T2 by means of numerical techniques. By directly simulating the Langevin equation, the following results are obtained. (1) The multiplicative coloured noise dominates; however, the effect of the additive coloured noise cannot be neglected in the practical gene selection process. The selection rate μ decides whether the selection favours the gene A haploid or the gene B haploid. (2) The additive coloured noise intensity α and the correlation time T2 play opposite roles. It is noted that α and T2 cannot separate the single peak, while α can make the peak disappear and T2 can make the peak sharp. (3) The multiplicative coloured noise intensity D and the correlation time T1 can induce a phase transition; at the same time they play opposite roles and a reentrance phenomenon appears. In this case, it is easy to select one type of haploid from the group with increasing D and decreasing T1.
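    The numerical technique, direct simulation of a Langevin equation driven by two Ornstein-Uhlenbeck (coloured) noises, can be sketched as below. The drift term x(1−x)μ is only an illustrative stand-in, not the exact genotype selection model of the paper, and all parameter values are arbitrary.

```python
import numpy as np

def ou_step(eta, tau, D, dt, rng):
    """Euler step for an Ornstein-Uhlenbeck (coloured) noise with correlation
    time tau and intensity D: <eta(t)eta(s)> = (D/tau) exp(-|t-s|/tau)."""
    return eta - (eta / tau) * dt + np.sqrt(2 * D * dt) / tau * rng.standard_normal()

def simulate(mu=0.1, D=0.3, tau1=0.5, alpha=0.05, tau2=0.5,
             x0=0.5, dt=1e-3, steps=200_000, seed=6):
    """Langevin simulation of a gene-frequency variable x in [0, 1] driven by a
    multiplicative OU noise (intensity D, correlation time tau1) and an additive
    OU noise (intensity alpha, correlation time tau2). The drift x(1-x)*mu is an
    illustrative stand-in for the model studied in the paper."""
    rng = np.random.default_rng(seed)
    x, eta_m, eta_a = x0, 0.0, 0.0
    traj = np.empty(steps)
    for t in range(steps):
        eta_m = ou_step(eta_m, tau1, D, dt, rng)
        eta_a = ou_step(eta_a, tau2, alpha, dt, rng)
        x += (x * (1 - x) * (mu + eta_m) + eta_a) * dt
        x = min(max(x, 0.0), 1.0)         # keep the frequency in [0, 1]
        traj[t] = x
    return traj

traj = simulate()
hist, edges = np.histogram(traj[50_000:], bins=20, range=(0, 1), density=True)
print("stationary density peaks near x = %.2f" % edges[np.argmax(hist)])
```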

  16. Selection of higher order regression models in the analysis of multi-factorial transcription data.

    Directory of Open Access Journals (Sweden)

    Olivia Prazeres da Costa

    Full Text Available INTRODUCTION: Many studies examine gene expression data that has been obtained under the influence of multiple factors, such as genetic background, environmental conditions, or exposure to diseases. The interplay of multiple factors may lead to effect modification and confounding. Higher order linear regression models can account for these effects. We present a new methodology for linear model selection and apply it to microarray data of bone marrow-derived macrophages. This experiment investigates the influence of three variable factors: the genetic background of the mice from which the macrophages were obtained, Yersinia enterocolitica infection (two strains, and a mock control), and treatment/non-treatment with interferon-γ. RESULTS: We set up four different linear regression models in a hierarchical order. We introduce the eruption plot as a new practical tool for model selection complementary to global testing. It visually compares the size and significance of effect estimates between two nested models. Using this methodology we were able to select the most appropriate model by keeping only relevant factors showing additional explanatory power. Application to experimental data allowed us to qualify the interaction of factors as either neutral (no interaction), alleviating (co-occurring effects are weaker than expected from the single effects), or aggravating (stronger than expected). We find a biologically meaningful gene cluster of putative C2TA target genes that appear to be co-regulated with MHC class II genes. CONCLUSIONS: We introduced the eruption plot as a tool for visual model comparison to identify relevant higher order interactions in the analysis of expression data obtained under the influence of multiple factors. We conclude that model selection in higher order linear regression models should generally be performed for the analysis of multi-factorial microarray data.
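
    As a rough illustration of the hierarchical model comparison described above (not the authors' actual pipeline or the eruption plot itself), the following Python sketch fits two nested linear models for one hypothetical gene, with and without two-way interactions among three factors, and tests whether the interaction terms add explanatory power. The data frame, factor levels, and effect sizes are invented for illustration.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    # Hypothetical expression values for one gene under three crossed factors.
    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "genotype":  np.repeat(["wt", "ko"], 12),
        "infection": np.tile(np.repeat(["mock", "strainA", "strainB"], 4), 2),
        "ifn":       np.tile(["no", "yes"], 12),
    })
    df["expr"] = rng.normal(size=len(df)) + (df["ifn"] == "yes") * 1.5

    # Hierarchy of nested models: main effects only, then all two-way interactions.
    m_main = smf.ols("expr ~ genotype + infection + ifn", data=df).fit()
    m_int  = smf.ols("expr ~ (genotype + infection + ifn) ** 2", data=df).fit()

    # Global F-test of the additional interaction terms, analogous in spirit to
    # comparing effect estimates between nested models before drawing an eruption plot.
    print(anova_lm(m_main, m_int))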

  17. Forecasting house prices in the 50 states using Dynamic Model Averaging and Dynamic Model Selection

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    2015-01-01

    We examine house price forecastability across the 50 states using Dynamic Model Averaging and Dynamic Model Selection, which allow for model change and parameter shifts. By allowing the entire forecasting model to change over time and across locations, the forecasting accuracy improves...

  18. Selection between Linear Factor Models and Latent Profile Models Using Conditional Covariances

    Science.gov (United States)

    Halpin, Peter F.; Maraun, Michael D.

    2010-01-01

    A method for selecting between K-dimensional linear factor models and (K + 1)-class latent profile models is proposed. In particular, it is shown that the conditional covariances of observed variables are constant under factor models but nonlinear functions of the conditioning variable under latent profile models. The performance of a convenient…

  20. Modeling selective attention using a neuromorphic analog VLSI device.

    Science.gov (United States)

    Indiveri, G

    2000-12-01

    Attentional mechanisms are required to overcome the problem of flooding a limited processing capacity system with information. They are present in biological sensory systems and can be a useful engineering tool for artificial visual systems. In this article we present a hardware model of a selective attention mechanism implemented on a very large-scale integration (VLSI) chip, using analog neuromorphic circuits. The chip exploits a spike-based representation to receive, process, and transmit signals. It can be used as a transceiver module for building multichip neuromorphic vision systems. We describe the circuits that carry out the main processing stages of the selective attention mechanism and provide experimental data for each circuit. We demonstrate the expected behavior of the model at the system level by stimulating the chip with both artificially generated control signals and signals obtained from a saliency map, computed from an image containing several salient features.

  1. Model Order Selection Rules for Covariance Structure Classification in Radar

    Science.gov (United States)

    Carotenuto, Vincenzo; De Maio, Antonio; Orlando, Danilo; Stoica, Petre

    2017-10-01

    The adaptive classification of the interference covariance matrix structure for radar signal processing applications is addressed in this paper. This represents a key issue because many detection architectures are synthesized assuming a specific covariance structure which may not necessarily coincide with the actual one due to the joint action of the system and environment uncertainties. The considered classification problem is cast in terms of a multiple hypotheses test with some nested alternatives and the theory of Model Order Selection (MOS) is exploited to devise suitable decision rules. Several MOS techniques, such as the Akaike, Takeuchi, and Bayesian information criteria are adopted and the corresponding merits and drawbacks are discussed. At the analysis stage, illustrating examples for the probability of correct model selection are presented showing the effectiveness of the proposed rules.
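
    The decision rules in the paper are derived for specific nested covariance structures and include Takeuchi's criterion, which is not reproduced here. The following Python sketch only illustrates the generic MOS idea of scoring competing covariance hypotheses by penalized log-likelihood (AIC/BIC); the two candidate structures and the simulated data are assumptions made for illustration.

    import numpy as np
    from scipy import stats

    def gaussian_ll(X, mean, cov):
        # Total log-likelihood of the samples under a multivariate normal model.
        return stats.multivariate_normal(mean=mean, cov=cov).logpdf(X).sum()

    def information_criteria(X):
        # Compare a diagonal vs. an unstructured covariance hypothesis with AIC/BIC,
        # mimicking model order selection applied to nested covariance structures.
        n, p = X.shape
        mean = X.mean(axis=0)
        models = {
            "diagonal":     (np.diag(X.var(axis=0)),            2 * p),
            "unstructured": (np.cov(X, rowvar=False, bias=True), p + p * (p + 1) // 2),
        }
        for name, (cov, k) in models.items():
            ll = gaussian_ll(X, mean, cov)
            aic = -2 * ll + 2 * k
            bic = -2 * ll + k * np.log(n)
            print(f"{name:13s} AIC={aic:8.1f}  BIC={bic:8.1f}")

    rng = np.random.default_rng(0)
    information_criteria(rng.multivariate_normal([0, 0, 0], np.eye(3), size=200))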

  2. Autoregressive model selection with simultaneous sparse coefficient estimation

    CERN Document Server

    Sang, Hailin

    2011-01-01

    In this paper we propose a sparse coefficient estimation procedure for autoregressive (AR) models based on penalized conditional maximum likelihood. The penalized conditional maximum likelihood estimator (PCMLE) thus developed has the advantage of performing simultaneous coefficient estimation and model selection. Mild conditions are given on the penalty function and the innovation process, under which the PCMLE satisfies strong consistency, local $N^{-1/2}$ consistency, and an oracle property, respectively, where N is the sample size. Two penalty functions, the least absolute shrinkage and selection operator (LASSO) and the smoothly clipped absolute deviation (SCAD), are considered as examples, and SCAD is shown to have better performance than LASSO. A simulation study confirms our theoretical results. At the end, we provide an application of our method to historical price data of the US Industrial Production Index for consumer goods, and the result is very promising.
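
    A simplified illustration of simultaneous coefficient estimation and lag selection in an AR model is sketched below in Python. It uses an L1 (LASSO) penalty on a least-squares formulation via scikit-learn rather than the paper's penalized conditional maximum likelihood with SCAD, and all tuning values are assumptions.

    import numpy as np
    from sklearn.linear_model import Lasso

    def lasso_ar(series, max_order, alpha=0.05):
        # Fit an AR(max_order) model by L1-penalized least squares; coefficients
        # shrunk to zero indicate lags excluded by the selection step.
        y = series[max_order:]
        X = np.column_stack([series[max_order - k:-k] for k in range(1, max_order + 1)])
        model = Lasso(alpha=alpha, fit_intercept=True).fit(X, y)
        return model.intercept_, model.coef_

    # Simulated AR(2) process; lags beyond 2 should receive (near-)zero coefficients.
    rng = np.random.default_rng(3)
    x = np.zeros(500)
    for t in range(2, 500):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
    print(lasso_ar(x, max_order=6))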

  3. Precise Muscle Selection Using Dynamic Polyelectromyography for Treatment of Post-stroke Dystonia: A Case Report

    OpenAIRE

    Jung, Tae Min; Kim, Ae Ryoung; Lee, Yoonju; Kim, Dae-Hyun; Kim, Deog Young

    2016-01-01

    Dystonia has a wide range of causes, but treatment of dystonia is limited to minimizing the symptoms as there is yet no successful treatment for its cause. One of the optimal treatment methods for dystonia is chemodenervation using botulinum toxin type A (BTX-A), alcohol injection, etc., but its success depends on how precisely the dystonic muscle is selected. Here, we reported a successful experience in a 49-year-old post-stroke female patient who showed paroxysmal repetitive contractions in...

  4. Parameter estimation and model selection in computational biology.

    Directory of Open Access Journals (Sweden)

    Gabriele Lillacci

    2010-03-01

    Full Text Available A central challenge in computational modeling of biological systems is the determination of the model parameters. Typically, only a fraction of the parameters (such as kinetic rate constants) are experimentally measured, while the rest are often fitted. The fitting process is usually based on experimental time course measurements of observables, which are used to assign parameter values that minimize some measure of the error between these measurements and the corresponding model prediction. The measurements, which can come from immunoblotting assays, fluorescent markers, etc., tend to be very noisy and taken at a limited number of time points. In this work we present a new approach to the problem of parameter selection of biological models. We show how one can use a dynamic recursive estimator, known as the extended Kalman filter, to arrive at estimates of the model parameters. The proposed method proceeds as follows. First, we use a variation of the Kalman filter that is particularly well suited to biological applications to obtain a first guess for the unknown parameters. Secondly, we employ an a posteriori identifiability test to check the reliability of the estimates. Finally, we solve an optimization problem to refine the first guess in case it should not be accurate enough. The final estimates are guaranteed to be statistically consistent with the measurements. Furthermore, we show how the same tools can be used to discriminate among alternate models of the same biological process. We demonstrate these ideas by applying our methods to two examples, namely a model of the heat shock response in E. coli, and a model of a synthetic gene regulation system. The methods presented are quite general and may be applied to a wide class of biological systems where noisy measurements are used for parameter estimation or model selection.
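
    The core idea of using an extended Kalman filter for parameter estimation can be illustrated on a toy system. The Python sketch below augments the state of a simple first-order decay model dx/dt = -k*x with the unknown rate k and filters noisy observations of x; the model, noise levels, and initial guesses are assumptions, and the a posteriori identifiability test and refinement step of the paper are not reproduced.

    import numpy as np

    def ekf_parameter_estimation(y_obs, dt=0.1, q=1e-6, r=0.05**2):
        # Joint state/parameter estimation for dx/dt = -k*x by augmenting the
        # state to z = [x, k] and running a discrete-time extended Kalman filter.
        z = np.array([y_obs[0], 0.5])          # initial guesses for x and k
        P = np.diag([r, 1.0])
        H = np.array([[1.0, 0.0]])             # only x is observed
        Q = np.diag([q, q])
        for y in y_obs[1:]:
            x, k = z
            z_pred = np.array([x - k * x * dt, k])   # Euler prediction step
            F = np.array([[1.0 - k * dt, -x * dt],   # Jacobian wrt [x, k]
                          [0.0,           1.0   ]])
            P = F @ P @ F.T + Q
            S = H @ P @ H.T + r                      # innovation variance
            K = P @ H.T / S                          # Kalman gain
            z = z_pred + (K * (y - z_pred[0])).ravel()
            P = (np.eye(2) - K @ H) @ P
        return z                                     # final [x, k] estimate

    # Synthetic noisy decay data with true k = 0.8; the k estimate should move toward it.
    rng = np.random.default_rng(0)
    t = np.arange(0, 10, 0.1)
    y = np.exp(-0.8 * t) + 0.05 * rng.standard_normal(t.size)
    print(ekf_parameter_estimation(y))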

  5. Avasimibe encapsulated in human serum albumin blocks cholesterol esterification for selective cancer treatment.

    Science.gov (United States)

    Lee, Steve Seung-Young; Li, Junjie; Tai, Jien Nee; Ratliff, Timothy L; Park, Kinam; Cheng, Ji-Xin

    2015-03-24

    Undesirable side effects remain a significant challenge in cancer chemotherapy. Here we report a strategy for cancer-selective chemotherapy by blocking acyl-CoA cholesterol acyltransferase-1 (ACAT-1)-mediated cholesterol esterification. To efficiently block cholesterol esterification in cancer in vivo, we developed a systemically injectable nanoformulation of avasimibe (a potent ACAT-1 inhibitor), called avasimin. In cell lines of human prostate, pancreatic, lung, and colon cancer, avasimin significantly reduced cholesteryl ester storage in lipid droplets and elevated intracellular free cholesterol levels, which led to apoptosis and suppression of proliferation. In xenograft models of prostate cancer and colon cancer, intravenous administration of avasimin caused the concentration of avasimibe in tumors to be 4-fold higher than the IC50 value. Systemic treatment of avasimin notably suppressed tumor growth in mice and extended the length of survival time. No adverse effects of avasimin to normal cells and organs were observed. Together, this study provides an effective approach for selective cancer chemotherapy by targeting altered cholesterol metabolism of cancer cells.

  6. Treatment of Selective Serotonin Reuptake Inhibitor-Resistant Depression in Adolescents: Predictors and Moderators of Treatment Response

    Science.gov (United States)

    Asarnow, Joan Rosenbaum; Emslie, Graham; Clarke, Greg; Wagner, Karen Dineen; Spirito, Anthony; Vitiello, Benedetto; Iyengar, Satish; Shamseddeen, Wael; Ritz, Louise; Birmaher, Boris; Ryan, Neal; Kennard, Betsy; Mayes, Taryn; DeBar, Lynn; McCracken, James; Strober, Michael; Suddath, Robert; Leonard, Henrietta; Porta, Giovanna; Keller, Martin; Brent, David

    2009-01-01

    Adolescents who did not improve with a Selective Serotonin Reuptake Inhibitor (SSRI) were provided an alternative SSRI plus cognitive-behavioral therapy (CBT). The superiority of the CBT/combined treatment as compared to medication alone is more evident in youths who had more comorbid disorders, no abuse history, and lower hopelessness.

  8. Structure and selection in an autocatalytic binary polymer model

    DEFF Research Database (Denmark)

    Tanaka, Shinpei; Fellermann, Harold; Rasmussen, Steen

    2014-01-01

    An autocatalytic binary polymer system is studied as an abstract model for a chemical reaction network capable of evolving. Due to autocatalysis, long polymers appear spontaneously and their concentration is shown to be maintained at the same level as that of monomers. When the reaction starts from... Stability, fluctuations, and dynamic selection mechanisms are investigated for the involved self-organizing processes. Copyright (C) EPLA, 2014

  9. Velocity selection in the symmetric model of dendritic crystal growth

    Science.gov (United States)

    Barbieri, Angelo; Hong, Daniel C.; Langer, J. S.

    1987-01-01

    An analytic solution of the problem of velocity selection in a fully nonlocal model of dendritic crystal growth is presented. The analysis uses a WKB technique to derive and evaluate a solvability condition for the existence of steady-state needle-like solidification fronts in the limit of small undercooling Δ. For the two-dimensional symmetric model with a capillary anisotropy of strength α, it is found that the velocity is proportional to Δ^4 α^(7/4). The application of the method in three dimensions is also described.

  10. A simple application of FIC to model selection

    CERN Document Server

    Wiggins, Paul A

    2015-01-01

    We have recently proposed a new information-based approach to model selection, the Frequentist Information Criterion (FIC), that reconciles information-based and frequentist inference. The purpose of this current paper is to provide a simple example of the application of this criterion and a demonstration of the natural emergence of model complexities with both AIC-like ($N^0$) and BIC-like ($\log N$) scaling with observation number $N$. The application developed is deliberately simplified to make the analysis analytically tractable.

  11. Small populations corrections for selection-mutation models

    CERN Document Server

    Jabin, Pierre-Emmanuel

    2012-01-01

    We consider integro-differential models describing the evolution of a population structured by a quantitative trait. Individuals interact competitively, creating a strong selection pressure on the population. On the other hand, mutations are assumed to be small. Following the formalism of Diekmann, Jabin, Mischler, and Perthame, this creates concentration phenomena, typically consisting in a sum of Dirac masses slowly evolving in time. We propose a modification to those classical models that takes the effect of small populations into account and corrects some abnormal behaviours.

  12. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing can compete with traditional process chains for small production runs. Combining both types of technology added cost but no benefit in this case. The new process chain model can be used to explain the results and support process selection, but process chain prototyping is still important for rapidly...

  13. Treatment Seeking for Alcohol Use Disorders : Treatment Gap or Adequate Self-Selection?

    NARCIS (Netherlands)

    Tuithof, Marlous; Ten Have, Margreet; Van Den Brink, Wim; Vollebergh, Wilma; De Graaf, Ron

    2016-01-01

    Background/Aims: This study examines whether it is harmful that subjects with an alcohol use disorder (AUD) in the general population rarely seek treatment. Methods: Baseline and 3-year follow-up data from the Netherlands Mental Health Survey and Incidence Study-2 were used. Treatment utilization

  14. Selecting, weeding, and weighting biased climate model ensembles

    Science.gov (United States)

    Jackson, C. S.; Picton, J.; Huerta, G.; Nosedal Sanchez, A.

    2012-12-01

    In the Bayesian formulation, the "log-likelihood" is a test statistic for selecting, weeding, or weighting climate model ensembles with observational data. This statistic has the potential to synthesize the physical and data constraints on quantities of interest. One of the thorny issues for formulating the log-likelihood is how one should account for biases. While in the past we have included a generic discrepancy term, not all biases affect predictions of quantities of interest. We make use of a 165-member ensemble of CAM3.1/slab ocean climate models with different parameter settings to think through the issues that are involved with predicting each model's sensitivity to greenhouse gas forcing given what can be observed from the base state. In particular we use multivariate empirical orthogonal functions to decompose the differences that exist among this ensemble to discover what fields and regions matter to the model's sensitivity. We find that the differences that matter are a small fraction of the total discrepancy. Moreover, weighting members of the ensemble using this knowledge does a relatively poor job of adjusting the ensemble mean toward the known answer. This points out the shortcomings of using weights to correct for biases in climate model ensembles created by a selection process that does not emphasize the priorities of your log-likelihood.
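
    The following Python fragment sketches the basic weighting step implied by the Bayesian formulation above: each ensemble member receives a weight proportional to the exponential of its log-likelihood against observations, with a single discrepancy variance standing in for the generic discrepancy term. The ensemble size, observation vector, and variance are placeholders, and the EOF-based decomposition used in the study is not shown.

    import numpy as np

    def loglik_weights(ensemble, obs, sigma):
        # Gaussian log-likelihood of each member's mismatch to the observations;
        # sigma**2 is a generic discrepancy variance absorbing tolerated biases.
        sq_err = ((ensemble - obs) ** 2).sum(axis=1)
        loglik = -0.5 * sq_err / sigma ** 2
        w = np.exp(loglik - loglik.max())       # stabilise before normalising
        return w / w.sum()

    # 165 hypothetical members, 50 observed quantities.
    rng = np.random.default_rng(0)
    obs = rng.normal(size=50)
    ensemble = obs + rng.normal(scale=1.0, size=(165, 50))
    w = loglik_weights(ensemble, obs, sigma=1.0)
    weighted_mean = w @ ensemble                # weighted ensemble estimate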

  15. Catalytic conversion reactions in nanoporous systems with concentration-dependent selectivity: Statistical mechanical modeling

    Science.gov (United States)

    García, Andrés; Wang, Jing; Windus, Theresa L.; Sadow, Aaron D.; Evans, James W.

    2016-05-01

    Statistical mechanical modeling is developed to describe a catalytic conversion reaction A → B_c or B_t with concentration-dependent selectivity of the products, B_c or B_t, where reaction occurs inside catalytic particles traversed by narrow linear nanopores. The associated restricted diffusive transport, which in the extreme case is described by single-file diffusion, naturally induces strong concentration gradients. Furthermore, by comparing kinetic Monte Carlo simulation results with analytic treatments, selectivity is shown to be impacted by strong spatial correlations induced by restricted diffusivity in the presence of reaction, and also by a subtle clustering of the reactant A.
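
    To make the simulation side of this comparison concrete, the following toy Python kinetic Monte Carlo sketch implements single-file transport in a linear pore with a first-order conversion A → B on interior sites and exchange with a reservoir at the pore openings. The rates, pore length, and the single product species are simplifying assumptions; the paper's model distinguishes two products B_c and B_t with concentration-dependent selectivity.

    import numpy as np

    def kmc_single_file(L=50, steps=200_000, p_react=0.01, p_in=0.2, p_out=0.5, seed=0):
        # Toy single-file lattice model: particles hop only to empty neighbouring
        # sites (no passing), A converts to B on interior catalytic sites, and
        # particles enter/leave at the two pore openings.
        rng = np.random.default_rng(seed)
        pore = np.zeros(L, dtype=int)             # 0 = empty, 1 = A, 2 = B
        b_out = 0
        for _ in range(steps):
            i = int(rng.integers(L))
            at_mouth = i in (0, L - 1)
            if pore[i] == 0:
                if at_mouth and rng.random() < p_in:
                    pore[i] = 1                   # an A enters from the reservoir
                continue
            if at_mouth and rng.random() < p_out:
                b_out += int(pore[i] == 2)        # count product B leaving the pore
                pore[i] = 0
                continue
            if not at_mouth and pore[i] == 1 and rng.random() < p_react:
                pore[i] = 2                       # catalytic conversion A -> B
                continue
            j = i + (1 if i == 0 else -1 if i == L - 1 else rng.choice([-1, 1]))
            if pore[j] == 0:                      # single-file constraint
                pore[j], pore[i] = pore[i], 0
        return b_out, pore

    print(kmc_single_file())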

  16. Bayesian Model Selection with Network Based Diffusion Analysis.

    Science.gov (United States)

    Whalen, Andrew; Hoppitt, William J E

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike Information Criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed.
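
    For readers unfamiliar with WAIC, the following Python sketch computes it from a matrix of pointwise log-likelihoods over posterior draws, which is the kind of quantity an NBDA/TADA fit would supply; the simulated log-likelihood matrices for a "social transmission" and an "asocial" model are placeholders, not output of the authors' method.

    import numpy as np
    from scipy.special import logsumexp

    def waic(log_lik):
        # WAIC from pointwise log-likelihoods with shape (n_draws, n_observations):
        # -2 * (lppd - p_waic), where p_waic is the summed posterior variance.
        n_draws = log_lik.shape[0]
        lppd = (logsumexp(log_lik, axis=0) - np.log(n_draws)).sum()
        p_waic = log_lik.var(axis=0, ddof=1).sum()
        return -2.0 * (lppd - p_waic)

    # Placeholder posterior log-likelihoods for two competing diffusion models;
    # the model with the lower WAIC would be preferred.
    rng = np.random.default_rng(0)
    ll_social  = rng.normal(-1.0, 0.1, size=(4000, 60))
    ll_asocial = rng.normal(-1.2, 0.1, size=(4000, 60))
    print(waic(ll_social), waic(ll_asocial))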

  17. Selection of productivity improvement techniques via mathematical modeling

    Directory of Open Access Journals (Sweden)

    Mahassan M. Khater

    2011-07-01

    Full Text Available This paper presents a new mathematical model to select an optimal combination of productivity improvement techniques. The proposed model considers four-stage cycle productivity, and productivity is assumed to be a linear function of fifty-four improvement techniques. The model is implemented for a real-world case study of a manufacturing plant. The resulting problem is formulated as a mixed integer program which can be solved to optimality using traditional methods. Preliminary results indicate that productivity can be improved through a change in equipment, and the model can easily be applied to both manufacturing and service industries.
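
    The abstract describes the selection problem as a mixed integer program with a linear productivity objective. The toy Python sketch below keeps that structure (binary selection variables, a linear gain, a budget constraint) but solves a tiny invented instance by exhaustive enumeration rather than with an MIP solver; the gains, costs, and budget are placeholders, not the fifty-four techniques of the paper.

    from itertools import combinations

    # Hypothetical data: productivity gain and implementation cost per technique.
    gains = [4.0, 2.5, 3.0, 1.5, 5.0, 2.0, 3.5, 1.0]
    costs = [3.0, 1.0, 2.0, 1.0, 4.0, 1.5, 2.5, 0.5]
    budget = 6.0

    best_gain, best_subset = 0.0, ()
    for r in range(len(gains) + 1):
        for subset in combinations(range(len(gains)), r):
            cost = sum(costs[i] for i in subset)
            gain = sum(gains[i] for i in subset)      # linear productivity model
            if cost <= budget and gain > best_gain:
                best_gain, best_subset = gain, subset

    print(best_subset, best_gain)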

  18. An Introduction to Model Selection: Tools and Algorithms

    Directory of Open Access Journals (Sweden)

    Sébastien Hélie

    2006-03-01

    Full Text Available Model selection is a complicated matter in science, and psychology is no exception. In particular, the high variance in the object of study (i.e., humans) prevents the use of Popper's falsification principle (which is the norm in other sciences). Therefore, the desirability of quantitative psychological models must be assessed by measuring the capacity of the model to fit empirical data. In the present paper, an error measure (likelihood), as well as five methods to compare model fits (the likelihood ratio test, Akaike's information criterion, the Bayesian information criterion, bootstrapping, and cross-validation), are presented. The use of each method is illustrated by an example, and the advantages and weaknesses of each method are also discussed.

  19. Selection of key terrain attributes for SOC model

    DEFF Research Database (Denmark)

    Greve, Mogens Humlekrog; Adhikari, Kabindra; Chellasamy, Menaka

    As an important component of the global carbon pool, soil organic carbon (SOC) plays an important role in the global carbon cycle. The SOC pool provides basic information for global warming research and for the sustainable use of land resources. Digital terrain attributes are often used... was selected; in total 2,514,820 data mining models were constructed from 71 different grids (12 m to 2304 m) and 22 attributes (21 attributes derived from the DTM plus the original elevation). The relative importance and usage of each attribute in every model were calculated, together with comprehensive impact rates of each attribute... (standh) are the first three key terrain attributes in the 5-attribute model at all resolutions; the remaining 2 of the 5 attributes are Normal High (NormalH) and Valley Depth (Vall_depth) at resolutions finer than 40 m, and Elevation and Channel Base (Chnl_base) at resolutions coarser than 40 m. The models at a pixel size of 88 m...

  20. Unifying models for X-ray selected and Radio selected BL Lac Objects

    CERN Document Server

    Fossati, G; Ghisellini, G; Maraschi, L; Brera-Merate, O A

    1997-01-01

    We discuss alternative interpretations of the differences in the Spectral Energy Distributions (SEDs) of BL Lacs found in complete Radio or X-ray surveys. A large body of observations in different bands suggests that the SEDs of BL Lac objects appearing in X-ray surveys differ from those appearing in radio surveys mainly in having a (synchrotron) spectral cut-off (or break) at much higher frequency. In order to explain the different properties of radio and X-ray selected BL Lacs, Giommi and Padovani proposed a model based on a common radio luminosity function. At each radio luminosity, objects with high frequency spectral cut-offs are assumed to be a minority. Nevertheless they dominate the X-ray selected population due to the larger X-ray-to-radio-flux ratio. An alternative model explored here (reminiscent of the orientation models previously proposed) is that the X-ray luminosity function is "primary" and that at each X-ray luminosity a minority of objects has a larger radio-to-X-ray flux ratio. The prediction...

  1. Nelotanserin, a novel selective human 5-hydroxytryptamine2A inverse agonist for the treatment of insomnia.

    Science.gov (United States)

    Al-Shamma, Hussien A; Anderson, Christen; Chuang, Emil; Luthringer, Remy; Grottick, Andrew J; Hauser, Erin; Morgan, Michael; Shanahan, William; Teegarden, Bradley R; Thomsen, William J; Behan, Dominic

    2010-01-01

    5-Hydroxytryptamine (5-HT)(2A) receptor inverse agonists are promising therapeutic agents for the treatment of sleep maintenance insomnias. Among these agents is nelotanserin, a potent, selective 5-HT(2A) inverse agonist. Both radioligand binding and functional inositol phosphate accumulation assays suggest that nelotanserin has low nanomolar potency on the 5-HT(2A) receptor with at least 30- and 5000-fold selectivity compared with 5-HT(2C) and 5-HT(2B) receptors, respectively. Nelotanserin dosed orally prevented (+/-)-1-(2,5-dimethoxy-4-iodophenyl)-2-aminopropane (DOI; 5-HT(2A) agonist)-induced hypolocomotion, increased sleep consolidation, and increased total nonrapid eye movement sleep time and deep sleep, the latter marked by increases in electroencephalogram (EEG) delta power. These effects on rat sleep were maintained after repeated subchronic dosing. In healthy human volunteers, nelotanserin was rapidly absorbed after oral administration and achieved maximum concentrations 1 h later. EEG effects occurred within 2 to 4 h after dosing, and were consistent with vigilance-lowering. A dose response of nelotanserin was assessed in a postnap insomnia model in healthy subjects. All doses (up to 40 mg) of nelotanserin significantly improved measures of sleep consolidation, including decreases in the number of stage shifts, number of awakenings after sleep onset, microarousal index, and number of sleep bouts, concomitant with increases in sleep bout duration. Nelotanserin did not affect total sleep time, or sleep onset latency. Furthermore, subjective pharmacodynamic effects observed the morning after dosing were minimal and had no functional consequences on psychomotor skills or memory. These studies point to an efficacy and safety profile for nelotanserin that might be ideally suited for the treatment of sleep maintenance insomnias.

  2. Bayesian model selection applied to artificial neural networks used for water resources modeling

    Science.gov (United States)

    Kingston, Greer B.; Maier, Holger R.; Lambert, Martin F.

    2008-04-01

    Artificial neural networks (ANNs) have proven to be extremely valuable tools in the field of water resources engineering. However, one of the most difficult tasks in developing an ANN is determining the optimum level of complexity required to model a given problem, as there is no formal systematic model selection method. This paper presents a Bayesian model selection (BMS) method for ANNs that provides an objective approach for comparing models of varying complexity in order to select the most appropriate ANN structure. The approach uses Markov Chain Monte Carlo posterior simulations to estimate the evidence in favor of competing models and, in this study, three known methods for doing this are compared in terms of their suitability for being incorporated into the proposed BMS framework for ANNs. However, it is acknowledged that it can be particularly difficult to accurately estimate the evidence of ANN models. Therefore, the proposed BMS approach for ANNs incorporates a further check of the evidence results by inspecting the marginal posterior distributions of the hidden-to-output layer weights, which unambiguously indicate any redundancies in the hidden layer nodes. The fact that this check is available is one of the greatest advantages of the proposed approach over conventional model selection methods, which do not provide such a test and instead rely on the modeler's subjective choice of selection criterion. The advantages of a total Bayesian approach to ANN development, including training and model selection, are demonstrated on two synthetic and one real world water resources case study.

  3. The Impact of Varied Discrimination Parameters on Mixed-Format Item Response Theory Model Selection

    Science.gov (United States)

    Whittaker, Tiffany A.; Chang, Wanchen; Dodd, Barbara G.

    2013-01-01

    Whittaker, Chang, and Dodd compared the performance of model selection criteria when selecting among mixed-format IRT models and found that the criteria did not perform adequately when selecting the more parameterized models. It was suggested by M. S. Johnson that the problems when selecting the more parameterized models may be because of the low…

  4. Model selection, identification and validation in anaerobic digestion: a review.

    Science.gov (United States)

    Donoso-Bravo, Andres; Mailier, Johan; Martin, Cristina; Rodríguez, Jorge; Aceves-Lara, César Arturo; Vande Wouwer, Alain

    2011-11-01

    Anaerobic digestion enables waste (water) treatment and energy production in the form of biogas. The successful implementation of this process has led to increasing interest worldwide. However, anaerobic digestion is a complex biological process, where hundreds of microbial populations are involved, and whose start-up and operation are delicate issues. In order to better understand the process dynamics and to optimize the operating conditions, the availability of dynamic models is of paramount importance. Such models have to be inferred from prior knowledge and experimental data collected from real plants. Modeling and parameter identification are vast subjects, offering a realm of approaches and methods, which can be difficult to fully understand by scientists and engineers dedicated to the plant operation and improvements. This review article discusses existing modeling frameworks and methodologies for parameter estimation and model validation in the field of anaerobic digestion processes. The point of view is pragmatic, intentionally focusing on simple but efficient methods.

  5. The impact of treatment with selective serotonin reuptake inhibitors on primate cardiovascular disease, behavior, and neuroanatomy.

    Science.gov (United States)

    Shively, Carol A; Silverstein-Metzler, Marnie; Justice, Jamie; Willard, Stephanie L

    2017-03-01

    Selective serotonin reuptake inhibitor (SSRI) use is ubiquitous because SSRIs are widely prescribed for a number of disorders in addition to depression. Depression increases the risk of coronary heart disease (CHD). Hence, treating depression with SSRIs could reduce CHD risk. However, the effects of long term antidepressant treatment on CHD risk, as well as other aspects of health, remain poorly understood. Thus, we undertook an investigation of multisystem effects of SSRI treatment with a physiologically relevant dose in middle-aged adult female cynomolgus monkeys, a primate species shown to be a useful model of both depression and coronary and carotid artery atherosclerosis. Sertraline had no effect on depressive behavior, reduced anxious behavior, increased affiliation, reduced aggression, changed serotonin neurotransmission and volumes of neural areas critical to mood disorders, and exacerbated coronary and carotid atherosclerosis. These data suggest that a conservative approach to prescribing SSRIs for cardiovascular or other disorders for long periods may be warranted, and that further study is critical given the widespread use of these medications.

  6. Effects of selective androgen receptor modulator (SARM) treatment in osteopenic female rats.

    Science.gov (United States)

    Kearbey, Jeffrey D; Gao, Wenqing; Fisher, Scott J; Wu, Di; Miller, Duane D; Dalton, James T

    2009-11-01

    Although androgens are known to protect bone, side effects and poor oral bioavailability have limited their use. We previously reported that S-3-(4-acetylamino-phenoxy)-2-hydroxy-2-methyl-N-(4-nitro-3-trifluoromethyl-phenyl)-propionamide (S-4) is a potent and tissue-selective androgen receptor modulator (SARM). This study was designed to evaluate the skeletal effects of S-4 in an osteopenic model. Aged female rats were gonadectomized or sham operated on day 1 and assigned to treatment groups. Dosing was initiated on day 90 and continued daily until day 210. Whole animal bone mineral density (BMD), body weight, and fat mass were determined by dual energy x-ray absorptiometry (DEXA). Regional analysis of excised bones was performed using DEXA or computed tomography. Femur strength was evaluated by 3-point bending. S-4 restored whole body and lumbar vertebrae (L5-L6) BMD to the level of intact controls. Significant increases in cortical bone quality were observed at the femoral midshaft, resulting in increased load bearing capacity. S-4 demonstrated partial/complete recovery of bone parameters to age-matched intact levels. Increased efficacy observed in cortical bone sites is consistent with reported androgen action in bone. The ability of S-4 to promote bone anabolism, prevent bone resorption, and increase skeletal muscle mass/strength positions these drugs as promising new alternatives for the treatment of osteoporosis.

  7. The Hierarchical Sparse Selection Model of Visual Crowding

    Directory of Open Access Journals (Sweden)

    Wesley eChaney

    2014-09-01

    Full Text Available Because the environment is cluttered, objects rarely appear in isolation. The visual system must therefore attentionally select behaviorally relevant objects from among many irrelevant ones. A limit on our ability to select individual objects is revealed by the phenomenon of visual crowding: an object seen in the periphery, easily recognized in isolation, can become impossible to identify when surrounded by other, similar objects. The neural basis of crowding is hotly debated: while prevailing theories hold that crowded information is irrecoverable – destroyed due to over-integration in early-stage visual processing – recent evidence demonstrates otherwise. Crowding can occur between high-level, configural object representations, and crowded objects can contribute with high precision to judgments about the gist of a group of objects, even when they are individually unrecognizable. While existing models can account for the basic diagnostic criteria of crowding (e.g. specific critical spacing, spatial anisotropies, and temporal tuning), no present model explains how crowding can operate simultaneously at multiple levels in the visual processing hierarchy, including at the level of whole objects. Here, we present a new model of visual crowding, the hierarchical sparse selection (HSS) model, which accounts for object-level crowding, as well as a number of puzzling findings in the recent literature. Counter to existing theories, we posit that crowding occurs not due to degraded visual representations in the brain, but due to impoverished sampling of visual representations for the sake of perception. The HSS model unifies findings from a disparate array of visual crowding studies and makes testable predictions about how information in crowded scenes can be accessed.

  8. The hierarchical sparse selection model of visual crowding.

    Science.gov (United States)

    Chaney, Wesley; Fischer, Jason; Whitney, David

    2014-01-01

    Because the environment is cluttered, objects rarely appear in isolation. The visual system must therefore attentionally select behaviorally relevant objects from among many irrelevant ones. A limit on our ability to select individual objects is revealed by the phenomenon of visual crowding: an object seen in the periphery, easily recognized in isolation, can become impossible to identify when surrounded by other, similar objects. The neural basis of crowding is hotly debated: while prevailing theories hold that crowded information is irrecoverable - destroyed due to over-integration in early stage visual processing - recent evidence demonstrates otherwise. Crowding can occur between high-level, configural object representations, and crowded objects can contribute with high precision to judgments about the "gist" of a group of objects, even when they are individually unrecognizable. While existing models can account for the basic diagnostic criteria of crowding (e.g., specific critical spacing, spatial anisotropies, and temporal tuning), no present model explains how crowding can operate simultaneously at multiple levels in the visual processing hierarchy, including at the level of whole objects. Here, we present a new model of visual crowding-the hierarchical sparse selection (HSS) model, which accounts for object-level crowding, as well as a number of puzzling findings in the recent literature. Counter to existing theories, we posit that crowding occurs not due to degraded visual representations in the brain, but due to impoverished sampling of visual representations for the sake of perception. The HSS model unifies findings from a disparate array of visual crowding studies and makes testable predictions about how information in crowded scenes can be accessed.

  9. Finite element model selection using Particle Swarm Optimization

    CERN Document Server

    Mthembu, Linda; Friswell, Michael I; Adhikari, Sondipon

    2009-01-01

    This paper proposes the application of particle swarm optimization (PSO) to the problem of finite element model (FEM) selection. This problem arises when a choice of the best model for a system has to be made from a set of competing models, each developed a priori from engineering judgment. PSO is a population-based stochastic search algorithm inspired by the behaviour of biological entities in nature when they are foraging for resources. Each potentially correct model is represented as a particle that exhibits both individualistic and group behaviour. Each particle moves within the model search space looking for the best solution by updating the parameter values that define it. The most important step in the particle swarm algorithm is the method of representing models, which should take into account the number, location and variables of parameters to be updated. One example structural system is used to show the applicability of PSO in finding an optimal FEM. An optimal model is defined as the model that has t...
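
    A minimal particle swarm optimizer in Python is sketched below to illustrate the search mechanics described above. In the paper each particle encodes a candidate finite element model and its updating parameters; here the objective is a made-up mismatch between predicted and "measured" natural frequencies, and all PSO coefficients are conventional default assumptions rather than the authors' settings.

    import numpy as np

    def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        # Minimal particle swarm optimiser; each particle is a candidate parameter
        # set and the objective measures model-data mismatch.
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        dim = lo.size
        x = rng.uniform(lo, hi, size=(n_particles, dim))
        v = np.zeros_like(x)
        p_best, p_val = x.copy(), np.array([objective(p) for p in x])
        g_best = p_best[p_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
            x = np.clip(x + v, lo, hi)
            val = np.array([objective(p) for p in x])
            better = val < p_val
            p_best[better], p_val[better] = x[better], val[better]
            g_best = p_best[p_val.argmin()].copy()
        return g_best, p_val.min()

    # Hypothetical model updating: find stiffness-like parameters that reproduce
    # two "measured" natural frequencies.
    target = np.array([12.3, 47.8])
    def mismatch(theta):
        predicted = np.array([np.sqrt(theta[0]), np.sqrt(theta[0] + theta[1])])
        return np.sum((predicted - target) ** 2)

    print(pso(mismatch, bounds=[(1.0, 5000.0), (1.0, 5000.0)]))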

  10. ModelOMatic: fast and automated model selection between RY, nucleotide, amino acid, and codon substitution models.

    Science.gov (United States)

    Whelan, Simon; Allen, James E; Blackburne, Benjamin P; Talavera, David

    2015-01-01

    Molecular phylogenetics is a powerful tool for inferring both the process and pattern of evolution from genomic sequence data. Statistical approaches, such as maximum likelihood and Bayesian inference, are now established as the preferred methods of inference. The choice of models that a researcher uses for inference is of critical importance, and there are established methods for model selection conditioned on a particular type of data, such as nucleotides, amino acids, or codons. A major limitation of existing model selection approaches is that they can only compare models acting upon a single type of data. Here, we extend model selection to allow comparisons between models describing different types of data by introducing the idea of adapter functions, which project aggregated models onto the originally observed sequence data. These projections are implemented in the program ModelOMatic and used to perform model selection on 3722 families from the PANDIT database, 68 genes from an arthropod phylogenomic data set, and 248 genes from a vertebrate phylogenomic data set. For the PANDIT and arthropod data, we find that amino acid models are selected for the overwhelming majority of alignments; with progressively smaller numbers of alignments selecting codon and nucleotide models, and no families selecting RY-based models. In contrast, nearly all alignments from the vertebrate data set select codon-based models. The sequence divergence, the number of sequences, and the degree of selection acting upon the protein sequences may contribute to explaining this variation in model selection. Our ModelOMatic program is fast, with most families from PANDIT taking fewer than 150 s to complete, and should therefore be easily incorporated into existing phylogenetic pipelines. ModelOMatic is available at https://code.google.com/p/modelomatic/.

  11. Neuroimaging-based biomarkers for treatment selection in major depressive disorder.

    Science.gov (United States)

    Dunlop, Boadie W; Mayberg, Helen S

    2014-12-01

    The use of neuroimaging approaches to identify likely treatment outcomes in patients with major depressive disorder is developing rapidly. Emerging work suggests that resting state pretreatment metabolic activity in the fronto-insular cortex may distinguish between patients likely to respond to psychotherapy or medication and may function as a treatment-selection biomarker. In contrast, high metabolic activity in the subgenual anterior cingulate cortex may be predictive of poor outcomes to both medication and psychotherapy, suggesting that nonstandard treatments may be pursued earlier in the treatment course. Although these findings will require replication before clinical adoption, they provide preliminary support for the concept that brain states can be measured and applied to the selection of a specific treatment most likely to be beneficial for an individual patient.

  12. Gastrostomy Tube Weaning and Treatment of Severe Selective Eating in Childhood: Experience in Israel Using an Intensive Three Week Program.

    Science.gov (United States)

    Shalem, Tzippora; Fradkin, Akiva; Dunitz-Scheer, Marguerite; Sadeh-Kon, Tal; Goz-Gulik, Tali; Fishler, Yael; Weiss, Batia

    2016-06-01

    Children dependent on gastrostomy tube feeding and those with extremely selective eating comprise the most challenging groups of early childhood eating disorders. We established, for the first time in Israel, a 3 week intensive weaning and treatment program for these patients based on the "Graz model." To investigate the Graz model for tube weaning and for treating severe selective eating disorders in one center in Israel. Pre-program assessment of patients' suitability to participate was performed 3 months prior to the study, and a treatment goal was set for each patient. The program included a multidisciplinary outpatient or inpatient 3 week treatment course. The major outcome measures were achievement of the target goal of complete or partial tube weaning for those with tube dependency, and expansion of the child's nutritional diversity for those with selective eating. Thirty-four children, 28 with tube dependency and 6 with selective eating, participated in four programs conducted over 24 months. Their mean age was 4.3 ± 0.37 years. Of all patients, 29 (85%) achieved the target goal (24 who were tube-dependent and 5 selective eaters). One patient was excluded due to aspiration pneumonia. After 6 months follow-up, 24 of 26 available patients (92%) maintained their target or improved. This intensive 3 week program was highly effective in weaning children with gastrostomy tube dependency and ameliorating severe selective eating. Preliminary evaluation of the family is necessary for completion of the program and achieving the child's personal goal, as are an experienced multidisciplinary team and the appropriate hospital setup, i.e., inpatient or outpatient.

  13. [Modern drug therapy of atrial fibrillation: selection of treatment strategy, antiarrhythmic preparations, and schemes of treatment].

    Science.gov (United States)

    Kanorskiĭ, S G

    2012-01-01

    This review presents novel literature data on drug treatment of atrial fibrillation. We discuss here choice of strategy of therapy, antiarrhythmic drugs, and algorithms of preventive measures aimed at prevention of recurrences of this arrhythmia.

  14. Selection of candidate wells and optimization of conformance treatment design in the Barrancas Field using a 3D conformance simulator

    Energy Technology Data Exchange (ETDEWEB)

    Crosta, Dante; Elitseche, Luis [Repsol YPF (Argentina); Gutierrez, Mauricio; Ansah, Joe; Everett, Don [Halliburton Argentina S.A., Buenos Aires (Argentina)

    2004-07-01

    Minimizing the amount of unwanted water production is an important goal at the Barrancas field. This paper describes a selection process for candidate injection wells that is part of a pilot conformance project aimed at improving vertical injection profiles, reducing water cut in producing wells, and improving ultimate oil recovery from this field. The well selection process is based on a review of the limited reservoir information available for this field to determine inter-well communications. The methodology focuses on the best use of available information, such as production and injection history, well intervention files, open hole logs and injectivity surveys. After the candidate wells were selected and potential water injection channels were identified, conformance treatment design and future performance of wells in the selected pilot area were evaluated using a new 3-D conformance simulator, developed specifically for optimization of the design and placement of unwanted fluid shut-off treatments. When acceptable history matching of the pilot area production was obtained, the 3-D simulator was used to: evaluate the required volume of the selected conformance treatment fluid; review expected pressures and rates during placement; model temperature behavior; evaluate placement techniques; and forecast water cut reduction and incremental oil recovery from the producers in this simulated section of the pilot area. This paper outlines a methodology for selecting candidate wells for conformance treatments. The method involves application of several engineering tools, an integral component of which is a user-friendly conformance simulator. The use of the simulator has minimized data preparation time and allows sensitivity cases to be run quickly to explore different possible scenarios that best represent the reservoir. The proposed methodology provides an efficient means of identifying conformance problems and designing optimized solutions for these individual

  15. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  16. Mathematical Model for the Selection of Processing Parameters in Selective Laser Sintering of Polymer Products

    Directory of Open Access Journals (Sweden)

    Ana Pilipović

    2014-03-01

    Full Text Available Additive manufacturing (AM) is increasingly applied in development projects from the initial idea to the finished product. The reasons are multiple, but what should be emphasised is the possibility of relatively rapid manufacturing of products of complicated geometry based on the computer 3D model of the product. There are numerous limitations, primarily in the number of available materials and their properties, which may be quite different from the properties of the material of the finished product. Therefore, it is necessary to know the properties of the product materials. In AM procedures the mechanical properties of materials are affected by the manufacturing procedure and the production parameters. During SLS procedures it is possible to adjust various manufacturing parameters which are used to influence the improvement of various mechanical and other properties of the products. The paper establishes a new mathematical model to determine the influence of individual manufacturing parameters on polymer products made by selective laser sintering. The old mathematical model is checked by a statistical method with a central composite plan, and it is established that the old model must be expanded with a new parameter, the beam overlay ratio. Verification of the new mathematical model and optimization of the processing parameters are carried out on an SLS machine.

  17. Selecting global climate models for regional climate change studies.

    Science.gov (United States)

    Pierce, David W; Barnett, Tim P; Santer, Benjamin D; Gleckler, Peter J

    2009-05-26

    Regional or local climate change modeling studies currently require starting with a global climate model, then downscaling to the region of interest. How should global models be chosen for such studies, and what effect do such choices have? This question is addressed in the context of a regional climate detection and attribution (D&A) study of January-February-March (JFM) temperature over the western U.S. Models are often selected for a regional D&A analysis based on the quality of the simulated regional climate. Accordingly, 42 performance metrics based on seasonal temperature and precipitation, the El Nino/Southern Oscillation (ENSO), and the Pacific Decadal Oscillation are constructed and applied to 21 global models. However, no strong relationship is found between the score of the models on the metrics and results of the D&A analysis. Instead, the importance of having ensembles of runs with enough realizations to reduce the effects of natural internal climate variability is emphasized. Also, the superiority of the multimodel ensemble average (MM) to any 1 individual model, already found in global studies examining the mean climate, is true in this regional study that includes measures of variability as well. Evidence is shown that this superiority is largely caused by the cancellation of offsetting errors in the individual global models. Results with both the MM and models picked randomly confirm the original D&A results of anthropogenically forced JFM temperature changes in the western U.S. Future projections of temperature do not depend on model performance until the 2080s, after which the better performing models show warmer temperatures.

  19. Multilevel selection in a resource-based model

    Science.gov (United States)

    Ferreira, Fernando Fagundes; Campos, Paulo R. A.

    2013-07-01

    In the present work we investigate the emergence of cooperation in a multilevel selection model that assumes limiting resources. Following the work by R. J. Requejo and J. Camacho [Phys. Rev. Lett. 108, 038701 (2012)], the interaction among individuals is initially ruled by a prisoner's dilemma (PD) game. The payoff matrix may change, influenced by the resource availability, and hence may also evolve to a non-PD game. Furthermore, one assumes that the population is divided into groups, whose local dynamics is driven by the payoff matrix, whereas an intergroup competition results from the nonuniformity of the growth rate of groups. We study the probability that a single cooperator can invade and establish in a population initially dominated by defectors. Cooperation is strongly favored when group sizes are small. We observe the existence of a critical group size beyond which cooperation becomes counterselected. Although the critical size depends on the parameters of the model, it is seen that a saturation value for the critical group size is achieved. The results conform to the thought that the evolutionary history of life repeatedly involved transitions from smaller selective units to larger selective units.

  20. Development of an odorant emission model for sewage treatment works.

    Science.gov (United States)

    Gostelow, P; Parsons, S A; Cobb, J

    2001-01-01

    In the field of odour assessment, much attention has been paid to the measurement of odour concentration. Whilst the concentration of an odour at a receptor is a useful indicator of annoyance, the concentration at the source tells only half the story. The emission rate - the product of odour concentration and air flow rate - is required to appreciate the significance of odour sources. Knowledge of emission rates allows odour sources to be ranked in terms of significance and facilitates appropriate selection and design of odour control units. The emission rate is also a key input for atmospheric dispersion models. Given the increasing importance of odour to sewage treatment works operators, there is a clear need for predictive methods for odour emission rates. Theory suggests that the emission of odorants from sewage to air is controlled by mass transfer resistances in both the gas and liquid phase. These are in turn controlled by odorant and emission source characteristics. The required odorant characteristics are largely known, and mass transfer from many different types of emission sources have been studied. Sewage treatment processes can be described by one or more of six characteristic emission sources, these being quiescent surfaces, channels, weirs and drop structures, diffused aeration, surface aeration and flow over media. This paper describes the development of odorant mass transfer models for these characteristic emission types. The models have been applied in the form of spreadsheet models to the prediction of H2S emissions and the results compared with commercial VOC emission models.
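
    Mass transfer across the sewage-air interface is commonly written with a two-film resistance formulation, and a minimal Python sketch of that calculation for a quiescent surface is given below. The coefficient values, Henry's constant, and area are illustrative assumptions, not values from the paper, which develops distinct sub-models for each of the six characteristic emission source types.

    def emission_rate(c_liquid, c_gas, k_l, k_g, henry, area):
        # Two-film estimate of an odorant emission rate from a quiescent surface:
        # the overall liquid-phase coefficient K_L combines liquid- and gas-side
        # resistances, and the flux is driven by departure from equilibrium.
        K_L = 1.0 / (1.0 / k_l + 1.0 / (henry * k_g))     # overall coefficient (m/s)
        flux = K_L * (c_liquid - c_gas / henry)           # g/(m^2 s)
        return flux * area                                # g/s

    # Illustrative H2S numbers (assumed, not taken from the paper):
    print(emission_rate(c_liquid=5e-3, c_gas=1e-6, k_l=5e-6, k_g=5e-3,
                        henry=0.4, area=200.0))           # grams per second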

  1. Specific heat treatment of selective laser melted Ti-6Al-4V for biomedical applications

    Science.gov (United States)

    Huang, Qianli; Liu, Xujie; Yang, Xing; Zhang, Ranran; Shen, Zhijian; Feng, Qingling

    2015-12-01

    The ductility of as-fabricated Ti-6Al-4V falls far short of the requirements for biomedical titanium alloy implants, and heat treatment remains the only applicable option for improving its mechanical properties. In the present study, the decomposition of as-fabricated martensite was investigated to provide a general understanding of the kinetics of its phase transformation. The decomposition of as-fabricated martensite was found to be slower than that of water-quenched martensite, indicating that a specific heat treatment strategy needs to be developed for as-fabricated Ti-6Al-4V. Three heat treatment strategies were proposed based on different phase transformation mechanisms and classified as subtransus treatment, supersolvus treatment and mixed treatment. These specific heat treatments were conducted on selective laser melted samples to investigate the evolution of microstructure and mechanical properties. The subtransus treatment led to a basket-weave structure without changing the morphology of the columnar prior β grains. The supersolvus treatment resulted in a lamellar structure and equiaxed β grains. The mixed treatment yielded a microstructure combining features of both the subtransus and supersolvus treatments. The subtransus treatment is found to be the best choice among these three strategies for as-fabricated Ti-6Al-4V to be used in biomedical implants.

  2. A Reliability Based Model for Wind Turbine Selection

    Directory of Open Access Journals (Sweden)

    A.K. Rajeevan

    2013-06-01

    Full Text Available A wind turbine generator's output at a specific site depends on many factors, particularly the cut-in, rated and cut-out wind speed parameters. Hence power output varies from turbine to turbine. The objective of this paper is to develop a mathematical relationship between reliability and wind power generation. The analytical computation of monthly wind power is obtained from a Weibull statistical model using the cubic mean cube root of wind speed. The reliability calculation is based on failure probability analysis. Many different types of wind turbines are commercially available in the market. From a reliability point of view, to obtain optimum reliability in power generation, it is desirable to select the wind turbine generator best suited to a site. The mathematical relationship developed in this paper can be used for reliability-based site-matching turbine selection.
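
    A minimal sketch of the expected-power step described above, assuming a piecewise power curve integrated against a Weibull wind-speed density; the cut-in/rated/cut-out speeds, rated power and Weibull parameters are illustrative placeholders, not values from the paper.

```python
import numpy as np

def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2000.0):
    """Piecewise power curve (kW): cubic rise between cut-in and rated wind speed."""
    v = np.asarray(v, dtype=float)
    p = np.zeros_like(v)
    rising = (v >= v_in) & (v < v_rated)
    flat = (v >= v_rated) & (v <= v_out)
    p[rising] = p_rated * (v[rising] ** 3 - v_in ** 3) / (v_rated ** 3 - v_in ** 3)
    p[flat] = p_rated
    return p

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

# Expected output by numerical integration of the power curve over the wind-speed distribution.
k, c = 2.0, 7.5                      # illustrative Weibull shape and scale
v = np.linspace(0.0, 30.0, 3001)
mean_power = np.trapz(turbine_power(v) * weibull_pdf(v, k, c), v)   # kW
monthly_energy = mean_power * 24 * 30 / 1000.0                      # MWh over a 30-day month
print(f"mean power ~ {mean_power:.0f} kW, monthly energy ~ {monthly_energy:.0f} MWh")
```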

  3. Applying a Hybrid MCDM Model for Six Sigma Project Selection

    Directory of Open Access Journals (Sweden)

    Fu-Kwun Wang

    2014-01-01

    Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects for reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our approach not only selects the best project, but can also be used to analyze the gaps between existing performance values and aspiration levels in each dimension and criterion, based on the influential network relation map.

  4. Refined homology model of monoacylglycerol lipase: toward a selective inhibitor

    Science.gov (United States)

    Bowman, Anna L.; Makriyannis, Alexandros

    2009-11-01

    Monoacylglycerol lipase (MGL) is primarily responsible for the hydrolysis of 2-arachidonoylglycerol (2-AG), an endocannabinoid with full agonist activity at both cannabinoid receptors. Increased tissue 2-AG levels consequent to MGL inhibition are considered therapeutic against pain, inflammation, and neurodegenerative disorders. However, the lack of MGL structural information has hindered the development of MGL-selective inhibitors. Here, we detail a fully refined homology model of MGL which preferentially identifies MGL inhibitors over druglike noninhibitors. We include for the first time insight into the active-site geometry and potential hydrogen-bonding interactions along with molecular dynamics simulations describing the opening and closing of the MGL helical-domain lid. Docked poses of both the natural substrate and known inhibitors are detailed. A comparison of the MGL active-site to that of the other principal endocannabinoid metabolizing enzyme, fatty acid amide hydrolase, demonstrates key differences which provide crucial insight toward the design of selective MGL inhibitors as potential drugs.

  5. Auditory-model based robust feature selection for speech recognition.

    Science.gov (United States)

    Koniaris, Christos; Kuropatwinski, Marcin; Kleijn, W Bastiaan

    2010-02-01

    It is shown that robust dimension-reduction of a feature set for speech recognition can be based on a model of the human auditory system. Whereas conventional methods optimize classification performance, the proposed method exploits knowledge implicit in the auditory periphery, inheriting its robustness. Features are selected to maximize the similarity of the Euclidean geometry of the feature domain and the perceptual domain. Recognition experiments using mel-frequency cepstral coefficients (MFCCs) confirm the effectiveness of the approach, which does not require labeled training data. For noisy data the method outperforms commonly used discriminant-analysis based dimension-reduction methods that rely on labeling. The results indicate that selecting MFCCs in their natural order results in subsets with good performance.

  6. POSSIBILISTIC SHARPE RATIO BASED NOVICE PORTFOLIO SELECTION MODELS

    Directory of Open Access Journals (Sweden)

    Rupak Bhattacharyya

    2013-02-01

    Full Text Available This paper uses the concept of possibilistic risk aversion to propose a new approach for portfolio selection in a fuzzy environment. Using possibility theory, the possibilistic mean, variance, standard deviation and risk premium of a fuzzy number are established. The possibilistic Sharpe ratio is defined as the ratio of the possibilistic risk premium to the possibilistic standard deviation of a portfolio. The Sharpe ratio is a measure of the performance of the portfolio relative to the risk taken: the higher the Sharpe ratio, the better the portfolio's performance and the greater the reward for the risk taken. New models of fuzzy portfolio selection considering the possibilistic Sharpe ratio, return and skewness of the portfolio are proposed. The feasibility and effectiveness of the proposed method are illustrated by a numerical example extracted from the Bombay Stock Exchange (BSE), India, and solved by a multiple objective genetic algorithm (MOGA).
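
    The possibilistic moments referred to above are often defined, for a triangular fuzzy return with core a and left/right spreads alpha and beta, following Carlsson and Fullér; the sketch below uses those common definitions (which may differ in detail from the paper's own) together with purely illustrative numbers.

```python
import numpy as np

def possibilistic_mean(a, alpha, beta):
    """Crisp possibilistic mean of a triangular fuzzy number (core a, spreads alpha, beta)."""
    return a + (beta - alpha) / 6.0

def possibilistic_variance(alpha, beta):
    """Possibilistic variance of a triangular fuzzy number."""
    return (alpha + beta) ** 2 / 24.0

def possibilistic_sharpe(weights, fuzzy_returns, risk_free=0.0):
    """Sharpe-type ratio for a portfolio whose asset returns are triangular fuzzy numbers."""
    w = np.asarray(weights, dtype=float)
    a, alpha, beta = (np.asarray(col, dtype=float) for col in zip(*fuzzy_returns))
    # For non-negative weights, cores and spreads combine linearly under fuzzy arithmetic.
    mean = possibilistic_mean(w @ a, w @ alpha, w @ beta)
    std = np.sqrt(possibilistic_variance(w @ alpha, w @ beta))
    return (mean - risk_free) / std

# Two hypothetical assets with triangular fuzzy annual returns (core, left spread, right spread).
fuzzy_returns = [(0.12, 0.04, 0.06), (0.08, 0.02, 0.03)]
print(f"possibilistic Sharpe ratio: {possibilistic_sharpe([0.6, 0.4], fuzzy_returns, 0.03):.3f}")
```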

  7. Mathematical Model of Computer Heat Treatment and Its Simulation

    Institute of Scientific and Technical Information of China (English)

    Pan Jiansheng; Zhang Weimin; Tian Dong; Gu Jianfeng; Hu Mingjuan

    2004-01-01

    Computer simulation of heat treatment is the foundation of intelligent heat treatment. Simulations of the temperature field, phase transformation, and stress/strain during complicated quenching operations were realized by using a three-dimensional non-linear finite element model together with treatment methods for abruptly changing interface conditions. The simulation results basically fit those measured in experiments. An intelligent sealed multipurpose furnace production line has been developed based on the combination of computer simulation of gaseous carburizing and computer control technology. More than 3000 batches of workpieces have been processed on this production line, and all are up to standard. The application of computer simulation technology can significantly improve the load-bearing capacity and reliability of nitrided and carburized workpieces, reduce heat treatment distortion, and shorten carburizing duration. It is recommended that reliable product design without redundancy be performed by combining the CAD of mechanical products, the CAE of materials selection and heat treatment, and dynamic evaluation technology for product reliability.

  8. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
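
    The agreement statistics quoted above (R2, NSE, RMSE) are straightforward to reproduce; the sketch below shows one way to compute them, with the manual and automated series being placeholders rather than the study's data.

```python
import numpy as np

def agreement_metrics(reference, estimate):
    """R^2 (squared Pearson correlation), Nash-Sutcliffe efficiency and RMSE of estimate vs. reference."""
    r = np.asarray(reference, dtype=float)
    e = np.asarray(estimate, dtype=float)
    r2 = np.corrcoef(r, e)[0, 1] ** 2
    nse = 1.0 - np.sum((r - e) ** 2) / np.sum((r - r.mean()) ** 2)
    rmse = np.sqrt(np.mean((r - e) ** 2))
    return r2, nse, rmse

# e.g. daily ET (mm/day) from manually vs. automatically selected endmember pixels
manual_et = [3.1, 4.2, 2.8, 5.0, 4.6, 3.9]
auto_et = [3.0, 4.4, 2.9, 4.8, 4.7, 3.8]
r2, nse, rmse = agreement_metrics(manual_et, auto_et)
print(f"R2={r2:.3f}  NSE={nse:.3f}  RMSE={rmse:.3f} mm/day")
```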

  9. A Dual-Stage Two-Phase Model of Selective Attention

    Science.gov (United States)

    Hubner, Ronald; Steinhauser, Marco; Lehle, Carola

    2010-01-01

    The dual-stage two-phase (DSTP) model is introduced as a formal and general model of selective attention that includes both an early and a late stage of stimulus selection. Whereas at the early stage information is selected by perceptual filters whose selectivity is relatively limited, at the late stage stimuli are selected more efficiently on a…

  10. glmulti: An R Package for Easy Automated Model Selection with (Generalized) Linear Models

    Directory of Open Access Journals (Sweden)

    Vincent Calcagno

    2010-10-01

    Full Text Available We introduce glmulti, an R package for automated model selection and multi-model inference with glm and related functions. From a list of explanatory variables, the provided function glmulti builds all possible unique models involving these variables and, optionally, their pairwise interactions. Restrictions can be specified for candidate models, by excluding specific terms, enforcing marginality, or controlling model complexity. Models are fitted with standard R functions like glm. The n best models and their support (e.g., (Q)AIC, (Q)AICc, or BIC) are returned, allowing model selection and multi-model inference through standard R functions. The package is optimized for large candidate sets by avoiding memory limitation, facilitating parallelization and providing, in addition to exhaustive screening, a compiled genetic algorithm method. This article briefly presents the statistical framework and introduces the package, with applications to simulated and real data.
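
    glmulti itself is an R package; purely as a language-neutral illustration of the same "fit every candidate model and rank by an information criterion" idea, the sketch below enumerates all predictor subsets for a Gaussian linear model and sorts them by AIC. The data and variable names are hypothetical, and none of this reproduces glmulti's actual interface.

```python
import itertools
import numpy as np

def aic_ols(X, y):
    """Gaussian AIC of an OLS fit; X must already contain an intercept column."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + 2 * (k + 1)   # +1 parameter for the error variance

def exhaustive_selection(X, y, names):
    """Fit every non-empty subset of columns and return (AIC, subset) pairs, best first."""
    n = X.shape[0]
    results = []
    for r in range(1, X.shape[1] + 1):
        for cols in itertools.combinations(range(X.shape[1]), r):
            design = np.column_stack([np.ones(n), X[:, cols]])
            results.append((aic_ols(design, y), [names[c] for c in cols]))
    return sorted(results)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=100)
for aic, subset in exhaustive_selection(X, y, ["x1", "x2", "x3", "x4"])[:3]:
    print(f"AIC={aic:8.2f}  predictors={subset}")
```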

  11. Hospital patients' perceptions during treatment and early discontinuation of serotonin selective reuptake inhibitor antidepressants.

    Science.gov (United States)

    Woolley, Stephen B; Fredman, Lisa; Goethe, John W; Lincoln, Alisa K; Heeren, Timothy

    2010-12-01

    Studies have suggested that discontinuation of treatment in depressed patients is associated with their perceptions about their treatment. We surveyed 403 adults treated for major depressive disorder with a selective serotonin reuptake inhibitor (SSRI) 3 months after onset of treatment to assess their interactions with clinicians, reasons they stopped SSRI treatment, and SSRI side effects (SEs). Bothersome SEs, poorer instruction by physicians about SSRI SEs, and self-reported change in depression, sex, marital status, and employment were significantly (P < 0.05) associated with discontinuation. Logistic regression examined the associations between patients' perceptions during treatment planning and SSRI discontinuation. Seventeen percent of patients felt uninvolved in treatment decisions, 9% disagreed with the diagnosis, and 24% subsequently stopped treatment. Elevated risk of discontinuation was found among patients who felt uninvolved in treatment decisions (unadjusted risk ratio [RR], 2.3; 95% confidence interval [CI], 1.2-4.3) and those who disagreed with the diagnosis (RR, 2.0; CI, 0.9-4.4). Patients who both felt uninvolved and disagreed with the diagnosis were 7-fold as likely to discontinue their SSRI (RR, 7.3; CI, 1.5-36.3) compared with those who felt neither uninvolved nor disagreed. Selective serotonin reuptake inhibitor SEs, specific interactions with clinicians, self-assessed outcomes, and sociodemographics did not explain these associations. To improve adherence to medications, clinicians should consider patients' perceptions about their involvement in treatment decisions and agreement with their diagnosis.

  12. Selection between foreground models for global 21-cm experiments

    CERN Document Server

    Harker, Geraint

    2015-01-01

    The precise form of the foregrounds for sky-averaged measurements of the 21-cm line during and before the epoch of reionization is unknown. We suggest that the level of complexity in the foreground models used to fit global 21-cm data should be driven by the data, under a Bayesian model selection methodology. A first test of this approach is carried out by applying nested sampling to simplified models of global 21-cm data to compute the Bayesian evidence for the models. If the foregrounds are assumed to be polynomials of order n in log-log space, we can infer the necessity to use n=4 rather than n=3 with <2h of integration with limited frequency coverage, for reasonable values of the n=4 coefficient. Using a higher-order polynomial does not necessarily prevent a significant detection of the 21-cm signal. Even for n=8, we can obtain very strong evidence distinguishing a reasonable model for the signal from a null model with 128h of integration. More subtle features of the signal may, however, be lost if the...
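
    The polynomial-in-log-log foreground and the evidence comparison described above can be written generically as follows (the symbols are standard notation, not the paper's): the foreground brightness temperature is expanded to order n, and two models are compared through the ratio of their Bayesian evidences.

```latex
\log_{10} T_{\mathrm{fg}}(\nu) \;=\; \sum_{i=0}^{n} a_i \left[\log_{10}\!\left(\frac{\nu}{\nu_0}\right)\right]^{i},
\qquad
Z_{\mathcal{M}} \;=\; \int \mathcal{L}(\mathbf{d}\mid\boldsymbol{\theta},\mathcal{M})\,
\pi(\boldsymbol{\theta}\mid\mathcal{M})\,\mathrm{d}\boldsymbol{\theta},
\qquad
K_{12} \;=\; \frac{Z_{\mathcal{M}_1}}{Z_{\mathcal{M}_2}}
```

    Nested sampling returns Z for each candidate (for example, n = 3 versus n = 4, or signal versus no-signal), and the Bayes factor K then quantifies how strongly the data prefer the more complex model.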

  13. Development of Solar Drying Model for Selected Cambodian Fish Species

    Directory of Open Access Journals (Sweden)

    Anna Hubackova

    2014-01-01

    Full Text Available Solar drying was investigated as one of the prospective techniques for fish processing in Cambodia. Solar drying was compared to conventional drying in an electric oven. Five typical Cambodian fish species were selected for this study. The mean solar drying temperature and drying air relative humidity were 55.6°C and 19.9%, respectively. The overall solar dryer efficiency was 12.37%, which is typical for natural convection solar dryers. The average evaporative capacity of the solar dryer was 0.049 kg·h−1. Based on the coefficient of determination (R2), chi-square (χ2) test, and root-mean-square error (RMSE), the most suitable models describing natural convection solar drying kinetics were the Logarithmic model (climbing perch and Nile tilapia), the Diffusion approximation model (swamp eel and walking catfish), and the Two-term model (Channa fish). In the case of electric oven drying, the Modified Page 1 model shows the best results for all investigated fish species except Channa fish, for which the Two-term model is the best. Sensory evaluation shows that the most preferred fish is climbing perch, followed by Nile tilapia and walking catfish. This study brings new knowledge about the drying kinetics of freshwater fish species in Cambodia and confirms solar drying as an acceptable technology for fish processing.

  14. Development of solar drying model for selected Cambodian fish species.

    Science.gov (United States)

    Hubackova, Anna; Kucerova, Iva; Chrun, Rithy; Chaloupkova, Petra; Banout, Jan

    2014-01-01

    Solar drying was investigated as one of the prospective techniques for fish processing in Cambodia. Solar drying was compared to conventional drying in an electric oven. Five typical Cambodian fish species were selected for this study. The mean solar drying temperature and drying air relative humidity were 55.6 °C and 19.9%, respectively. The overall solar dryer efficiency was 12.37%, which is typical for natural convection solar dryers. The average evaporative capacity of the solar dryer was 0.049 kg · h(-1). Based on the coefficient of determination (R(2)), chi-square (χ(2)) test, and root-mean-square error (RMSE), the most suitable models describing natural convection solar drying kinetics were the Logarithmic model (climbing perch and Nile tilapia), the Diffusion approximation model (swamp eel and walking catfish), and the Two-term model (Channa fish). In the case of electric oven drying, the Modified Page 1 model shows the best results for all investigated fish species except Channa fish, for which the Two-term model is the best. Sensory evaluation shows that the most preferred fish is climbing perch, followed by Nile tilapia and walking catfish. This study brings new knowledge about the drying kinetics of freshwater fish species in Cambodia and confirms solar drying as an acceptable technology for fish processing.
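
    The thin-layer drying models named in these two records are usually written in terms of the moisture ratio MR; the standard forms below are given for reference, with a, b, c, k, k0, k1 and n as fitted constants (the values fitted to the Cambodian fish data are not reproduced here).

```latex
MR = \frac{M_t - M_e}{M_0 - M_e},
\qquad \text{Logarithmic: } MR = a\,e^{-kt} + c,
\qquad \text{Two-term: } MR = a\,e^{-k_0 t} + b\,e^{-k_1 t},
\qquad \text{Diffusion approximation: } MR = a\,e^{-kt} + (1-a)\,e^{-kbt},
\qquad \text{Modified Page: } MR = e^{-(kt)^n}
```

    Here M_t, M_0 and M_e are the moisture contents at time t, initially, and at equilibrium; goodness of fit is then judged by R2, the chi-square statistic and RMSE, as in the records above.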

  15. Selection Strategies for Social Influence in the Threshold Model

    Science.gov (United States)

    Karampourniotis, Panagiotis; Szymanski, Boleslaw; Korniss, Gyorgy

    The ubiquity of online social networks makes the study of social influence extremely significant for its applications to marketing, politics and security. Maximizing the spread of influence by strategically selecting nodes as initiators of a new opinion or trend is a challenging problem. We study the performance of various strategies for selection of large fractions of initiators on a classical social influence model, the Threshold model (TM). Under the TM, a node adopts a new opinion only when the fraction of its first neighbors possessing that opinion exceeds a pre-assigned threshold. The strategies we study are of two kinds: strategies based solely on the initial network structure (Degree-rank, Dominating Sets, PageRank etc.) and strategies that take into account the change of the states of the nodes during the evolution of the cascade, e.g. the greedy algorithm. We find that the performance of these strategies depends largely on both the network structure properties, e.g. the assortativity, and the distribution of the thresholds assigned to the nodes. We conclude that the optimal strategy needs to combine the network specifics and the model specific parameters to identify the most influential spreaders. Supported in part by ARL NS-CTA, ARO, and ONR.
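
    As a concrete illustration of the initiator-selection problem described above, the sketch below runs a simple threshold cascade with a degree-rank strategy on a random graph; the graph size, thresholds and initiator fraction are arbitrary choices, not parameters from the study.

```python
import networkx as nx
import numpy as np

def threshold_cascade(G, initiators, thresholds):
    """Spread an opinion until no further node's active-neighbor fraction reaches its threshold."""
    active = set(initiators)
    changed = True
    while changed:
        changed = False
        for node in G.nodes():
            if node in active:
                continue
            neighbors = list(G.neighbors(node))
            if not neighbors:
                continue
            frac = sum(nb in active for nb in neighbors) / len(neighbors)
            if frac >= thresholds[node]:
                active.add(node)
                changed = True
    return active

rng = np.random.default_rng(1)
G = nx.erdos_renyi_graph(n=500, p=0.02, seed=1)
thresholds = {v: rng.uniform(0.2, 0.4) for v in G.nodes()}    # heterogeneous thresholds
by_degree = sorted(G.nodes(), key=G.degree, reverse=True)     # degree-rank strategy
initiators = by_degree[: int(0.05 * G.number_of_nodes())]     # top 5% of nodes seed the cascade
adopters = threshold_cascade(G, initiators, thresholds)
print(f"{len(adopters)} of {G.number_of_nodes()} nodes adopt the new opinion")
```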

  16. Selection of models to calculate the LLW source term

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, T.M. (Brookhaven National Lab., Upton, NY (United States))

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab.

  17. Quantum Model for the Selectivity Filter in K$^{+}$ Ion Channel

    CERN Document Server

    Cifuentes, A A

    2013-01-01

    In this work, we present a quantum transport model for the selectivity filter in the KcsA potassium ion channel. This model is fully consistent with the fact that two conduction pathways are involved in the translocation of ions through the filter, and we show that the presence of a second path may actually bring advantages for the filter as a result of quantum interference. To highlight interferences and resonances in the model, we consider the selectivity filter to be driven by a controlled time-dependent external field which changes the free energy scenario and consequently the conduction of the ions. In particular, we demonstrate that the two-pathway conduction mechanism is more advantageous for the filter when dephasing in the transient configurations is lower than in the main configurations. As a matter of fact, K$^+$ ions in the main configurations are highly coordinated by oxygen atoms of the filter backbone and this increases noise. Moreover, we also show that, for a wide range of driving frequencie...

  18. Continuum model for chiral induced spin selectivity in helical molecules

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Ernesto [Centro de Física, Instituto Venezolano de Investigaciones Científicas, 21827, Caracas 1020 A (Venezuela, Bolivarian Republic of); Groupe de Physique Statistique, Institut Jean Lamour, Université de Lorraine, 54506 Vandoeuvre-les-Nancy Cedex (France); Department of Chemistry and Biochemistry, Arizona State University, Tempe, Arizona 85287 (United States); González-Arraga, Luis A. [IMDEA Nanoscience, Cantoblanco, 28049 Madrid (Spain); Finkelstein-Shapiro, Daniel; Mujica, Vladimiro [Department of Chemistry and Biochemistry, Arizona State University, Tempe, Arizona 85287 (United States); Berche, Bertrand [Centro de Física, Instituto Venezolano de Investigaciones Científicas, 21827, Caracas 1020 A (Venezuela, Bolivarian Republic of); Groupe de Physique Statistique, Institut Jean Lamour, Université de Lorraine, 54506 Vandoeuvre-les-Nancy Cedex (France)

    2015-05-21

    A minimal model is exactly solved for electron spin transport on a helix. Electron transport is assumed to be supported by well oriented p_z type orbitals on base molecules forming a staircase of definite chirality. In a tight binding interpretation, the spin-orbit coupling (SOC) opens up an effective π_z − π_z coupling via interbase p_{x,y} − p_z hopping, introducing spin coupled transport. The resulting continuum model spectrum shows two Kramers doublet transport channels with a gap proportional to the SOC. Each doubly degenerate channel satisfies time reversal symmetry; nevertheless, a bias chooses a transport direction and thus selects for spin orientation. The model predicts (i) which spin orientation is selected depending on chirality and bias, (ii) changes in spin preference as a function of input Fermi level and (iii) back-scattering suppression protected by the SO gap. We compute the spin current with a definite helicity and find it to be proportional to the torsion of the chiral structure and the non-adiabatic Aharonov-Anandan phase. To describe room temperature transport, we assume that the total transmission is the result of a product of coherent steps.

  19. A Successive Selection Method for finite element model updating

    Science.gov (United States)

    Gou, Baiyong; Zhang, Weijie; Lu, Qiuhai; Wang, Bo

    2016-03-01

    Finite Element (FE) model can be updated effectively and efficiently by using the Response Surface Method (RSM). However, it often involves performance trade-offs such as high computational cost for better accuracy or loss of efficiency for lots of design parameter updates. This paper proposes a Successive Selection Method (SSM), which is based on the linear Response Surface (RS) function and orthogonal design. SSM rewrites the linear RS function into a number of linear equations to adjust the Design of Experiment (DOE) after every FE calculation. SSM aims to interpret the implicit information provided by the FE analysis, to locate the Design of Experiment (DOE) points more quickly and accurately, and thereby to alleviate the computational burden. This paper introduces the SSM and its application, describes the solution steps of point selection for DOE in detail, and analyzes SSM's high efficiency and accuracy in the FE model updating. A numerical example of a simply supported beam and a practical example of a vehicle brake disc show that the SSM can provide higher speed and precision in FE model updating for engineering problems than traditional RSM.

  20. Selection Experiments in the Penna Model for Biological Aging

    Science.gov (United States)

    Medeiros, G.; Idiart, M. A.; de Almeida, R. M. C.

    We consider the Penna model for biological aging to investigate correlations between early fertility and late life survival rates in populations at equilibrium. We consider inherited initial reproduction ages together with a reproduction cost translated in a probability that mother and offspring die at birth, depending on the mother age. For convenient sets of parameters, the equilibrated populations present genetic variability in what regards both genetically programmed death age and initial reproduction age. In the asexual Penna model, a negative correlation between early life fertility and late life survival rates naturally emerges in the stationary solutions. In the sexual Penna model, selection experiments are performed where individuals are sorted by initial reproduction age from the equilibrated populations and the separated populations are evolved independently. After a transient, a negative correlation between early fertility and late age survival rates also emerges in the sense that populations that start reproducing earlier present smaller average genetically programmed death age. These effects appear due to the age structure of populations in the steady state solution of the evolution equations. We claim that the same demographic effects may be playing an important role in selection experiments in the laboratory.

  1. Color Doppler flow imaging diagnosis and treatment selection for erectile dysfunction

    Institute of Scientific and Technical Information of China (English)

    XUAN Xu-jun; ZHANG Cai-xia; HUANG Jian; Rong Lu; SUN Peng; LIU Hai-nan

    2011-01-01

    Background Targeted therapy for erectile dysfunction (ED) involves fewer screening tests and provides a variety of treatment choices for patients. Although the advantage of targeted therapy in the diagnosis and treatment of ED has been recognized, a rational mode for oriented ED therapy has not been established. This study aimed to investigate targeted diagnosis and therapy for ED. Methods A total of 198 patients with ED were included in the study. After intracavernosal injection of a vasoactive agent, color Doppler flow imaging was performed and penile rigidity was classified as Schramek grade 5 (10 minutes duration), grade 4 (10 minutes duration), grade 3 and grade 2, defining four patient groups as group Ⅴ (143 cases), group Ⅳ (23 cases), group Ⅲ (18 cases), and group Ⅱ (14 cases). Appropriate and acceptable treatment was recommended to patients according to erection grade. Results In the 198 patients with ED, the peak systolic velocity, end diastolic velocity, and resistance index in the cavernosal artery and dorsal artery and the flow velocity in the deep dorsal vein were not significantly different before injection (P > 0.05). After injection, the peak systolic velocity, end diastolic velocity, and resistance index in the cavernosal artery differed among the four groups (P < 0.05). Between each pair of groups, the difference in resistance index was significant (P < 0.05); the differences in the other indexes were not significant (P > 0.05). Selective targeted therapy based on erection grade from color Doppler flow imaging improved the clinical satisfaction rate to 91.91% (182/198). Conclusions Building on the routine diagnosis of ED, blood flow indexes in the cavernosal artery are measured by color Doppler flow imaging following minimally invasive intracavernosal injection and combined with the Schramek grade of erection. The most appropriate and acceptable treatment is recommended according to the different groups, which improves the clinical satisfaction of treatment for

  2. A qualitative model structure sensitivity analysis method to support model selection

    Science.gov (United States)

    Van Hoey, S.; Seuntjens, P.; van der Kwast, J.; Nopens, I.

    2014-11-01

    The selection and identification of a suitable hydrological model structure is a more challenging task than fitting parameters of a fixed model structure to reproduce a measured hydrograph. The suitable model structure is highly dependent on various criteria, i.e. the modeling objective, the characteristics and the scale of the system under investigation and the available data. Flexible environments for model building are available, but need to be assisted by proper diagnostic tools for model structure selection. This paper introduces a qualitative method for model component sensitivity analysis. Traditionally, model sensitivity is evaluated for model parameters. In this paper, the concept is translated into an evaluation of model structure sensitivity. Similarly to the one-factor-at-a-time (OAT) methods for parameter sensitivity, this method varies the model structure components one at a time and evaluates the change in sensitivity towards the output variables. As such, the effect of model component variations can be evaluated towards different objective functions or output variables. The methodology is presented for a simple lumped hydrological model environment, introducing different possible model building variations. By comparing the effect of changes in model structure for different model objectives, model selection can be better evaluated. Based on the presented component sensitivity analysis of a case study, some suggestions with regard to model selection are formulated for the system under study: (1) a non-linear storage component is recommended, since it ensures more sensitive (identifiable) parameters for this component and less parameter interaction; (2) interflow is mainly important for the low flow criteria; (3) the excess infiltration process is most influential when focusing on the lower flows; (4) a simpler routing component is advisable; and (5) baseflow parameters have in general low sensitivity values, except for the low flow criteria.

  3. Estimation and variable selection for generalized additive partial linear models

    KAUST Repository

    Wang, Li

    2011-08-01

    We study generalized additive partial linear models, proposing the use of polynomial spline smoothing for estimation of nonparametric functions, and deriving quasi-likelihood based estimators for the linear parameters. We establish asymptotic normality for the estimators of the parametric components. The procedure avoids solving large systems of equations as in kernel-based procedures and thus results in gains in computational simplicity. We further develop a class of variable selection procedures for the linear parameters by employing a nonconcave penalized quasi-likelihood, which is shown to have an asymptotic oracle property. Monte Carlo simulations and an empirical example are presented for illustration. © Institute of Mathematical Statistics, 2011.

  4. Parametric pattern selection in a reaction-diffusion model.

    Directory of Open Access Journals (Sweden)

    Michael Stich

    Full Text Available We compare spot patterns generated by Turing mechanisms with those generated by replication cascades, in a model one-dimensional reaction-diffusion system. We determine the stability region of spot solutions in parameter space as a function of a natural control parameter (feed-rate), where degenerate patterns with different numbers of spots coexist for a fixed feed-rate. While it is possible to generate identical patterns via both mechanisms, we show that replication cascades lead to a wider choice of pattern profiles that can be selected through a tuning of the feed-rate, exploiting hysteresis and directionality effects of the different pattern pathways.
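
    The record does not spell out its equations, but a Gray-Scott-type two-species system is a common example of a one-dimensional reaction-diffusion model with a feed-rate control parameter and spot replication; the sketch below integrates such a system explicitly, with all parameter values chosen purely for illustration rather than taken from the paper.

```python
import numpy as np

def gray_scott_1d(f, k, du=2e-5, dv=1e-5, n=256, length=2.5, dt=1.0, steps=20000):
    """Explicit Euler integration of a Gray-Scott-type system on a periodic 1-D domain."""
    dx = length / n
    u = np.ones(n)
    v = np.zeros(n)
    mid = slice(n // 2 - 5, n // 2 + 5)
    u[mid], v[mid] = 0.5, 0.25              # local perturbation that can seed spot replication
    for _ in range(steps):
        lap_u = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
        lap_v = (np.roll(v, 1) - 2 * v + np.roll(v, -1)) / dx ** 2
        uvv = u * v * v
        u += dt * (du * lap_u - uvv + f * (1.0 - u))
        v += dt * (dv * lap_v + uvv - (f + k) * v)
    return u, v

u, v = gray_scott_1d(f=0.04, k=0.06)        # the feed-rate f plays the role of the control parameter
peaks = (v > np.roll(v, 1)) & (v > np.roll(v, -1)) & (v > 0.1)
print(f"spots (local maxima of v above 0.1): {int(peaks.sum())}")
```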

  5. The socio-political context of the application of fair selection models in the usa

    Directory of Open Access Journals (Sweden)

    G. K. Huysamen

    1996-06-01

    Full Text Available In an earlier article, the psychometrics of various fair selection models that had been proposed in the United States of America in the late 1960s and early 1970s were presented. The purpose of the present article is to discuss the subsequent history of the application of these models in personnel selection in that country and to consider its implications for the South African situation. Because the question of fair selection models ties in with the issue of affirmative action, a brief history of this issue as it pertains to personnel selection is also given. Key decisions of the American Supreme Court that have a bearing on this matter are also reviewed. The failure to widely apply these fair selection models may be attributed to the prevalent socio-political context, which favours the preferential treatment of certain groups but is hesitant to specify the particulars and limits of such treatment. Opsomming (Summary): A previous article dealt with the psychometrics underlying the various fair selection models proposed in the United States of America in the late 1960s and early 1970s. The aim of the present article is to provide an overview of the subsequent history of the application of those models in personnel selection in that country, and to highlight their implications for the South African situation. Because the matter of fair selection models is connected with the issue of affirmative action, a brief history of this issue as it applies to personnel selection is also provided. Key rulings of the American Supreme Court bearing on this matter are also considered. The limited application of these fair selection models can be attributed to the prevailing socio-political context, which favours the preferential treatment of particular groups but is hesitant to specify the particulars and limits of such treatment.

  6. Heroin epidemics, treatment and ODE modelling.

    Science.gov (United States)

    White, Emma; Comiskey, Catherine

    2007-07-01

    The UN [United Nations Office on Drugs and Crime (UNODC): World Drug Report, 2005, vol. 1: Analysis. UNODC, 2005.], EU [European Monitoring Centre for Drugs and Drug Addiction (EMCDDA): Annual Report, 2005.http://annualreport.emcdda.eu.int/en/home-en.html.] and WHO [World Health Organisation (WHO): Biregional Strategy for Harm Reduction, 2005-2009. HIV and Injecting Drug Use. WHO, 2005.] have consistently highlighted in recent years the ongoing and persistent nature of opiate and particularly heroin use on a global scale. While this is a global phenomenon, authors have emphasised the significant impact such an epidemic has on individual lives and on society. National prevalence studies have indicated the scale of the problem, but the drug-using career, typically consisting of initiation, habitual use, a treatment-relapse cycle and eventual recovery, is not well understood. This paper presents one of the first ODE models of opiate addiction, based on the principles of mathematical epidemiology. The aim of this model is to identify parameters of interest for further study, with a view to informing and assisting policy-makers in targeting prevention and treatment resources for maximum effectiveness. An epidemic threshold value, R(0), is proposed for the drug-using career. Sensitivity analysis is performed on R(0) and it is then used to examine the stability of the system. A condition under which a backward bifurcation may exist is found, as are conditions that permit the existence of one or more endemic equilibria. A key result arising from this model is that prevention is indeed better than cure.
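
    A generic three-compartment sketch of this class of model (susceptibles S, users not in treatment U1, users in treatment U2) is shown below; the functional forms and rate constants are illustrative only and are not the paper's exact equations or parameter values.

```python
import numpy as np
from scipy.integrate import solve_ivp

def heroin_model(t, y, Lambda, beta, p, r, delta, mu):
    """Generic susceptible / untreated-user / in-treatment dynamics (illustrative, not the paper's)."""
    S, U1, U2 = y
    N = S + U1 + U2
    initiation = beta * S * U1 / N       # new users recruited through contact with untreated users
    relapse = r * U2                     # relapse from treatment back to untreated use
    dS = Lambda - initiation - mu * S
    dU1 = initiation + relapse - (p + mu + delta) * U1   # p: rate of entering treatment
    dU2 = p * U1 - relapse - mu * U2
    return [dS, dU1, dU2]

params = (100.0, 0.3, 0.2, 0.1, 0.02, 0.01)   # Lambda, beta, p, r, delta, mu (illustrative)
sol = solve_ivp(heroin_model, (0.0, 200.0), y0=[9000.0, 900.0, 100.0], args=params)
print(sol.y[:, -1])                           # long-run compartment sizes
```

    A reproduction-style threshold for such a system can be read off from the linearization around the drug-free state, which is the role played by the R(0) discussed in the record.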

  7. Selective irradiation of radicals for biomedical treatment using vacuum ultraviolet light from an excimer lamp

    Science.gov (United States)

    Ono, Ryo; Tokumitsu, Yusuke; Zen, Shungo; Yonemori, Seiya

    2014-10-01

    In plasma medicine, radicals are considered to play important roles. However, the medical effect of each radical, such as OH and O, is unknown. To examine the effect of each radical, selective production of radicals is needed. We developed selective production of radicals for biomedical treatment using a vacuum ultraviolet (VUV) light emitted from an excimer lamp. Selective irradiation of OH radicals can be achieved by irradiating the 172-nm VUV light from a Xe2 excimer lamp to a humid helium flow in a quartz tube. The water molecules are strongly photodissociated by the VUV light to produce OH radicals. A photochemical simulation for the selective OH production is developed to calculate the OH density. The calculated OH density is compared with OH density measured using laser-induced fluorescence (LIF). Selective production of other radicals than OH is also discussed.

  8. Linear regression model selection using p-values when the model dimension grows

    CERN Document Server

    Pokarowski, Piotr; Teisseyre, Paweł

    2012-01-01

    We consider a new criterion-based approach to model selection in linear regression. Properties of selection criteria based on p-values of a likelihood ratio statistic are studied for families of linear regression models. We prove that such procedures are consistent, i.e., the minimal true model is chosen with probability tending to 1 even when the number of models under consideration increases slowly with the sample size. The simulation study indicates that the introduced methods perform promisingly when compared with the Akaike and Bayesian information criteria.
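
    As an illustration of the p-value comparison such criteria rest on, the sketch below computes the likelihood-ratio p-value for adding one predictor to a Gaussian linear model; the data are simulated and the paper's full selection procedure is not reproduced.

```python
import numpy as np
from scipy.stats import chi2

def gaussian_loglik(X, y):
    """Profile log-likelihood of an OLS fit (error variance profiled out)."""
    n = X.shape[0]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return -0.5 * n * (np.log(2 * np.pi * rss / n) + 1.0)

def lr_pvalue(X_small, X_big, y):
    """p-value of the likelihood-ratio test of a nested model against a larger one."""
    lr = 2.0 * (gaussian_loglik(X_big, y) - gaussian_loglik(X_small, y))
    df = X_big.shape[1] - X_small.shape[1]
    return chi2.sf(lr, df)

rng = np.random.default_rng(2)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.8 * x1 + rng.normal(scale=0.7, size=n)     # x2 is irrelevant by construction
X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([np.ones(n), x1, x2])
print(f"LR p-value for adding x2: {lr_pvalue(X_small, X_big, y):.3f}")
```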

  9. Decision criteria for the selection of wet oxidation and conventional biological treatment.

    Science.gov (United States)

    Collado, Sergio; Laca, Adriana; Diaz, Mario

    2012-07-15

    The suitability of wet oxidation or biological treatments for the degradation of industrial wastewaters is here discussed. Advantages of these operations, either singly or in combination, are discussed on the basis of previous experimental results from laboratory and industry. Decision diagrams for the selection of conventional biological treatment, wet oxidation or a combination of both techniques are suggested according to the type of pollutant, its concentration and the wastewater flow rate.

  10. Integrated modeling of ozonation for optimization of drinking water treatment

    NARCIS (Netherlands)

    van der Helm, A.W.C.

    2007-01-01

    As drinking water treatment plant automation becomes more sophisticated, more on-line monitoring systems become available and the integration of modeling environments with control systems becomes easier. This opens up possibilities for model-based optimization. In the operation of drinking water treatment

  11. Water quality modelling and optimisation of wastewater treatment ...

    African Journals Online (AJOL)

    Among these activities, wastewater treatment plays a crucial role. In this work, a Streeter-Phelps dissolved oxygen (DO) model is implemented in a ... The Olifants River catchment modelled in this study features 9 wastewater treatment plants.

  12. On the selection of ordinary differential equation models with application to predator-prey dynamical models.

    Science.gov (United States)

    Zhang, Xinyu; Cao, Jiguo; Carroll, Raymond J

    2015-03-01

    We consider model selection and estimation in a context where there are competing ordinary differential equation (ODE) models, and all the models are special cases of a "full" model. We propose a computationally inexpensive approach that employs statistical estimation of the full model, followed by a combination of a least squares approximation (LSA) and the adaptive Lasso. We show the resulting method, here called the LSA method, to be an (asymptotically) oracle model selection method. The finite sample performance of the proposed LSA method is investigated with Monte Carlo simulations, in which we examine the percentage of selecting true ODE models, the efficiency of the parameter estimation compared to simply using the full and true models, and coverage probabilities of the estimated confidence intervals for ODE parameters, all of which have satisfactory performances. Our method is also demonstrated by selecting the best predator-prey ODE to model a lynx and hare population dynamical system among some well-known and biologically interpretable ODE models.

  13. Prediction of Farmers’ Income and Selection of Model ARIMA

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Based on previous research on the prediction of farmers’ income and the data on per capita annual net income of rural households in the Henan Statistical Yearbook from 1979 to 2009, it is found that the time series of farmers’ income follows an I(2) non-stationary process. The order determination and identification of the model are achieved by adopting the correlogram-based analytical method of Box-Jenkins. On the basis of comparing a group of model properties with different parameters, the model ARIMA(4,2,2) is built. The test results show that the residual error of the selected model is white noise and follows a normal distribution, so the model can be used to predict farmers’ income. The model prediction indicates that income in rural households will continue to increase from 2009 to 2012, reaching 2 282.4, 2 502.9, 2 686.9 and 2 884.5, respectively. The growth rate will slow down, with weak sustainability.
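
    A sketch of fitting and forecasting an ARIMA(4,2,2) model with statsmodels is shown below; the income series is a synthetic I(2)-like stand-in, since the Henan Statistical Yearbook values are not reproduced in the record, so the fitted coefficients and forecasts will not match the paper's.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic I(2)-like series standing in for per-capita annual net income, 1979-2009 (31 points).
rng = np.random.default_rng(3)
income = 100 + np.cumsum(np.cumsum(rng.normal(2.0, 1.0, 31)))

model = ARIMA(income, order=(4, 2, 2)).fit()
print(model.params)              # AR and MA coefficient estimates
print(model.forecast(steps=4))   # forecasts for the next four years
```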

  14. BUILDING ROBUST APPEARANCE MODELS USING ON-LINE FEATURE SELECTION

    Energy Technology Data Exchange (ETDEWEB)

    PORTER, REID B. [Los Alamos National Laboratory; LOVELAND, ROHAN [Los Alamos National Laboratory; ROSTEN, ED [Los Alamos National Laboratory

    2007-01-29

    In many tracking applications, adapting the target appearance model over time can improve performance. This approach is most popular in high frame rate video applications where latent variables, related to the object's appearance (e.g., orientation and pose), vary slowly from one frame to the next. In these cases the appearance model and the tracking system are tightly integrated, and latent variables are often included as part of the tracking system's dynamic model. In this paper we describe our efforts to track cars in low frame rate data (1 frame/second) acquired from a highly unstable airborne platform. Due to the low frame rate, and poor image quality, the appearance of a particular vehicle varies greatly from one frame to the next. This leads us to a different problem: how can we build the best appearance model from all instances of a vehicle we have seen so far? The best appearance model should maximize the future performance of the tracking system, and maximize the chances of reacquiring the vehicle once it leaves the field of view. We propose an online feature selection approach to this problem and investigate the performance and computational trade-offs with a real-world dataset.

  15. A method of selecting acupoints for acupuncture treatment of peripheral facial paralysis by thermography.

    Science.gov (United States)

    Zhang, Dong

    2007-01-01

    The purpose of this study is to select acupoints for acupuncture treatment of peripheral facial paralysis according to the temperature on the face of the patient detected by thermography, so as to establish an objective acupoint selection method for acupuncture treatment. In the test group of 60 cases of facial paralysis, an infrared thermogram of the face was taken at the first visit, and acupuncture was then given for one therapeutic course at the acupoints on the affected side showing a temperature difference of over 0.5 degrees C from the healthy side; in each successive course the acupoints were re-determined according to the thermogram examination and treated until the end of the total therapeutic course. The 120 cases of the control group were treated with acupuncture at conventionally selected acupoints. The results showed that the cured and basically cured rate was 90.0% (54 cases) in the test group and 77.5% (93 cases) in the control group, a significant difference between the two groups. The acupoints selected by facial thermogram in the test group were, in order of frequency, Dicang (ST 4, 92.3%), Yingxiang (LI 20, 90.6%), Taiyang (EX-HN 5, 85.5%), Yangbai (GB 14, 76.6%), Quanliao (SI 18, 72.3%), and so on. In conclusion, for the treatment of facial paralysis, acupuncture at thermogram-selected acupoints is significantly superior to acupuncture at conventionally selected acupoints in terms of cure rate, length of therapeutic course and number of acupuncture sessions, and the thermogram-aided acupoint selection method is conducive to the objectivity and modernization of acupoint selection for acupuncture and moxibustion treatments.

  16. Stochastic group selection model for the evolution of altruism

    CERN Document Server

    Silva, A T C; Silva, Ana T. C.

    1999-01-01

    We study numerically and analytically a stochastic group selection model in which a population of asexually reproducing individuals, each of which can be either altruist or non-altruist, is subdivided into $M$ reproductively isolated groups (demes) of size $N$. The cost associated with being altruistic is modelled by assigning the fitness $1-\tau$, with $\tau \in [0,1]$, to the altruists and the fitness 1 to the non-altruists. In the case that the altruistic disadvantage $\tau$ is not too large, we show that the finite $M$ fluctuations are small and practically do not alter the deterministic results obtained for $M \to \infty$. However, for large $\tau$ these fluctuations greatly increase the instability of the altruistic demes to mutations. These results may be relevant to the dynamics of parasite-host systems and, in particular, to explain the importance of mutation in the evolution of parasite virulence.

  17. The Selection of ARIMA Models with or without Regressors

    DEFF Research Database (Denmark)

    Johansen, Søren; Riani, Marco; Atkinson, Anthony C.

    We develop a $C_{p}$ statistic for the selection of regression models with stationary and nonstationary ARIMA error term. We derive the asymptotic theory of the maximum likelihood estimators and show they are consistent and asymptotically Gaussian. We also prove that the distribution of the sum of squares of one step ahead standardized prediction errors, when the parameters are estimated, differs from the chi-squared distribution by a term which tends to infinity at a lower rate than $\chi_{n}^{2}$. We further prove that, in the prediction error decomposition, the term involving the sum ... to noise ratios. A new plot of our time series $C_{p}$ statistic is highly informative about the choice of model.

  18. On Model Specification and Selection of the Cox Proportional Hazards Model*

    OpenAIRE

    Lin, Chen-Yen; Halabi, Susan

    2013-01-01

    Prognosis plays a pivotal role in patient management and trial design. A useful prognostic model should correctly identify important risk factors and estimate their effects. In this article, we discuss several challenges in selecting prognostic factors and estimating their effects using the Cox proportional hazards model. Although it has a flexible semiparametric form, the Cox model is not entirely exempt from model misspecification. To minimize possible misspecification, instead of imposing tradi...

  19. Radial Domany-Kinzel models with mutation and selection

    Science.gov (United States)

    Lavrentovich, Maxim O.; Korolev, Kirill S.; Nelson, David R.

    2013-01-01

    We study the effect of spatial structure, genetic drift, mutation, and selective pressure on the evolutionary dynamics in a simplified model of asexual organisms colonizing a new territory. Under an appropriate coarse-graining, the evolutionary dynamics is related to the directed percolation processes that arise in voter models, the Domany-Kinzel (DK) model, contact process, and so on. We explore the differences between linear (flat front) expansions and the much less familiar radial (curved front) range expansions. For the radial expansion, we develop a generalized, off-lattice DK model that minimizes otherwise persistent lattice artifacts. With both simulations and analytical techniques, we study the survival probability of advantageous mutants, the spatial correlations between domains of neutral strains, and the dynamics of populations with deleterious mutations. “Inflation” at the frontier leads to striking differences between radial and linear expansions. For a colony with initial radius R0 expanding at velocity v, significant genetic demixing, caused by local genetic drift, occurs only up to a finite time t*=R0/v, after which portions of the colony become causally disconnected due to the inflating perimeter of the expanding front. As a result, the effect of a selective advantage is amplified relative to genetic drift, increasing the survival probability of advantageous mutants. Inflation also modifies the underlying directed percolation transition, introducing novel scaling functions and modifications similar to a finite-size effect. Finally, we consider radial range expansions with deflating perimeters, as might arise from colonization initiated along the shores of an island.

  20. Ultrastructural model for size selectivity in glomerular filtration.

    Science.gov (United States)

    Edwards, A; Daniels, B S; Deen, W M

    1999-06-01

    A theoretical model was developed to relate the size selectivity of the glomerular barrier to the structural characteristics of the individual layers of the capillary wall. Thicknesses and other linear dimensions were evaluated, where possible, from previous electron microscopic studies. The glomerular basement membrane (GBM) was represented as a homogeneous material characterized by a Darcy permeability and by size-dependent hindrance coefficients for diffusion and convection, respectively; those coefficients were estimated from recent data obtained with isolated rat GBM. The filtration slit diaphragm was modeled as a single row of cylindrical fibers of equal radius but nonuniform spacing. The resistances of the remainder of the slit channel, and of the endothelial fenestrae, to macromolecule movement were calculated to be negligible. The slit diaphragm was found to be the most restrictive part of the barrier. Because of that, macromolecule concentrations in the GBM increased, rather than decreased, in the direction of flow. Thus the overall sieving coefficient (ratio of Bowman's space concentration to that in plasma) was predicted to be larger for the intact capillary wall than for a hypothetical structure with no GBM. In other words, because the slit diaphragm and GBM do not act independently, the overall sieving coefficient is not simply the product of those for GBM alone and the slit diaphragm alone. Whereas the calculated sieving coefficients were sensitive to the structural features of the slit diaphragm and to the GBM hindrance coefficients, variations in GBM thickness or filtration slit frequency were predicted to have little effect. The ability of the ultrastructural model to represent fractional clearance data in vivo was at least equal to that of conventional pore models with the same number of adjustable parameters. The main strength of the present approach, however, is that it provides a framework for relating structural findings to the size

  1. Developing a conceptual model for selecting and evaluating online markets

    Directory of Open Access Journals (Sweden)

    Sadegh Feizollahi

    2013-04-01

    Full Text Available There is much evidence emphasizing the benefits of using new information and communication technologies in international business, and many believe that e-commerce can help satisfy customers' explicit and implicit requirements. Internet shopping is a concept developed after the introduction of electronic commerce. Information technology (IT) and its applications, specifically in the realm of the internet and e-mail, promoted the development of e-commerce in terms of advertising, motivation and information. Furthermore, with the development of new technologies, credit and financial exchange facilities were built into websites to facilitate e-commerce. The proposed study sends a total of 200 questionnaires to the target group (teachers, students, professionals and managers of commercial web sites) and collects 130 questionnaires for final evaluation. Cronbach's alpha is used to measure the reliability and evaluate the validity of the measurement instruments (questionnaires), and confirmatory factor analysis is employed to assure construct validity. In addition, in order to analyze the research questions based on the path analysis method and to determine market selection models, a regular technique is implemented. In the present study, after examining different aspects of e-commerce, we provide a conceptual model for selecting and evaluating online markets in Iran. These findings provide a consistent, targeted and holistic framework for the development of the internet market in the country.

  2. A Network Analysis Model for Selecting Sustainable Technology

    Directory of Open Access Journals (Sweden)

    Sangsung Park

    2015-09-01

    Full Text Available Most companies develop technologies to improve their competitiveness in the marketplace. Typically, they then patent these technologies around the world in order to protect their intellectual property. Other companies may use patented technologies to develop new products, but must pay royalties to the patent holders or owners. Should they fail to do so, this can result in legal disputes in the form of patent infringement actions between companies. To avoid such situations, companies attempt to research and develop necessary technologies before their competitors do so. An important part of this process is analyzing existing patent documents in order to identify emerging technologies. In such analyses, extracting sustainable technology from patent data is important, because sustainable technology drives technological competition among companies and, thus, the development of new technologies. In addition, selecting sustainable technologies makes it possible to plan their R&D (research and development) efficiently. In this study, we propose a network model that can be used to select sustainable technology from patent documents, based on degree and centrality measures from social network analysis. To verify the performance of the proposed model, we carry out a case study using actual patent data from patent databases.
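
    The social-network-analysis step the record describes can be illustrated with a keyword co-occurrence network ranked by degree centrality; the patent keyword sets below are invented purely for the example.

```python
import itertools
import networkx as nx

# Hypothetical keyword sets extracted from four patent documents.
patents = [
    {"solar cell", "thin film", "perovskite"},
    {"thin film", "deposition", "perovskite"},
    {"battery", "anode", "thin film"},
    {"perovskite", "stability", "solar cell"},
]

G = nx.Graph()
for keywords in patents:
    for a, b in itertools.combinations(sorted(keywords), 2):
        # Edge weight counts how many patents the two keywords co-occur in.
        weight = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=weight + 1)

centrality = nx.degree_centrality(G)
for keyword, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{keyword:12s} degree centrality = {score:.2f}")
```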

  3. Selection of antibiotic-resistant standard plate count bacteria during water treatment.

    Science.gov (United States)

    Armstrong, J L; Calomiris, J J; Seidler, R J

    1982-08-01

    Standard plate count (SPC) bacteria were isolated from a drinking-water treatment facility and from the river supplying the facility. All isolates were identified and tested for their resistance to six antibiotics to determine if drug-resistant bacteria were selected for as a consequence of water treatment. Among the isolates surviving our test procedures, there was a significant selection (P less than 0.05) of gram-negative SPC organisms resistant to two or more of the test antibiotics. These bacteria were isolated from the flash mix tank, where chlorine, alum, and lime are added to the water. Streptomycin resistance in particular was more frequent in this population as compared with bacteria in the untreated river water (P less than 0.01). SPC bacteria from the clear well, which is a tank holding the finished drinking water at the treatment facility, were also more frequently antibiotic resistant than were the respective river water populations. When 15.8 and 18.2% of the river water bacteria were multiply antibiotic resistant, 57.1 and 43.5%, respectively, of the SPC bacteria in the clear well were multiply antibiotic resistant. Selection for bacteria exhibiting resistance to streptomycin was achieved by chlorinating river water in the laboratory. We concluded that the selective factors operating in the aquatic environment of a water treatment facility can act to increase the proportion of antibiotic-resistant members of the SPC bacterial population in treated drinking water.

  4. Critical appraisal of clinical prediction rules that aim to optimize treatment selection for musculoskeletal conditions

    NARCIS (Netherlands)

    T.R. Stanton (Tasha); M.J. Hancock (Mark J.); C. Maher (Chris); B.W. Koes (Bart)

    2010-01-01

    textabstractBackground. Clinical prediction rules (CPRs) for treatment selection in musculoskeletal conditions have become increasingly popular. Purpose. The purposes of this review are: (1) to critically appraise studies evaluating CPRs and (2) to consider the clinical utility and stage of developm

  5. Threat Related Selective Attention Predicts Treatment Success in Childhood Anxiety Disorders

    Science.gov (United States)

    Legerstee, Jeroen S.; Tulen, Joke H. M.; Kallen, Victor L.; Dieleman, Gwen C.; Treffers, Philip D. A.; Verhulst, Frank C.; Utens, Elisabeth M. W. J.

    2009-01-01

    Threat-related selective attention was found to predict the success of the treatment of childhood anxiety disorders through administering a pictorial dot-probe task to 131 children with anxiety disorders prior to cognitive behavioral therapy. The diagnostic status of the subjects was evaluated with a semistructured clinical interview at both pre-…

  6. Effects of Behavioral Skills Training on Parental Treatment of Children's Food Selectivity

    Science.gov (United States)

    Seiverling, Laura; Williams, Keith; Sturmey, Peter; Hart, Sadie

    2012-01-01

    We used behavioral skills training to teach parents of 3 children with autism spectrum disorder and food selectivity to conduct a home-based treatment package that consisted of taste exposure, escape extinction, and fading. Parent performance following training improved during both taste sessions and probe meals and was reflected in increases in…

  7. [Selection of Suitable Microalgal Species for Sorption of Uranium in Radioactive Wastewater Treatment].

    Science.gov (United States)

    Li, Xin; Hu, Hong-ying; Yu, Jun-yi; Zhao, Wen-yu

    2016-05-15

    The amount of radioactive wastewater discharge was increasing year by year, with the quick development of nuclear industry. Therefore, the proper treatment and disposal of radioactive wastewater are essentially important for environmental safety and human health. Microalgal biosorption of nuclide has drawn much attention in the area of radioactive wastewater treatment recently, and the selection of a proper microalgal species for uranium biosorption is the basis for the research and application of this technology. The selection principle was set up from the view of practical application, and 11 species of microalgae were prepared for the selection work. Scenedesmus sp. LX1 has the highest biosorption capacity of 40.7 mg · g⁻¹ for uranium; and its biomass production in mBG11 medium (simulating the nitrogen and phosphorus limits in the first-class A discharge standard of pollutants for municipal wastewater treatment plant) was 0.32 g · L⁻¹, which was relatively high among the 11 microalgal species; when grown into stable phase it also showed a good precipitation capability with the precipitation ratio of 45.3%. Above all, in our selection range of the 11 microalgal species, Scenedesmus sp. LX1 could be considered as the suitable species for uranium biosorption in radioactive wastewater treatment.

  8. Selective treatment of early acute rejection after liver transplantation: Effects on liver, infection rate, and outcome

    NARCIS (Netherlands)

    Klompmaker, IJ; Gouw, ASH; Haagsma, EB; TenVergert, EM; Verwer, R; Slooff, MJH

    1997-01-01

    To evaluate the results of selective treatment of biopsy-proven mild acute rejection episodes, we retrospectively studied 1-week liver biopsies of 103 patients with a primary liver graft in relation to liver function tests. The overall incidence of rejection was 35 %. In four patients the biopsy sho

  9. A CONCEPTUAL MODEL FOR IMPROVED PROJECT SELECTION AND PRIORITISATION

    Directory of Open Access Journals (Sweden)

    P. J. Viljoen

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Project portfolio management processes are often designed and operated as a series of stages (or project phases) and gates. However, the flow of such a process is often slow, characterised by queues waiting for a gate decision and by repeated work from previous stages waiting for additional information or for re-processing. In this paper the authors propose a conceptual model that applies supply chain and constraint management principles to the project portfolio management process. An advantage of the proposed model is that it provides the ability to select and prioritise projects without undue changes to project schedules. This should result in faster flow through the system.

  10. Bayesian Model Selection With Network Based Diffusion Analysis

    Directory of Open Access Journals (Sweden)

    Andrew eWhalen

    2016-04-01

    Full Text Available A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike Information Criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed.
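
    To make the WAIC-based comparison concrete, the sketch below computes WAIC from a matrix of pointwise log-likelihoods evaluated at posterior samples; the matrices are simulated stand-ins for the output of a Bayesian NBDA fit.

```python
import numpy as np

def waic(log_lik: np.ndarray) -> float:
    """WAIC from an (n_posterior_samples, n_observations) log-likelihood matrix."""
    # log pointwise predictive density: log of the mean likelihood per observation
    lppd = np.log(np.exp(log_lik).mean(axis=0)).sum()
    # effective number of parameters: variance of the log-likelihood per observation
    p_waic = log_lik.var(axis=0, ddof=1).sum()
    return -2.0 * (lppd - p_waic)

# Hypothetical log-likelihood matrices for a social and an asocial learning model
rng = np.random.default_rng(1)
ll_social = rng.normal(-1.0, 0.1, size=(2000, 50))
ll_asocial = rng.normal(-1.3, 0.1, size=(2000, 50))
print("WAIC social :", waic(ll_social))
print("WAIC asocial:", waic(ll_asocial))   # lower WAIC indicates better expected fit
```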

  11. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    Science.gov (United States)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and corresponding heat treatment is mainly based on trial and error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case training of the feedforward neural network with error backpropagation training scheme and four layers of neurons (8-20-20-2) scheme was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found as the most beneficial alloying element increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. Optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
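
    A minimal sketch of this kind of composition-and-process-to-property regression, using a feedforward network with two hidden layers of 20 neurons to echo the 8-20-20-2 topology; the inputs, outputs and training data below are hypothetical placeholders, not the authors' tempering diagrams.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# 8 inputs: C, Si, Mn, Cr, Mo, V content plus austenitizing and tempering temperature
# 2 outputs: hardness (HRC) and fracture toughness (KIc) -- hypothetical data
rng = np.random.default_rng(2)
X = rng.uniform(size=(200, 8))
y = np.column_stack([45 + 10 * X[:, 0] - 5 * X[:, 7] + rng.normal(0, 1, 200),
                     60 - 20 * X[:, 0] + 15 * X[:, 5] + rng.normal(0, 2, 200)])

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                                   random_state=0))
model.fit(X, y)

# Predict both properties for one candidate composition / heat treatment
print(model.predict(X[:1]))
```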

  12. Surrogate formulations for thermal treatment of low-level mixed waste, Part II: Selected mixed waste treatment project waste streams

    Energy Technology Data Exchange (ETDEWEB)

    Bostick, W.D.; Hoffmann, D.P.; Chiang, J.M.; Hermes, W.H.; Gibson, L.V. Jr.; Richmond, A.A. [Martin Marietta Energy Systems, Inc., Oak Ridge, TN (United States); Mayberry, J. [Science Applications International Corp., Idaho Falls, ID (United States); Frazier, G. [Univ. of Tennessee, Knoxville, TN (United States)

    1994-01-01

    This report summarizes the formulation of surrogate waste packages, representing the major bulk constituent compositions for 12 waste stream classifications selected by the US DOE Mixed Waste Treatment Program. These waste groupings include: neutral aqueous wastes; aqueous halogenated organic liquids; ash; high organic content sludges; adsorbed aqueous and organic liquids; cement sludges, ashes, and solids; chloride; sulfate, and nitrate salts; organic matrix solids; heterogeneous debris; bulk combustibles; lab packs; and lead shapes. Insofar as possible, formulation of surrogate waste packages are referenced to authentic wastes in inventory within the DOE; however, the surrogate waste packages are intended to represent generic treatability group compositions. The intent is to specify a nonradiological synthetic mixture, with a minimal number of readily available components, that can be used to represent the significant challenges anticipated for treatment of the specified waste class. Performance testing and evaluation with use of a consistent series of surrogate wastes will provide a means for the initial assessment (and intercomparability) of candidate treatment technology applicability and performance. Originally the surrogate wastes were intended for use with emerging thermal treatment systems, but use may be extended to select nonthermal systems as well.

  13. Selection and utilization of assessment instruments in substance abuse treatment trials: the National Drug Abuse Treatment Clinical Trials Network experience

    Directory of Open Access Journals (Sweden)

    Rosa C

    2012-07-01

    Full Text Available Carmen Rosa, Udi Ghitza, Betty Tai (Center for the Clinical Trials Network, National Institute on Drug Abuse, Bethesda, MD, USA). Abstract: Based on recommendations from a US Institute of Medicine report, the National Institute on Drug Abuse established the National Drug Abuse Treatment Clinical Trials Network (CTN) in 1999 to accelerate the translation of science-based addiction treatment research into community-based practice and to improve the quality of addiction treatment, using science as the vehicle. One of the CTN's primary tasks is to serve as a platform for bi-directional communication and collaboration between providers and scientists, enhancing the relevance of research so that it generates empirical results that impact practice. Among the many obstacles in moving research into real-world settings, this commentary mainly describes challenges and iterative experiences in how the CTN develops its research protocols, with a focus on how CTN study teams select and utilize assessment instruments that can reasonably balance the interests of both research scientists and practicing providers when applied in CTN trials. The commentary also discusses the process by which the CTN selects a core set of common assessment instruments that may be applied across all trials, to allow easier cross-study analyses of comparable data. Keywords: addiction, assessment, drug abuse treatment, drug dependence, NIDA Clinical Trials Network, substance use disorder

  14. On the selection of optimization parameters for an inverse treatment planning replacement of a forward planning technique for prostate cancer.

    Science.gov (United States)

    Hristov, Dimitre H; Moftah, Belal A; Charrois, Colette; Parker, William; Souhami, Luis; Podgorsak, Ervin B

    2002-01-01

    The influence of organ volume sampling, lateral scatter inclusion, and the selection of objectives and constraints on the inverse treatment planning process with a commercial treatment planning system is investigated and suitable parameters are identified for an inverse treatment planning replacement of a clinical forward planning technique for prostate cancer. For the beam geometries of the forward technique, a variable set of parameters is used for the calculation of dose from pencil beams. An optimal set is identified after the evaluation of optimized plans that correspond to different sets of pencil-beam parameters. This set along with a single, optimized set of objectives and constraints is used to perform inverse planning on ten randomly selected patients. The acceptability of the resulting plans is verified by comparisons to the clinical ones calculated with the forward techniques. For the particular commercial treatment planning system, the default values of the pencil beam parameters are found adequate for inverse treatment planning. For all ten patients, the optimized, single set of objectives and constraints results in plans with target coverage comparable to that of the forward plans. Furthermore inverse treatment planning reduces the overall mean rectal and bladder doses by 4.8% and 5.8% of the prescription dose respectively. The study indicates that (i) inverse treatment planning results depend implicitly on the sampling of the dose distribution, (ii) inverse treatment planning results depend on the method used by the dose calculation model to account for scatter, and (iii) for certain sites, a single set of optimization parameters can be used for all patient plans.

  15. An automation model of Effluent Treatment Plant

    Directory of Open Access Journals (Sweden)

    Luiz Alberto Oliveira Lima Roque

    2012-07-01

    on the conservation of water resources, this paper aims to propose an automation model of an Effluent Treatment Plant, using Ladder programming language and supervisory systems.

  16. Causal Inference and Model Selection in Complex Settings

    Science.gov (United States)

    Zhao, Shandong

    Propensity score methods have become a part of the standard toolkit for applied researchers who wish to ascertain causal effects from observational data. While they were originally developed for binary treatments, several researchers have proposed generalizations of the propensity score methodology for non-binary treatment regimes. In this article, we first review three main methods that generalize propensity scores in this direction, namely, inverse propensity weighting (IPW), the propensity function (P-FUNCTION), and the generalized propensity score (GPS), along with recent extensions of the GPS that aim to improve its robustness. We compare the assumptions, theoretical properties, and empirical performance of these methods. We propose three new methods that provide robust causal estimation based on the P-FUNCTION and GPS. While our proposed P-FUNCTION-based estimator performs well, we generally advise caution in that all available methods can be biased by model misspecification and extrapolation. In a related line of research, we consider adjustment for posttreatment covariates in causal inference. Even in a randomized experiment, observations might have different compliance performance under treatment and control assignment. This posttreatment covariate cannot be adjusted using standard statistical methods. We review the principal stratification framework which allows for modeling this effect as part of its Bayesian hierarchical models. We generalize the current model to add the possibility of adjusting for pretreatment covariates. We also propose a new estimator of the average treatment effect over the entire population. In a third line of research, we discuss the spectral line detection problem in high energy astrophysics. We carefully review how this problem can be statistically formulated as a precise hypothesis test with point null hypothesis, why a usual likelihood ratio test does not apply for problems of this nature, and a doable fix to correctly
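
    As a minimal illustration of one of the reviewed approaches, the sketch below estimates a generalized propensity score for a continuous treatment by modeling the treatment given covariates as Gaussian and then including the score in an outcome regression; the data and functional forms are simulated placeholders, not the article's estimators.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 3))                               # covariates
t = X @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=n)   # continuous treatment
y = 2.0 * t + X @ np.array([1.0, 1.0, -0.5]) + rng.normal(size=n)  # outcome

# Step 1: model treatment given covariates; the GPS is the conditional density at t
t_model = LinearRegression().fit(X, t)
resid = t - t_model.predict(X)
sigma = resid.std(ddof=X.shape[1] + 1)
gps = norm.pdf(t, loc=t_model.predict(X), scale=sigma)

# Step 2: outcome regression on treatment and GPS (simple linear specification)
design = np.column_stack([t, gps])
out_model = LinearRegression().fit(design, y)
print("coefficient on treatment:", out_model.coef_[0])
```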

  17. Xamoterol, a new selective beta-1-adrenoceptor partial agonist, in the treatment of postural hypotension

    DEFF Research Database (Denmark)

    Mehlsen, J; Trap-Jensen, J

    1986-01-01

    Three patients severely disabled from postural hypotension were treated with xamoterol, a selective beta-1-adrenoceptor antagonist with a high degree of partial agonist activity. Oral treatment (200 mg b.i.d.) was chosen on the basis of the effects of acute intravenous administration of xamoterol...... and pindolol, a non-selective beta-adrenoceptor antagonist with partial agonist activity. In these patients pindolol had a predominantly antagonist effect, whereas xamoterol had a predominantly agonist effect after intravenous administration. Oral treatment was carried out with placebo control in a single......, supine). During the placebo period (2 weeks) heart rate decreased to pretreatment levels and mean blood pressure was reduced by only 14 mmHg. The patients reported substantial improvement in their condition during active medication. Xamoterol seems to be a useful alternative in the treatment of postural...

  18. Ethical selection on liver transplantation and abandoning treatment for hepatocellular carcinoma in China.

    Science.gov (United States)

    Xia, D; Zuo, H-Q; Quan, Y; Dong, H-L; Xu, L

    2011-09-01

    Orthotopic liver transplantation (OLT) has evolved in China over three decades, emerging as the mainstay treatment for hepatocellular carcinoma (HCC). Some Chinese transplantation centers have begun offering OLT for selected patients with HCC exceeding Milan criteria. However, this still remains a controversial subject. In this article, we have weighed arguments for and against OLT for advanced HCC. Meanwhile, the development of OLT for HCC in China has raised problems, mainly focused on ethical and moral concerns. Postmodern philosophy and ethics, particularly the life value theory, shall be the theoretical support to the concept of abandoning treatment. In China, ethical selection for OLT and abandoning treatment for HCC must be made justly and prudently. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Bayesian model selection in hydrogeophysics: Application to conceptual subsurface models of the South Oyster Bacterial Transport Site, Virginia, USA

    Science.gov (United States)

    Brunetti, Carlotta; Linde, Niklas; Vrugt, Jasper A.

    2017-04-01

    Geophysical data can help to discriminate among multiple competing subsurface hypotheses (conceptual models). Here, we explore the merits of Bayesian model selection in hydrogeophysics using crosshole ground-penetrating radar data from the South Oyster Bacterial Transport Site in Virginia, USA. Implementation of Bayesian model selection requires computation of the marginal likelihood of the measured data, or evidence, for each conceptual model being used. In this paper, we compare three different evidence estimators, including (1) the brute force Monte Carlo method, (2) the Laplace-Metropolis method, and (3) the numerical integration method proposed by Volpi et al. (2016). The three types of subsurface models that we consider differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. Our results demonstrate that all three estimators provide equivalent results in low parameter dimensions, yet in higher dimensions the brute force Monte Carlo method is inefficient. The isotropic multi-Gaussian model is most supported by the travel time data with Bayes factors that are larger than 10^100 compared to conceptual models that assume horizontal or vertical layering of the porosity field.
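
    The simplest of the three evidence estimators, brute-force Monte Carlo, amounts to drawing parameters from the prior and averaging the likelihood; the toy Gaussian model below is only a stand-in for the radar travel-time forward models used at the site.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
data = rng.normal(0.5, 1.0, size=20)        # observed data (toy example)

def log_likelihood(mu: float) -> float:
    return norm.logpdf(data, loc=mu, scale=1.0).sum()

# Brute-force Monte Carlo estimate of the evidence p(D) = integral of p(D|theta) p(theta)
n_draws = 20_000
prior_draws = rng.normal(0.0, 2.0, size=n_draws)        # prior on mu: N(0, 2^2)
log_liks = np.array([log_likelihood(mu) for mu in prior_draws])
# log-mean-exp for numerical stability
log_evidence = np.logaddexp.reduce(log_liks) - np.log(n_draws)
print("log evidence:", log_evidence)
```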

  2. Multicriteria decision group model for the selection of suppliers

    Directory of Open Access Journals (Sweden)

    Luciana Hazin Alencar

    2008-08-01

    Full Text Available Several authors have been studying group decision making over the years, which indicates how relevant it is. This paper presents a multicriteria group decision model based on ELECTRE IV and VIP Analysis methods, to those cases where there is great divergence among the decision makers. This model includes two stages. In the first, the ELECTRE IV method is applied and a collective criteria ranking is obtained. In the second, using criteria ranking, VIP Analysis is applied and the alternatives are selected. To illustrate the model, a numerical application in the context of the selection of suppliers in project management is used. The suppliers that form part of the project team have a crucial role in project management. They are involved in a network of connected activities that can jeopardize the success of the project, if they are not undertaken in an appropriate way. The question tackled is how to select service suppliers for a project on behalf of an enterprise that assists the multiple objectives of the decision-makers.

  3. Improving permafrost distribution modelling using feature selection algorithms

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2016-04-01

    The availability of an increasing number of spatial data on the occurrence of mountain permafrost allows the employment of machine learning (ML) classification algorithms for modelling the distribution of the phenomenon. One of the major problems when dealing with high-dimensional dataset is the number of input features (variables) involved. Application of ML classification algorithms to this large number of variables leads to the risk of overfitting, with the consequence of a poor generalization/prediction. For this reason, applying feature selection (FS) techniques helps simplifying the amount of factors required and improves the knowledge on adopted features and their relation with the studied phenomenon. Moreover, taking away irrelevant or redundant variables from the dataset effectively improves the quality of the ML prediction. This research deals with a comparative analysis of permafrost distribution models supported by FS variable importance assessment. The input dataset (dimension = 20-25, 10 m spatial resolution) was constructed using landcover maps, climate data and DEM derived variables (altitude, aspect, slope, terrain curvature, solar radiation, etc.). It was completed with permafrost evidences (geophysical and thermal data and rock glacier inventories) that serve as training permafrost data. Used FS algorithms informed about variables that appeared less statistically important for permafrost presence/absence. Three different algorithms were compared: Information Gain (IG), Correlation-based Feature Selection (CFS) and Random Forest (RF). IG is a filter technique that evaluates the worth of a predictor by measuring the information gain with respect to the permafrost presence/absence. Conversely, CFS is a wrapper technique that evaluates the worth of a subset of predictors by considering the individual predictive ability of each variable along with the degree of redundancy between them. Finally, RF is a ML algorithm that performs FS as part of its
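
    A compact sketch of two of the feature-scoring ideas compared above: a filter score based on mutual information (closely related to information gain) and Random Forest impurity importances; the predictors and labels are simulated placeholders for the real terrain and climate variables.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif

# Simulated presence/absence data with 20 candidate predictors, only a few informative
X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           n_redundant=3, random_state=0)

# Filter approach: mutual information between each predictor and the label
mi = mutual_info_classif(X, y, random_state=0)

# Embedded approach: Random Forest impurity-based importances
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

for name, scores in [("mutual info", mi), ("RF importance", rf.feature_importances_)]:
    top = np.argsort(scores)[::-1][:5]
    print(name, "top features:", top)
```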

  4. A Model for Selection of Eyespots on Butterfly Wings.

    Directory of Open Access Journals (Sweden)

    Toshio Sekimura

    Full Text Available The development of eyespots on the wing surface of butterflies of the family Nymphalidae is one of the most studied examples of biological pattern formation. However, little is known about the mechanism that determines the number and precise locations of eyespots on the wing. Eyespots develop around signaling centers, called foci, that are located equidistant from wing veins along the midline of a wing cell (an area bounded by veins). A fundamental question that remains unsolved is why a certain wing cell develops an eyespot while other wing cells do not. We illustrate that the key to understanding focus point selection may be in the venation system of the wing disc. Our main hypothesis is that changes in morphogen concentration along the proximal boundary veins of wing cells govern focus point selection. Based on previous studies, we focus on a spatially two-dimensional reaction-diffusion system model posed in the interior of each wing cell that describes the formation of focus points. Using finite element based numerical simulations, we demonstrate that variation in the proximal boundary condition is sufficient to robustly select whether an eyespot focus point forms in otherwise identical wing cells. We also illustrate that this behavior is robust to small perturbations in the parameters and geometry and moderate levels of noise. Hence, we suggest that an anterior-posterior pattern of morphogen concentration along the proximal vein may be the main determinant of the distribution of focus points on the wing surface. In order to complete our model, we propose a two-stage reaction-diffusion system model, in which a one-dimensional surface reaction-diffusion system, posed on the proximal vein, generates the morphogen concentrations that act as non-homogeneous Dirichlet (i.e., fixed) boundary conditions for the two-dimensional reaction-diffusion model posed in the wing cells. The two-stage model appears capable of generating focus point distributions

  5. Mutation-selection models of codon substitution and their use to estimate selective strengths on codon usage

    DEFF Research Database (Denmark)

    Yang, Ziheng; Nielsen, Rasmus

    2008-01-01

    Current models of codon substitution are formulated at the levels of nucleotide substitution and do not explicitly consider the separate effects of mutation and selection. They are thus incapable of inferring whether mutation or selection is responsible for evolution at silent sites. Here we...... to examine the null hypothesis that codon usage is due to mutation bias alone, not influenced by natural selection. Application of the test to the mammalian data led to rejection of the null hypothesis in most genes, suggesting that natural selection may be a driving force in the evolution of synonymous...... codon usage in mammals. Estimates of selection coefficients nevertheless suggest that selection on codon usage is weak and most mutations are nearly neutral. The sensitivity of the analysis on the assumed mutation model is discussed....
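
    For orientation, mutation-selection codon models of this kind typically scale a neutral mutation rate by a fixation factor that depends on a scaled selection coefficient; a commonly used Halpern-Bruno-style form is sketched below in generic notation, which may differ from the authors' exact parameterization.

```latex
% Substitution rate from codon i to codon j: mutation rate times a fixation factor,
% with S_{ij} the scaled selection coefficient (S_{ij} = 0 recovers the neutral rate).
q_{ij} \;=\; \mu_{ij}\,\frac{S_{ij}}{1 - e^{-S_{ij}}},
\qquad
\lim_{S_{ij}\to 0}\frac{S_{ij}}{1 - e^{-S_{ij}}} = 1 .
```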

  6. Multiphysics modeling of selective laser sintering/melting

    Science.gov (United States)

    Ganeriwala, Rishi Kumar

    A significant percentage of total global employment is due to the manufacturing industry. However, manufacturing also accounts for nearly 20% of total energy usage in the United States according to the EIA. In fact, manufacturing accounted for 90% of industrial energy consumption and 84% of industry carbon dioxide emissions in 2002. Clearly, advances in manufacturing technology and efficiency are necessary to curb emissions and help society as a whole. Additive manufacturing (AM) refers to a relatively recent group of manufacturing technologies whereby one can 3D print parts, which has the potential to significantly reduce waste, reconfigure the supply chain, and generally disrupt the whole manufacturing industry. Selective laser sintering/melting (SLS/SLM) is one type of AM technology with the distinct advantage of being able to 3D print metals and rapidly produce net shape parts with complicated geometries. In SLS/SLM parts are built up layer-by-layer out of powder particles, which are selectively sintered/melted via a laser. However, in order to produce defect-free parts of sufficient strength, the process parameters (laser power, scan speed, layer thickness, powder size, etc.) must be carefully optimized. Obviously, these process parameters will vary depending on material, part geometry, and desired final part characteristics. Running experiments to optimize these parameters is costly, energy intensive, and extremely material specific. Thus a computational model of this process would be highly valuable. In this work a three dimensional, reduced order, coupled discrete element - finite difference model is presented for simulating the deposition and subsequent laser heating of a layer of powder particles sitting on top of a substrate. Validation is provided and parameter studies are conducted showing the ability of this model to help determine appropriate process parameters and an optimal powder size distribution for a given material. Next, thermal stresses upon
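
    To give a feel for the thermal side of such models, the sketch below integrates a one-dimensional explicit finite-difference heat equation with a moving Gaussian laser source; the material properties, laser parameters and the 1D simplification are all illustrative assumptions, not the coupled 3D discrete element-finite difference model of the dissertation.

```python
import numpy as np

# Illustrative material and process parameters (not real SLS/SLM values)
alpha = 1e-5                               # thermal diffusivity, m^2/s
length, nx = 2e-3, 200
dx = length / nx
dt = 0.2 * dx**2 / alpha                   # stable explicit time step
scan_speed = 0.5                           # laser scan speed, m/s
source_peak, source_width = 2e7, 5e-5      # heating rate (K/s) and beam width (m)

x = np.linspace(0.0, length, nx)
T = np.full(nx, 300.0)                     # initial temperature, K

t, t_end = 0.0, length / scan_speed
while t < t_end:
    laser_pos = scan_speed * t
    source = source_peak * np.exp(-((x - laser_pos) / source_width) ** 2)
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T = T + dt * (alpha * lap + source)    # explicit Euler update
    T[0] = T[-1] = 300.0                   # fixed-temperature boundaries
    t += dt

print("peak temperature after one scan:", round(T.max(), 1), "K")
```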

  7. Selecting Effective Treatments: A Comprehensive, Systematic Guide to Treating Mental Disorders. Revised Edition.

    Science.gov (United States)

    Seligman, Linda

    This book presents an overview of the major types of mental disorders, accompanied by treatment models that are structured, comprehensive, grounded in research, and likely to be effective. Chapter topics are: (1) "Introduction to Effective Treatment Planning"; (2) "Mental Disorders in Infants, Children, and Adolescents"; (3) "Situationally…

  8. Patch-based generative shape model and MDL model selection for statistical analysis of archipelagos

    DEFF Research Database (Denmark)

    Ganz, Melanie; Nielsen, Mads; Brandt, Sami

    2010-01-01

    We propose a statistical generative shape model for archipelago-like structures. These kind of structures occur, for instance, in medical images, where our intention is to model the appearance and shapes of calcifications in x-ray radio graphs. The generative model is constructed by (1) learning...... a patch-based dictionary for possible shapes, (2) building up a time-homogeneous Markov model to model the neighbourhood correlations between the patches, and (3) automatic selection of the model complexity by the minimum description length principle. The generative shape model is proposed...... as a probability distribution of a binary image where the model is intended to facilitate sequential simulation. Our results show that a relatively simple model is able to generate structures visually similar to calcifications. Furthermore, we used the shape model as a shape prior in the statistical segmentation...

  9. Consistency in Estimation and Model Selection of Dynamic Panel Data Models with Fixed Effects

    Directory of Open Access Journals (Sweden)

    Guangjie Li

    2015-07-01

    Full Text Available We examine the relationship between consistent parameter estimation and model selection for autoregressive panel data models with fixed effects. We find that the transformation of fixed effects proposed by Lancaster (2002) does not necessarily lead to consistent estimation of common parameters when some true exogenous regressors are excluded. We propose a data dependent way to specify the prior of the autoregressive coefficient and argue for comparing different model specifications before parameter estimation. Model selection properties of Bayes factors and the Bayesian information criterion (BIC) are investigated. When model uncertainty is substantial, we recommend the use of Bayesian Model Averaging to obtain point estimators with lower root mean squared errors (RMSE). We also study the implications of different levels of inclusion probabilities by simulations.
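
    To make the BIC and model-averaging step concrete, the sketch below computes BIC for two candidate regression specifications and converts the values into approximate posterior model weights, assuming equal prior model probabilities; the data are simulated and the setup is generic rather than the paper's panel model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)          # x2 is irrelevant

def bic(X, y):
    model = LinearRegression().fit(X, y)
    rss = ((y - model.predict(X)) ** 2).sum()
    k = X.shape[1] + 2                            # slopes + intercept + error variance
    return n * np.log(rss / n) + k * np.log(n)

bics = np.array([bic(np.column_stack([x1]), y),           # model 1: x1 only
                 bic(np.column_stack([x1, x2]), y)])       # model 2: x1 and x2

# Approximate posterior model probabilities (equal prior odds): exp(-BIC/2), normalized
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
print("BMA weights:", w)
```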

  10. Hyperopt: a Python library for model selection and hyperparameter optimization

    Science.gov (United States)

    Bergstra, James; Komer, Brent; Eliasmith, Chris; Yamins, Dan; Cox, David D.

    2015-01-01

    Sequential model-based optimization (also known as Bayesian optimization) is one of the most efficient methods (per function evaluation) of function minimization. This efficiency makes it appropriate for optimizing the hyperparameters of machine learning algorithms that are slow to train. The Hyperopt library provides algorithms and parallelization infrastructure for performing hyperparameter optimization (model selection) in Python. This paper presents an introductory tutorial on the usage of the Hyperopt library, including the description of search spaces, minimization (in serial and parallel), and the analysis of the results collected in the course of minimization. This paper also gives an overview of Hyperopt-Sklearn, a software project that provides automatic algorithm configuration of the Scikit-learn machine learning library. Following Auto-Weka, we take the view that the choice of classifier and even the choice of preprocessing module can be taken together to represent a single large hyperparameter optimization problem. We use Hyperopt to define a search space that encompasses many standard components (e.g. SVM, RF, KNN, PCA, TFIDF) and common patterns of composing them together. We demonstrate, using search algorithms in Hyperopt and standard benchmarking data sets (MNIST, 20-newsgroups, convex shapes), that searching this space is practical and effective. In particular, we improve on best-known scores for the model space for both MNIST and convex shapes. The paper closes with some discussion of ongoing and future work.
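
    A short usage example of the library's core fmin interface described above; the search space and quadratic objective are illustrative rather than taken from the paper's benchmarks.

```python
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

# Toy objective: minimize (x - 3)^2 over a simple one-dimensional search space
def objective(params):
    x = params["x"]
    return {"loss": (x - 3.0) ** 2, "status": STATUS_OK}

space = {"x": hp.uniform("x", -10.0, 10.0)}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=100, trials=trials)
print(best)   # e.g. {'x': 2.97...}
```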

  11. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    Full Text Available The requirements engineering process starts from the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the base building block for a software project and has a very high impact on subsequent design and build phases as well. Failure to accurately capture system requirements is a major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform requirements elicitation in no less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirements. Interviewing and interacting with stakeholders during the elicitation process is a communication-intensive activity involving verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication, and this classification is used as a basis for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  12. Scaling limits of a model for selection at two scales

    Science.gov (United States)

    Luo, Shishi; Mattingly, Jonathan C.

    2017-04-01

    The dynamics of a population undergoing selection is a central topic in evolutionary biology. This question is particularly intriguing in the case where selective forces act in opposing directions at two population scales. For example, a fast-replicating virus strain outcompetes slower-replicating strains at the within-host scale. However, if the fast-replicating strain causes host morbidity and is less frequently transmitted, it can be outcompeted by slower-replicating strains at the between-host scale. Here we consider a stochastic ball-and-urn process which models this type of phenomenon. We prove the weak convergence of this process under two natural scalings. The first scaling leads to a deterministic nonlinear integro-partial differential equation on the interval [0,1] with dependence on a single parameter, λ. We show that the fixed points of this differential equation are Beta distributions and that their stability depends on λ and the behavior of the initial data around 1. The second scaling leads to a measure-valued Fleming–Viot process, an infinite dimensional stochastic process that is frequently associated with a population genetics.

  13. Model catalysis by size-selected cluster deposition

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Scott [Univ. of Utah, Salt Lake City, UT (United States)

    2015-11-20

    This report summarizes the accomplishments during the last four years of the subject grant. Results are presented for experiments in which size-selected model catalysts were studied under surface science and aqueous electrochemical conditions. Strong effects of cluster size were found, and by correlating the size effects with size-dependent physical properties of the samples measured by surface science methods, it was possible to deduce mechanistic insights, such as the factors that control the rate-limiting step in the reactions. Results are presented for CO oxidation, CO binding energetics and geometries, and electronic effects under surface science conditions, and for the electrochemical oxygen reduction reaction, ethanol oxidation reaction, and for oxidation of carbon by water.

  14. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models

    Science.gov (United States)

    Rieger, TR; Musante, CJ

    2016-01-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed “virtual patients.” In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations. PMID:27069777
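
    A rough sketch of the generate-then-select idea: plausible parameter sets ("virtual patients") are sampled, the model is run, and only those whose outputs fall inside the observed clinical range are kept; the one-line model and the target interval are hypothetical stand-ins for a real QSP model and cohort statistics, and the selection rule is simplified relative to the paper's method.

```python
import numpy as np

rng = np.random.default_rng(6)

def model_output(params: np.ndarray) -> np.ndarray:
    """Placeholder QSP model: maps parameter vectors to a scalar biomarker."""
    return params[:, 0] * 2.0 + params[:, 1] ** 2

# Step 1: generate candidate virtual patients by sampling plausible parameters
candidates = rng.uniform(low=[0.0, 0.0], high=[5.0, 3.0], size=(10_000, 2))
biomarker = model_output(candidates)

# Step 2: keep only virtual patients whose outputs match the observed range
observed_lo, observed_hi = 4.0, 8.0          # hypothetical clinical interval
virtual_population = candidates[(biomarker >= observed_lo) & (biomarker <= observed_hi)]
print(virtual_population.shape[0], "virtual patients selected")
```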

  15. The sudden vector projection model for reactivity: mode specificity and bond selectivity made simple.

    Science.gov (United States)

    Guo, Hua; Jiang, Bin

    2014-12-16

    CONSPECTUS: Mode specificity is defined by the differences in reactivity due to excitations in various reactant modes, while bond selectivity refers to selective bond breaking in a reaction. These phenomena not only shed light on reaction dynamics but also open the door for laser control of reactions. The existence of mode specificity and bond selectivity in a reaction indicates that not all forms of energy are equivalent in promoting the reactivity, thus defying a statistical treatment. They also allow the enhancement of reactivity and control product branching ratio. As a result, they are of central importance in chemistry. This Account discusses recent advances in our understanding of these nonstatistical phenomena. In particular, the newly proposed sudden vector projection (SVP) model and its applications are reviewed. The SVP model is based on the premise that the collision in many direct reactions is much faster than intramolecular vibrational energy redistribution in the reactants. In such a sudden limit, the coupling of a reactant mode with the reaction coordinate at the transition state, which dictates its ability to promote the reaction, is approximately quantified by the projection of the former onto the latter. The SVP model can be considered as a generalization of the venerable Polanyi's rules, which are based on the location of the barrier. The SVP model is instead based on properties of the saddle point and as a result capable of treating the translational, rotational, and multiple vibrational modes in reactions involving polyatomic reactants. In case of surface reactions, the involvement of surface atoms can also be examined. Taking advantage of microscopic reversibility, the SVP model has also been used to predict product energy disposal in reactions. This simple yet powerful rule of thumb has been successfully demonstrated in many reactions including uni- and bimolecular reactions in the gas phase and gas-surface reactions. The success of the SVP
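
    The central quantity in the SVP model, the projection of a reactant normal-mode vector onto the reaction-coordinate vector at the transition state, reduces to a normalized dot product; the sketch below performs that arithmetic on made-up vectors, whereas real applications would take them from normal-mode analyses.

```python
import numpy as np

def svp_coupling(mode_vec: np.ndarray, rc_vec: np.ndarray) -> float:
    """Projection of a reactant mode onto the reaction coordinate (0 = no promotion)."""
    q_i = mode_vec / np.linalg.norm(mode_vec)
    q_rc = rc_vec / np.linalg.norm(rc_vec)
    return float(abs(np.dot(q_i, q_rc)))

rng = np.random.default_rng(7)
dim = 9                                     # 3N Cartesian components for N = 3 atoms
reaction_coordinate = rng.normal(size=dim)  # imaginary-frequency mode at the saddle point
stretch_mode = rng.normal(size=dim)         # a reactant vibrational mode (made-up)
translation_mode = rng.normal(size=dim)     # relative translational mode (made-up)

for name, vec in [("stretch", stretch_mode), ("translation", translation_mode)]:
    print(name, "SVP projection:", round(svp_coupling(vec, reaction_coordinate), 3))
```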

  16. Enhanced selection of high affinity DNA-reactive B cells following cyclophosphamide treatment in mice.

    Directory of Open Access Journals (Sweden)

    Daisuke Kawabata

    Full Text Available A major goal for the treatment of patients with systemic lupus erythematosus with cytotoxic therapies is the induction of long-term remission. There is, however, a paucity of information concerning the effects of these therapies on the reconstituting B cell repertoire. Since there is recent evidence suggesting that B cell lymphopenia might attenuate negative selection of autoreactive B cells, we elected to investigate the effects of cyclophosphamide on the selection of the re-emerging B cell repertoire in wild type mice and transgenic mice that express the H chain of an anti-DNA antibody. The reconstituting B cell repertoire in wild type mice contained an increased frequency of DNA-reactive B cells; in heavy chain transgenic mice, the reconstituting repertoire was characterized by an increased frequency of mature, high affinity DNA-reactive B cells and the mice expressed increased levels of serum anti-DNA antibodies. This coincided with a significant increase in serum levels of BAFF. Treatment of transgene-expressing mice with a BAFF blocking agent or with DNase to reduce exposure to autoantigen limited the expansion of high affinity DNA-reactive B cells during B cell reconstitution. These studies suggest that during B cell reconstitution, not only is negative selection of high affinity DNA-reactive B cells impaired by increased BAFF, but also that B cells escaping negative selection are positively selected by autoantigen. There are significant implications for therapy.

  17. Agent-Based vs. Equation-Based Epidemiological Models: A Model Selection Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas R [ORNL; Nutaro, James J [ORNL

    2012-01-01

    This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm over another. We conclude with a discussion of our experience and document future ideas for a model validation framework.
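
    For reference on the equation-based side of the comparison, a compartmental SIR model can be integrated in a few lines; the parameter values below are illustrative and are not the 1918 Spanish flu calibration used in the paper.

```python
import numpy as np
from scipy.integrate import odeint

def sir(state, t, beta, gamma):
    s, i, r = state
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return [ds, di, dr]

beta, gamma = 0.4, 0.1                 # illustrative transmission / recovery rates
t = np.linspace(0, 160, 161)
initial = [0.999, 0.001, 0.0]          # fractions: susceptible, infected, recovered
trajectory = odeint(sir, initial, t, args=(beta, gamma))
print("peak infected fraction:", trajectory[:, 1].max())
```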

  18. Acute and chronic effects of selective serotonin reuptake inhibitor treatment on fear conditioning: implications for underlying fear circuits.

    Science.gov (United States)

    Burghardt, N S; Bauer, E P

    2013-09-01

    Selective serotonin reuptake inhibitors (SSRIs) are widely used for the treatment of a spectrum of anxiety disorders, yet paradoxically they may increase symptoms of anxiety when treatment is first initiated. Despite extensive research over the past 30 years focused on SSRI treatment, the precise mechanisms by which SSRIs exert these opposing acute and chronic effects on anxiety remain unknown. By testing the behavioral effects of SSRI treatment on Pavlovian fear conditioning, a well characterized model of emotional learning, we have the opportunity to identify how SSRIs affect the functioning of specific brain regions, including the amygdala, bed nucleus of the stria terminalis (BNST) and hippocampus. In this review, we first define different stages of learning involved in cued and context fear conditioning and describe the neural circuits underlying these processes. We examine the results of numerous rodent studies investigating how acute SSRI treatment modulates fear learning and relate these effects to the known functions of serotonin in specific brain regions. With these findings, we propose a model by which acute SSRI administration, by altering neural activity in the extended amygdala and hippocampus, enhances both acquisition and expression of cued fear conditioning, but impairs the expression of contextual fear conditioning. Finally, we review the literature examining the effects of chronic SSRI treatment on fear conditioning in rodents and describe how downregulation of N-methyl-d-aspartate (NMDA) receptors in the amygdala and hippocampus may mediate the impairments in fear learning and memory that are reported. While long-term SSRI treatment effectively reduces symptoms of anxiety, their disruptive effects on fear learning should be kept in mind when combining chronic SSRI treatment and learning-based therapies, such as cognitive behavioral therapy.

  19. Ethical issues in selecting patients for treatment with clozapine: a commentary.

    Science.gov (United States)

    Eichelman, B; Hartwig, A

    1990-08-01

    Three ethical constructs of distributive justice--utilitarianism, Marxism, and the theories of John Rawls--are applied to selection of patients for treatment with clozapine. Elements of an ethical selection process include a means of monitoring the clinical effectiveness of the drug so that it is not wasted and procedures for ensuring that patients' rights to advocacy and due process are met. The authors suggest that a disproportionate number of patients with tardive dyskinesia may receive clozapine because clinicians and hospitals risk litigation if these patients continue to receive standard neuroleptics and experience worsening side effects.

  20. Therapeutic potential of functional selectivity in the treatment of heart failure

    DEFF Research Database (Denmark)

    Christensen, Gitte Lund; Aplin, Mark; Hansen, Jakob Lerche

    2010-01-01

    Adrenergic and angiotensin receptors are prominent targets in pharmacological alleviation of cardiac remodeling and heart failure, but their use is associated with cardiodepressant side effects. Recent advances in our understanding of seven transmembrane receptor signaling show that it is possible...... to design ligands with "functional selectivity," acting as agonists on certain signaling pathways while antagonizing others. This represents a major pharmaceutical opportunity to separate desired from adverse effects governed by the same receptor. Accordingly, functionally selective ligands are currently...... pursued as next-generation drugs for superior treatment of heart failure....

  1. Selective source blocking for Gamma Knife radiosurgery of trigeminal neuralgia based on analytical dose modelling

    Science.gov (United States)

    Li, Kaile; Ma, Lijun

    2004-08-01

    We have developed an automatic critical region shielding (ACRS) algorithm for Gamma Knife radiosurgery of trigeminal neuralgia. The algorithm selectively blocks 201 Gamma Knife sources to minimize the dose to the brainstem while irradiating the root entry area of the trigeminal nerve with 70-90 Gy. An independent dose model was developed to implement the algorithm. The accuracy of the dose model was tested and validated via comparison with the Leksell GammaPlan (LGP) calculations. Agreements of 3% or 3 mm in isodose distributions were found for both single-shot and multiple-shot treatment plans. After the optimized blocking patterns are obtained via the independent dose model, they are imported into the LGP for final dose calculations and treatment planning analyses. We found that the use of a moderate number of source plugs (30-50 plugs) significantly lowered (~40%) the dose to the brainstem for trigeminal neuralgia treatments. Considering the small effort involved in using these plugs, we recommend source blocking for all trigeminal neuralgia treatments with Gamma Knife radiosurgery.
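
    The plug-selection idea can be caricatured as a greedy search over per-source dose contributions: repeatedly block the source with the worst brainstem-to-target dose ratio until a dose goal is met; the dose vectors below are random placeholders and the greedy rule is our illustrative stand-in, not the published ACRS algorithm.

```python
import numpy as np

rng = np.random.default_rng(8)
n_sources = 201
target_dose = rng.uniform(0.8, 1.2, n_sources)     # per-source dose to the nerve target
brainstem_dose = rng.uniform(0.0, 0.5, n_sources)  # per-source dose to the brainstem

active = np.ones(n_sources, dtype=bool)
max_blocked = 50
for _ in range(max_blocked):
    if brainstem_dose[active].sum() <= 0.6 * brainstem_dose.sum():
        break  # brainstem dose reduced enough; stop blocking
    # Block the active source with the worst brainstem-to-target dose ratio
    ratio = np.where(active, brainstem_dose / target_dose, -np.inf)
    active[np.argmax(ratio)] = False

print("sources blocked:", (~active).sum())
print("fraction of brainstem dose retained:",
      brainstem_dose[active].sum() / brainstem_dose.sum())
```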

  2. Modeling Vertical Flow Treatment Wetland Hydraulics to Optimize Treatment Efficiency

    Science.gov (United States)

    2011-03-24

  3. Model selection by LASSO methods in a change-point model

    CERN Document Server

    Ciuperca, Gabriela

    2011-01-01

    The paper considers a linear regression model with multiple change-points occurring at unknown times. The LASSO technique is very interesting since it allows the parametric estimation, including the change-points, and automatic variable selection simultaneously. The asymptotic properties of the LASSO-type (which has as particular case the LASSO estimator) and of the adaptive LASSO estimators are studied. For this last estimator the oracle properties are proved. In both cases, a model selection criterion is proposed. Numerical examples are provided showing the performances of the adaptive LASSO estimator compared to the LS estimator.
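
    One common way to cast change-point detection as a LASSO problem is to regress the series on step-function basis columns so that each nonzero coefficient marks a jump; the sketch below applies scikit-learn's plain Lasso to simulated data with two true change-points, and is a generic formulation rather than the paper's adaptive estimator.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(9)
n = 300
signal = np.concatenate([np.zeros(100), 2.0 * np.ones(100), 0.5 * np.ones(100)])
y = signal + rng.normal(0, 0.3, n)

# Design matrix: column j is a step that turns on at time j (captures a jump at j)
X = np.tril(np.ones((n, n)))[:, 1:]             # drop the all-ones column (intercept)

lasso = Lasso(alpha=0.05, fit_intercept=True, max_iter=50_000).fit(X, y)
change_points = np.flatnonzero(np.abs(lasso.coef_) > 1e-6) + 1
print("estimated change-points near:", change_points)   # expect values near 100 and 200
```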

  4. Chain-Wise Generalization of Road Networks Using Model Selection

    Science.gov (United States)

    Bulatov, D.; Wenzel, S.; Häufel, G.; Meidow, J.

    2017-05-01

    Streets are essential entities of urban terrain and their automatized extraction from airborne sensor data is cumbersome because of a complex interplay of geometric, topological and semantic aspects. Given a binary image, representing the road class, centerlines of road segments are extracted by means of skeletonization. The focus of this paper lies in a well-reasoned representation of these segments by means of geometric primitives, such as straight line segments as well as circle and ellipse arcs. We propose the fusion of raw segments based on similarity criteria; the output of this process are the so-called chains which better match to the intuitive perception of what a street is. Further, we propose a two-step approach for chain-wise generalization. First, the chain is pre-segmented using circlePeucker and finally, model selection is used to decide whether two neighboring segments should be fused to a new geometric entity. Thereby, we consider both variance-covariance analysis of residuals and model complexity. The results on a complex data-set with many traffic roundabouts indicate the benefits of the proposed procedure.
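
    The fuse-or-keep decision between neighboring segments can be illustrated with a BIC-style comparison between fitting the points with one straight line or with two; the point sets and the use of BIC are a simplified stand-in for the paper's variance-covariance-based criterion.

```python
import numpy as np

def line_fit_rss(pts: np.ndarray) -> float:
    """Residual sum of squares of an ordinary least-squares line fit y = a*x + b."""
    x, y = pts[:, 0], pts[:, 1]
    a, b = np.polyfit(x, y, deg=1)
    return float(((y - (a * x + b)) ** 2).sum())

def bic(rss: float, n: int, k: int) -> float:
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(10)
x1, x2 = np.linspace(0, 5, 30), np.linspace(5, 10, 30)
seg1 = np.column_stack([x1, 1.00 * x1]) + rng.normal(0, 0.05, (30, 2))
seg2 = np.column_stack([x2, 5 + 1.05 * (x2 - 5)]) + rng.normal(0, 0.05, (30, 2))
both = np.vstack([seg1, seg2])

bic_merged = bic(line_fit_rss(both), len(both), k=2)                      # one line
bic_split = bic(line_fit_rss(seg1) + line_fit_rss(seg2), len(both), k=4)  # two lines
print("fuse segments" if bic_merged <= bic_split else "keep segments separate")
```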

  5. A simple model of group selection that cannot be analyzed with inclusive fitness

    NARCIS (Netherlands)

    M. van Veelen; S. Luo; B. Simon

    2014-01-01

    A widespread claim in evolutionary theory is that every group selection model can be recast in terms of inclusive fitness. Although there are interesting classes of group selection models for which this is possible, we show that it is not true in general. With a simple set of group selection models,

  6. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    Science.gov (United States)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties; the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate these uncertainties through the model so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of an HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model. We employ this simple heat model to illustrate verification

  8. A Selective Mutism Arising from First Language Attrition, Successfully Treated with Paroxetine-CBT Combination Treatment.

    Science.gov (United States)

    Serra, Agostino; Di Mauro, Paola; Andaloro, Claudio; Maiolino, Luigi; Pavone, Piero; Cocuzza, Salvatore

    2015-10-01

    After immersion in a foreign language, speakers often have difficulty retrieving native-language words and may experience a decrease in native-language proficiency; in its non-pathological form, this phenomenon is known as first language attrition. Self-perceived low native-language proficiency, together with apprehension when speaking is expected, may sometimes lead to a state of social anxiety and, in extreme forms, to the withholding of speech as a primitive tool for self-protection, linking such cases to selective mutism. We report an unusual case of selective mutism arising from first language attrition in an Italian girl after attending a two-year "German language school", who successfully responded to a paroxetine-cognitive behavioral therapy (CBT) combination treatment.

  9. Hastening death by selective disclosure of treatment options--beneficence or "euthanasia by deception"?

    Science.gov (United States)

    Loewy, Roberta Springer

    2004-09-01

    In this paper I make a radical claim regarding selective non-disclosure of treatment options that have some hope of prolonging a patient's life. I suggest that selective non-disclosure under such circumstances is tantamount to what might be called "euthanasia by deception." I offer a case to test the validity of my claim and to demonstrate how the failure to offer or, at least, to discuss renal dialysis in this case (and, by inference, any other form of treatment which has some hope of prolonging a patient's life) qualifies as paternalism in its most egregious form. I discuss the actions of the health care team and try to find some plausible reasons why they acted as they did. I conclude that there must be greater emphasis placed on teaching clinicians how better to incorporate frank, open and on-going discussion about the central elements of the therapeutic relationship with patients long before they lose decisional capacity.

  10. Empirical evaluation of scoring functions for Bayesian network model selection.

    Science.gov (United States)

    Liu, Zhifa; Malone, Brandon; Yuan, Changhe

    2012-01-01

    In this work, we empirically evaluate the capability of various scoring functions for Bayesian networks to recover true underlying structures. Similar investigations have been carried out before, but they typically relied on approximate learning algorithms to learn the network structures. The suboptimal structures found by the approximation methods have unknown quality and may affect the reliability of their conclusions. Our study uses an optimal algorithm to learn Bayesian network structures from datasets generated from a set of gold-standard Bayesian networks. Because optimal algorithms always learn equivalent networks, this ensures that only the choice of scoring function affects the learned networks. Another shortcoming of the previous studies stems from their use of random synthetic networks as test cases; there is no guarantee that these networks reflect real-world data. We use real-world data to generate our gold-standard structures, so our experimental design more closely approximates real-world situations. A major finding of our study suggests that, in contrast to results reported by several prior works, the Minimum Description Length (MDL) score (or equivalently, the Bayesian information criterion, BIC) consistently outperforms other scoring functions such as Akaike's information criterion (AIC), the Bayesian Dirichlet equivalence score (BDeu), and the factorized normalized maximum likelihood (fNML) in recovering the underlying Bayesian network structures. We believe this finding results from using both datasets generated from real-world applications rather than from random processes, as in previous studies, and learning algorithms that select high-scoring structures rather than random models. Other findings of our study support existing work, e.g., large sample sizes result in learning structures closer to the true underlying structure; the BDeu score is sensitive to the parameter settings; and fNML performs fairly well on small datasets. We also
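
    For illustration, a minimal version of the kind of decomposable score compared in this study: the BIC/MDL family score of one node given a candidate parent set, computed from contingency counts (a network's score is the sum of its family scores). The toy data and variable names are hypothetical, and this is not the study's implementation.

```python
import numpy as np
import pandas as pd

def bic_family_score(data, child, parents):
    """BIC/MDL score of one node given a candidate parent set (discrete data):
    maximized log-likelihood minus (log N / 2) * number of free parameters."""
    n = len(data)
    r = data[child].nunique()
    if parents:
        table = pd.crosstab([data[p] for p in parents], data[child]).to_numpy(float)
    else:
        table = data[child].value_counts().to_numpy(float)[None, :]
    q = table.shape[0]                              # observed parent configurations
    totals = table.sum(axis=1, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        loglik = np.nansum(table * np.log(table / totals))
    return loglik - 0.5 * np.log(n) * q * (r - 1)

# Toy data: B depends on A (with 10% noise), C is independent noise.
rng = np.random.default_rng(2)
a = rng.integers(0, 2, 1000)
b = (a ^ (rng.random(1000) < 0.1)).astype(int)
df = pd.DataFrame({"A": a, "B": b, "C": rng.integers(0, 2, 1000)})

# The parent set {A} should score best for node B.
print(bic_family_score(df, "B", ["A"]),
      bic_family_score(df, "B", ["C"]),
      bic_family_score(df, "B", []))
```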

  11. Validation of the FAMACHA© method for selective anthelmintic treatment in dairy goat herds

    OpenAIRE

    Zárate Rendón, Daniel; Laboratorio de Parasitología, Departamento Académico de Nutrición, Facultad de Zootecnia, Universidad Nacional Agraria La Molina, Lima; Rojas Flores, Julio; Laboratorio de Parasitología, Departamento Académico de Nutrición, Facultad de Zootecnia, Universidad Nacional Agraria La Molina, Lima; Segura Hong, Alan; Laboratorio de Parasitología, Departamento Académico de Nutrición, Facultad de Zootecnia, Universidad Nacional Agraria La Molina, Lima

    2017-01-01

    A study in goats was carried out in the central coast of Peru to validate the FAMACHA© method for selective anthelmintic treatment in dairy goat herds. Blood and fecal samples were taken from 120 adult goats in five dairy goat farms. The micro haematocrit and the McMaster techniques plus faecal culture were used to evaluate the haematocrit, faecal nematode egg count (EPG) and to identify major nematode species, respectively. Spearman correlation coefficients were obtained. Two FAMACHA© criter...

  12. Percutaneous treatment of patients with heart diseases: selection, guidance and follow-up. A review

    Directory of Open Access Journals (Sweden)

    Contaldi Carla

    2012-03-01

    Aortic stenosis and mitral regurgitation, patent foramen ovale, interatrial septal defect, atrial fibrillation, and perivalvular leak are now amenable to percutaneous treatment. These percutaneous procedures require the use of transthoracic (TTE), transesophageal (TEE), and/or intracardiac echocardiography (ICE). This paper provides an overview of the different percutaneous interventions, offering a systematic and comprehensive approach for the selection, guidance, and follow-up of patients undergoing these procedures and illustrating the key role of 2D echocardiography.

  13. Self-organized liquid-crystalline nanostructured membranes for water treatment: selective permeation of ions.

    Science.gov (United States)

    Henmi, Masahiro; Nakatsuji, Koji; Ichikawa, Takahiro; Tomioka, Hiroki; Sakamoto, Takeshi; Yoshio, Masafumi; Kato, Takashi

    2012-05-02

    A membrane with ordered 3D ionic nanochannels constructed by in situ photopolymerization of a thermotropic liquid-crystalline monomer shows high filtration performance and ion selectivity. The nanostructured membrane exhibits water-treatment performance superior to that of an amorphous membrane prepared from the isotropic melt of the monomer. Self-organized nanostructured membranes have great potential for supplying high-quality water. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. The occurrence and removal of selected fluoroquinolones in urban drinking water treatment plants.

    Science.gov (United States)

    Xu, Yongpeng; Chen, Ting; Wang, Yuan; Tao, Hui; Liu, Shiyao; Shi, Wenxin

    2015-12-01

    Fluoroquinolones (FQs) are a widely prescribed group of antibiotics. They enter the aqueous environment, where they are frequently detected and can pose a threat to human health. Drinking water treatment plants (DWTPs) play a key role in removing FQs from potable water. This study investigated the occurrence and removal of four selected FQs (norfloxacin (NOR), ciprofloxacin (CIP), enrofloxacin (ENR), and ofloxacin (OFL)) in three urban DWTPs in China. The treatment efficacy of each system was simultaneously evaluated. Two of the examined DWTPs used conventional treatment processes. The third used conventional processes followed by additional treatment processes (ozonation-biologically activated carbon (ozonation-BAC) and membrane technology). The average concentrations of the four FQs in the source water and the finished water ranged from 51 to 248 ng/L and from removal of FQs. In contrast, the addition of advanced treatment processes, such as ozonation-BAC and membranes, substantially improved the removal of FQs. The findings of this study have important implications: even though coagulation-sedimentation and chlorination treatment processes can remove most target FQs, advanced treatment processes are necessary for further removal.

  15. Using nonlinear models in fMRI data analysis: model selection and activation detection.

    Science.gov (United States)

    Deneux, Thomas; Faugeras, Olivier

    2006-10-01

    There is an increasing interest in using physiologically plausible models in fMRI analysis. These models raise new mathematical problems in terms of parameter estimation and interpretation of the measured data. In this paper, we show how to use physiological models to map and analyze brain activity from fMRI data. We describe a maximum likelihood parameter estimation algorithm and a statistical test that allow two things: selecting the most statistically significant hemodynamic model for the measured data, and deriving activation maps based on that model. Furthermore, as parameter estimation may leave considerable uncertainty about the exact values of parameters, model identifiability characterization is a particular focus of our work. We applied these methods to different variations of the Balloon Model (Buxton, R.B., Wang, E.C., and Frank, L.R. 1998. Dynamics of blood flow and oxygenation changes during brain activation: the balloon model. Magn. Reson. Med. 39: 855-864; Buxton, R.B., Uludağ, K., Dubowitz, D.J., and Liu, T.T. 2004. Modelling the hemodynamic response to brain activation. NeuroImage 23: 220-233; Friston, K.J., Mechelli, A., Turner, R., and Price, C.J. 2000. Nonlinear responses in fMRI: the balloon model, Volterra kernels, and other hemodynamics. NeuroImage 12: 466-477) in a visual perception checkerboard experiment. Our model selection showed that hemodynamic models explain the BOLD response better than linear convolution, in particular because they are able to capture features such as the poststimulus undershoot or nonlinear effects. On the other hand, nonlinear and linear models are comparable when signals get noisier, which explains why activation maps obtained in both frameworks are comparable. The tools we have developed show that statistical inference methods used in the framework of the General Linear Model can be generalized to nonlinear models.

  16. Isolation of cells for selective treatment and analysis using a magnetic microfluidic chip

    KAUST Repository

    Yassine, O.

    2014-05-01

    This study describes the development and testing of a magnetic microfluidic chip (MMC) for trapping and isolating cells tagged with superparamagnetic beads (SPBs) in a microfluidic environment for selective treatment and analysis. The trapping and isolation are done in two separate steps: first, the trapping of the tagged cells in a main channel is achieved by soft ferromagnetic disks; second, the transportation of the cells into side chambers for isolation is executed by tapered conductive paths made of gold (Au). Numerical simulations were performed to analyze the magnetic flux and force distributions of the disks and conducting paths for trapping and transporting SPBs. The MMC was fabricated using standard microfabrication processes. Experiments were performed with E. coli (K-12 strain) tagged with 2.8 μm SPBs. The results showed that E. coli can be separated from a sample solution by trapping the cells at the disk sites and then isolated into chambers by transporting them along the tapered conducting paths. Once the E. coli was trapped inside the side chambers, two selective treatments were performed: in one chamber, a solution with minimal nutritional content was added, and in another chamber, a solution with essential nutrients was added. The results showed that the growth of the bacteria cultured in the second chamber, containing nutrients, was significantly higher, demonstrating that the E. coli was not affected by the magnetically driven transportation and that different treatments can be performed on selectively isolated cells on a single microfluidic platform.

  17. Effects of Parceling on Model Selection: Parcel-Allocation Variability in Model Ranking.

    Science.gov (United States)

    Sterba, Sonya K; Rights, Jason D

    2016-01-25

    Research interest often lies in comparing structural model specifications implying different relationships among latent factors. In this context parceling is commonly accepted, assuming the item-level measurement structure is well known and, conservatively, assuming items are unidimensional in the population. Under these assumptions, researchers compare competing structural models, each specified using the same parcel-level measurement model. However, little is known about consequences of parceling for model selection in this context-including whether and when model ranking could vary across alternative item-to-parcel allocations within-sample. This article first provides a theoretical framework that predicts the occurrence of parcel-allocation variability (PAV) in model selection index values and its consequences for PAV in ranking of competing structural models. These predictions are then investigated via simulation. We show that conditions known to manifest PAV in absolute fit of a single model may or may not manifest PAV in model ranking. Thus, one cannot assume that low PAV in absolute fit implies a lack of PAV in ranking, and vice versa. PAV in ranking is shown to occur under a variety of conditions, including large samples. To provide an empirically supported strategy for selecting a model when PAV in ranking exists, we draw on relationships between structural model rankings in parcel- versus item-solutions. This strategy employs the across-allocation modal ranking. We developed software tools for implementing this strategy in practice, and illustrate them with an example. Even if a researcher has substantive reason to prefer one particular allocation, investigating PAV in ranking within-sample still provides an informative sensitivity analysis.

  18. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence.

    Science.gov (United States)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
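
    An illustrative toy example (not the hydrological models of the study) contrasting two of the approaches mentioned: brute-force Monte Carlo integration of the Bayesian model evidence against the BIC approximation, for polynomial regression models with an assumed Gaussian prior and known noise level.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 30)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=x.size)   # data are truly linear
sigma = 0.3                                               # assumed known noise level

def design(degree):
    return np.vander(x, degree + 1, increasing=True)      # columns 1, x, x^2, ...

def log_bme_monte_carlo(degree, n_draws=100_000):
    """Brute-force BME: average the likelihood over draws from a N(0, 3^2) prior."""
    theta = rng.normal(0.0, 3.0, size=(n_draws, degree + 1))
    means = theta @ design(degree).T                       # (n_draws, n_data)
    logl = stats.norm.logpdf(y, means, sigma).sum(axis=1)
    return np.logaddexp.reduce(logl) - np.log(n_draws)

def bic(degree):
    X = design(degree)
    theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    logl = stats.norm.logpdf(y, X @ theta_hat, sigma).sum()
    return -2.0 * logl + (degree + 1) * np.log(x.size)

for d in (1, 3):
    print(f"degree {d}: log BME approx {log_bme_monte_carlo(d):.1f}, BIC = {bic(d):.1f}")
```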

  20. A selection model for longitudinal binary responses subject to non-ignorable attrition.

    Science.gov (United States)

    Alfò, Marco; Maruotti, Antonello

    2009-08-30

    Longitudinal studies collect information on a sample of individuals that is followed over time to analyze the effects of individual and time-dependent characteristics on the observed response. These studies often suffer from attrition: individuals drop out of the study before its completion and thus present incomplete data records. When the missing-data mechanism, once conditioned on other (observed) variables, does not depend on current (possibly unobserved) values of the response variable, the dropout mechanism is said to be ignorable. We propose a selection model extending semiparametric variance component models for longitudinal binary responses to allow for dependence between the missing-data mechanism and the primary response process. The model is applied to a data set from a methadone maintenance treatment programme conducted in Sydney in 1986.

  1. Hydraulic modelling of drinking water treatment plant operations

    OpenAIRE

    L. C. Rietveld; Borger, K.J.; Van Schagen, K.M.; Mesman, G.A.M.; G. I. M. Worm

    2008-01-01

    For a drinking water treatment plant simulation, water quality models, a hydraulic model, a process-control model, an object model, data management, training and decision-support features and a graphic user interface have been integrated. The integration of a hydraulic model in the simulator is necessary to correctly determine the division of flows over the plant's lanes and, thus, the flow through the individual treatment units, based on valve positions and pump speeds. The flow through a un...

  2. Modeling Penicillium expansum resistance to thermal and chlorine treatments.

    Science.gov (United States)

    Salomão, Beatriz C M; Churey, John J; Aragão, Gláucia M F; Worobo, Randy W

    2009-12-01

    Apples and apple products are excellent substrates for Penicillium expansum to produce patulin. In an attempt to avoid excessive levels of patulin, limiting or reducing P. expansum contamination levels on apples designated for storage in packinghouses and/or during apple juice processing is critical. The aim of this work was (i) to determine the thermal resistance of P. expansum spores in apple juice, comparing the abilities of the Bigelow and Weibull models to describe the survival curves and (ii) to determine the inactivation of P. expansum spores in aqueous chlorine solutions at varying concentrations of chlorine solutions, comparing the abilities of the biphasic and Weibull models to fit the survival curves. The results showed that the Bigelow and Weibull models were similar for describing the heat inactivation data, because the survival curves were almost linear. In this case, the concept of D- and z-values could be used, and the D-values obtained were 10.68, 6.64, 3.32, 1.14, and 0.61 min at 50, 52, 54, 56, and 60 °C, respectively, while the z-value was determined to be 7.57 °C. For the chlorine treatments, although the biphasic model gave a slightly superior performance, the Weibull model was selected, considering the parsimony principle, because it has fewer parameters than the biphasic model has. In conclusion, the typical pasteurization regimen used for refrigerated apple juice (71 °C for 6 s) is capable of achieving a 6-log reduction of P. expansum spores.
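
    For illustration, a sketch of fitting the two survival models named here (the log-linear Bigelow model and the Weibull model) to thermal-inactivation data and reading off the D-value; the survival values below are invented for the example and are not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented log10 survival ratios of spores during heating at one temperature
# (time in minutes versus log10(N/N0)); values are illustrative only.
t = np.array([0, 1, 2, 4, 6, 8, 10], dtype=float)
logS = np.array([0.0, -0.8, -1.7, -3.4, -5.2, -6.9, -8.5])

def log_linear(t, D):            # Bigelow model: straight line with slope -1/D
    return -t / D

def weibull(t, delta, p):        # Weibull model: log10 S = -(t/delta)^p
    return -((t / delta) ** p)

(D_hat,), _ = curve_fit(log_linear, t, logS, p0=[2.0])
(delta_hat, p_hat), _ = curve_fit(weibull, t, logS, p0=[1.0, 1.0],
                                  bounds=(1e-6, np.inf))

rss_lin = np.sum((logS - log_linear(t, D_hat)) ** 2)
rss_wei = np.sum((logS - weibull(t, delta_hat, p_hat)) ** 2)
print(f"Bigelow: D = {D_hat:.2f} min, RSS = {rss_lin:.3f}")
print(f"Weibull: delta = {delta_hat:.2f}, p = {p_hat:.2f}, RSS = {rss_wei:.3f}")
```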

  3. Surgical treatment of spasticity by selective posterior rhizotomy: 30 years experience.

    Science.gov (United States)

    Salame, Khalil; Ouaknine, Georges E R; Rochkind, Semion; Constantini, Shlomo; Razon, Nissim

    2003-08-01

    Spasticity is a common neurologic disorder with adverse effects on the patient's function. Conservative management is unsuccessful in a significant proportion of patients, and neurosurgical intervention should then be considered. The mainstay of surgical treatment of spasticity is selective posterior rhizotomy (SPR), i.e., section of the sensory nerve roots of the cauda equina. The objective of this study was to report our experience with SPR in the treatment of spasticity. We retrospectively reviewed 154 patients who underwent SPR over a 30-year period. The indication for surgery was spasticity that significantly hindered the patient's function or care and was resistant to conservative treatment. All patients were evaluated for spasticity in the lower and upper limbs, the presence or absence of painful spasms, and sphincter disturbances. The decision as to which roots to section, and to what extent, was based mainly on clinical muscle testing. Reduction of spasticity in the lower limbs was obtained in every case, with improvement in movements in 86% of cases. Painful spasms were alleviated in 80% of cases. Amelioration of neurogenic bladder was observed in 42%. A minority of the patients also showed improvement in speech and cognitive performance. There was no perioperative mortality and there were no major complications. SPR is a safe and effective method for the treatment of spasticity with long-lasting beneficial effects. We suggest that this method be considered more frequently for patients with spasticity that interferes with their quality of life.

  4. Continuous time limits of the Utterance Selection Model

    CERN Document Server

    Michaud, Jérôme

    2016-01-01

    In this paper, we derive new continuous time limits of the Utterance Selection Model (USM) for language change (Baxter et al., Phys. Rev. E 73, 046118, 2006). This is motivated by the fact that the Fokker-Planck continuous time limit derived in the original version of the USM is only valid for a small range of parameters. We investigate the consequences of relaxing these constraints on parameters. Using the normal approximation of the multinomial approximation, we derive a new continuous time limit of the USM in the form of a weak-noise stochastic differential equation. We argue that this weak noise, not captured by the Kramers-Moyal expansion, cannot be neglected. We then propose a coarse-graining procedure, which takes the form of a stochastic version of the heterogeneous mean field approximation. This approximation groups the behaviour of nodes of the same degree, reducing the complexity of the problem. With the help of this approximation, we study in detail two simple families of networks:...

  5. Estimating seabed scattering mechanisms via Bayesian model selection.

    Science.gov (United States)

    Steininger, Gavin; Dosso, Stan E; Holland, Charles W; Dettmer, Jan

    2014-10-01

    A quantitative inversion procedure is developed and applied to determine the dominant scattering mechanism (surface roughness and/or volume scattering) from seabed scattering-strength data. The classification system is based on trans-dimensional Bayesian inversion with the deviance information criterion used to select the dominant scattering mechanism. Scattering is modeled using first-order perturbation theory as due to one of three mechanisms: Interface scattering from a rough seafloor, volume scattering from a heterogeneous sediment layer, or mixed scattering combining both interface and volume scattering. The classification system is applied to six simulated test cases where it correctly identifies the true dominant scattering mechanism as having greater support from the data in five cases; the remaining case is indecisive. The approach is also applied to measured backscatter-strength data where volume scattering is determined as the dominant scattering mechanism. Comparison of inversion results with core data indicates the method yields both a reasonable volume heterogeneity size distribution and a good estimate of the sub-bottom depths at which scatterers occur.

  6. Binocular rivalry waves in a directionally selective neural field model

    Science.gov (United States)

    Carroll, Samuel R.; Bressloff, Paul C.

    2014-10-01

    We extend a neural field model of binocular rivalry waves in the visual cortex to incorporate direction selectivity of moving stimuli. For each eye, we consider a one-dimensional network of neurons that respond maximally to a fixed orientation and speed of a grating stimulus. Recurrent connections within each one-dimensional network are taken to be excitatory and asymmetric, where the asymmetry captures the direction and speed of the moving stimuli. Connections between the two networks are taken to be inhibitory (cross-inhibition). As per previous studies, we incorporate slow adaption as a symmetry breaking mechanism that allows waves to propagate. We derive an analytical expression for traveling wave solutions of the neural field equations, as well as an implicit equation for the wave speed as a function of neurophysiological parameters, and analyze their stability. Most importantly, we show that propagation of traveling waves is faster in the direction of stimulus motion than against it, which is in agreement with previous experimental and computational studies.

  7. Modeling neuron selectivity over simple midlevel features for image classification.

    Science.gov (United States)

    Shu Kong; Zhuolin Jiang; Qiang Yang

    2015-08-01

    We now know that good mid-level features can greatly enhance the performance of image classification, but how to efficiently learn the image features is still an open question. In this paper, we present an efficient unsupervised mid-level feature learning approach (MidFea), which only involves simple operations such as k-means clustering, convolution, pooling, vector quantization, and random projection. We show that this simple feature can also achieve good performance in traditional classification tasks. To further boost the performance, we model the neuron selectivity (NS) principle by building an additional layer over the mid-level features prior to the classifier. The NS-layer learns category-specific neurons in a supervised manner with both bottom-up inference and top-down analysis, and thus supports fast inference for a query image. Through extensive experiments, we demonstrate that this higher-level NS-layer notably improves the classification accuracy with our simple MidFea, achieving comparable performance for face recognition, gender classification, age estimation, and object categorization. In particular, our approach runs faster in inference by an order of magnitude than sparse-coding-based feature learning methods. In conclusion, we argue that not only do carefully learned features (MidFea) bring improved performance, but a sophisticated mechanism (NS-layer) at a higher level boosts the performance further.
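
    A minimal sketch of the ingredients named in this abstract (k-means dictionary learning, soft vector quantization, and pooling), not the authors' MidFea pipeline; the images, patch sizes, and dictionary size are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
images = rng.random((20, 32, 32))              # hypothetical grayscale images

def extract_patches(img, size=6, n=50):
    ys = rng.integers(0, img.shape[0] - size, n)
    xs = rng.integers(0, img.shape[1] - size, n)
    return np.stack([img[y:y + size, x:x + size].ravel() for y, x in zip(ys, xs)])

# 1) Learn a dictionary of patch centroids with k-means (unsupervised).
patches = np.vstack([extract_patches(im) for im in images])
patches -= patches.mean(axis=1, keepdims=True)            # simple per-patch normalization
kmeans = KMeans(n_clusters=32, n_init=4, random_state=0).fit(patches)

# 2) Encode an image: soft-assign its patches to centroids ("triangle" coding),
#    then mean-pool the codes into one fixed-length mid-level feature vector.
def encode(img):
    p = extract_patches(img)
    p -= p.mean(axis=1, keepdims=True)
    d = kmeans.transform(p)                                # patch-to-centroid distances
    codes = np.maximum(0.0, d.mean(axis=1, keepdims=True) - d)
    return codes.mean(axis=0)

features = np.stack([encode(im) for im in images])
print(features.shape)                                      # (20, 32) mid-level features
```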

  8. Selective androgen receptor modulators for the treatment of late onset male hypogonadism.

    Science.gov (United States)

    Coss, Christopher C; Jones, Amanda; Hancock, Michael L; Steiner, Mitchell S; Dalton, James T

    2014-01-01

    Several testosterone preparations are used in the treatment of hypogonadism in the ageing male. These therapies differ in their convenience, flexibility, regional availability and expense but share their pharmacokinetic basis of approval and dearth of long-term safety data. The brevity and relatively reduced cost of pharmacokinetic based registration trials provides little commercial incentive to develop improved novel therapies for the treatment of late onset male hypogonadism. Selective androgen receptor modulators (SARMs) have been shown to provide anabolic benefit in the absence of androgenic effects on prostate, hair and skin. Current clinical development for SARMs is focused on acute muscle wasting conditions with defined clinical endpoints of physical function and lean body mass. Similar regulatory clarity concerning clinical deficits in men with hypogonadism is required before the beneficial pharmacology and desirable pharmacokinetics of SARMs can be employed in the treatment of late onset male hypogonadism.

  9. Selective androgen receptor modulators for the treatment of late onset male hypogonadism

    Directory of Open Access Journals (Sweden)

    Christopher C Coss

    2014-04-01

    Several testosterone preparations are used in the treatment of hypogonadism in the ageing male. These therapies differ in their convenience, flexibility, regional availability and expense but share their pharmacokinetic basis of approval and dearth of long-term safety data. The brevity and relatively reduced cost of pharmacokinetic based registration trials provides little commercial incentive to develop improved novel therapies for the treatment of late onset male hypogonadism. Selective androgen receptor modulators (SARMs) have been shown to provide anabolic benefit in the absence of androgenic effects on prostate, hair and skin. Current clinical development for SARMs is focused on acute muscle wasting conditions with defined clinical endpoints of physical function and lean body mass. Similar regulatory clarity concerning clinical deficits in men with hypogonadism is required before the beneficial pharmacology and desirable pharmacokinetics of SARMs can be employed in the treatment of late onset male hypogonadism.

  10. HI-selected Galaxies in Hierarchical Models of Galaxy Formation and Evolution

    CERN Document Server

    Zoldan, Anna; Xie, Lizhi; Fontanot, Fabio; Hirschmann, Michaela

    2016-01-01

    In this work, we study the basic statistical properties of HI-selected galaxies extracted from six different semi-analytic models, all run on the same cosmological N-body simulation. One model includes an explicit treatment for the partition of cold gas into atomic and molecular hydrogen. All models considered agree nicely with the measured HI mass function in the local Universe, with the measured scaling relations between HI and galaxy stellar mass, and with the predicted 2-point correlation function for HI-rich galaxies. One exception is given by one model that predicts very little HI associated with galaxies in haloes above 10^12 Msun: we argue this is due to an overly efficient radio-mode feedback for central galaxies, and to a combination of efficient stellar feedback and instantaneous stripping of hot gas for satellites. We demonstrate that the treatment of satellite galaxies introduces large uncertainties at low HI masses. While models assuming non-instantaneous stripping of hot gas tend to form satellite gala...

  11. Criteria for selecting children with special needs for dental treatment under general anaesthesia.

    Science.gov (United States)

    de Nova García, M Joaquín; Gallardo López, Nuria E; Martín Sanjuán, Carmen; Mourelle Martínez, M Rosa; Alonso García, Yolanda; Carracedo Cabaleiro, Esther

    2007-11-01

    To study criteria for helping to select children with special needs for dental treatment under general anaesthesia. Group of 30 children (aged under 18) examined on the Course at the Universidad Complutense de Madrid (UCM) (Specialisation on holistic dental treatment of children with special needs) and subsequently referred to the Disabled Children's Oral Health Unit (DCOHU) within Primary Health Care Area 2 of the Madrid Health Service (SERMAS) where dental treatment under general anaesthesia was given during 2005. Relevant data were taken from their case histories with regard to their general health, oral health and behaviour. In most of the children (22 children), it was possible to carry out a complete dental diagnosis. With regard to medical diagnoses, the most frequent pathology was cerebral palsy (8 children), but it was not possible to establish a link between the pathology and the use of general anaesthesia. With regard to oral health, most of the children received restorative treatment in all 4 quadrants (26 children). On the basis of scales for behavioural evaluation and movement, most of the children (17 children) showed clearly negative behaviour, with movements that interrupted or hindered examination. With the exception of certain specific medical problems, the reasons for using general anaesthesia for dental treatment in children with special needs are extensive treatment needs and bad behaviour, both of which can be judged objectively.

  12. A comparison of three commercial IMRT treatment planning systems for selected paediatric cases.

    Science.gov (United States)

    Eldesoky, Ismail; Attalla, Ehab M; Elshemey, Wael M; Zaghloul, Mohamed S

    2012-03-08

    This work aimed to evaluate the performance of three different intensity-modulated radiotherapy (IMRT) treatment planning systems (TPSs)--KonRad, XiO and Prowess--for selected pediatric cases. For this study, 11 pediatric patients with different types of brain, orbit, and head and neck cancer were selected. Clinical step-and-shoot IMRT treatment plans were designed for delivery on a Siemens ONCOR accelerator with 82-leaf multileaf collimators (MLCs). Plans were optimized to achieve the same clinical objectives by applying the same beam energy and the same number and direction of beams. The analysis of performance was based on isodose distributions, dose-volume histograms (DVHs) for planning target volume (PTV), the relevant organs at risk (OARs), as well as mean dose (Dmean), maximum dose (Dmax), 95% dose (D₉₅), volume of patient receiving 2 and 5 Gy, total number of segments, monitor units per segment (MU/Segment), and the number of MU/cGy. Treatment delivery time and conformation number were two other evaluation parameters considered in this study. Collectively, the Prowess and KonRad plans showed a significant reduction in the number of MUs that varied between 1.8% and 61.5% (p-value = 0.001) for the different cases, compared to XiO. This was reflected in shorter treatment delivery times. The percentage volumes of each patient receiving 2 Gy and 5 Gy were compared for the three TPSs; the general trend was that KonRad had the highest percentage volume, while Prowess showed the lowest (p-value = 0.0001). KonRad achieved better conformality than both XiO and Prowess. Based on the present results, the three treatment planning systems were efficient in IMRT, yet XiO showed the lowest performance.

  13. Methodologies in the modeling of combined chemo-radiation treatments

    Science.gov (United States)

    Grassberger, C.; Paganetti, H.

    2016-11-01

    The variety of treatment options for cancer patients has increased significantly in recent years. Not only do we combine radiation with surgery and chemotherapy; new therapeutic approaches such as immunotherapy and targeted therapies are also starting to play a bigger role. Physics has made significant contributions to radiation therapy treatment planning and delivery. In particular, treatment plan optimization using inverse planning techniques has improved dose conformity considerably. Furthermore, medical physics is often the driving force behind tumor control and normal tissue complication modeling. While treatment optimization and outcome modeling focus mainly on the effects of radiation, treatment modalities such as chemotherapy are treated independently or are even neglected entirely. This review summarizes the published efforts to model combined modality treatments combining radiation and chemotherapy. These models will play an increasing role in optimizing cancer therapy, not only from a radiation and drug dosage standpoint, but also in terms of spatial and temporal optimization of treatment schedules.

  14. Variable Selection for Generalized Varying Coefficient Partially Linear Models with Diverging Number of Parameters

    Institute of Scientific and Technical Information of China (English)

    Zheng-yan Lin; Yu-ze Yuan

    2012-01-01

    Semiparametric models with diverging number of predictors arise in many contemporary scientific areas. Variable selection for these models consists of two components: model selection for non-parametric components and selection of significant variables for the parametric portion. In this paper, we consider a variable selection procedure by combining basis function approximation with the SCAD penalty. The proposed procedure simultaneously selects significant variables in the parametric components and the nonparametric components. With appropriate selection of tuning parameters, we establish the consistency and sparseness of this procedure.
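
    For reference, the SCAD penalty that such procedures attach to the basis-function coefficients, in its standard form (Fan and Li, 2001); this is a generic definition, not the authors' code, and the example values are arbitrary.

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001), applied elementwise to |theta|:
    LASSO-like near zero, tapering in the middle, constant for large values,
    so large coefficients are not over-shrunk while small ones are set to zero."""
    t = np.abs(np.asarray(theta, dtype=float))
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    pen = np.empty_like(t)
    pen[small] = lam * t[small]
    pen[mid] = (2 * a * lam * t[mid] - t[mid] ** 2 - lam ** 2) / (2 * (a - 1))
    pen[~small & ~mid] = lam ** 2 * (a + 1) / 2
    return pen

print(scad_penalty([0.1, 1.0, 5.0], lam=0.5))
```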

  15. Combating unmeasured confounding in cross-sectional studies: evaluating instrumental-variable and Heckman selection models.

    Science.gov (United States)

    DeMaris, Alfred

    2014-09-01

    Unmeasured confounding is the principal threat to unbiased estimation of treatment "effects" (i.e., regression parameters for binary regressors) in nonexperimental research. It refers to unmeasured characteristics of individuals that lead them both to be in a particular "treatment" category and to register higher or lower values than others on a response variable. In this article, I introduce readers to 2 econometric techniques designed to control the problem, with a particular emphasis on the Heckman selection model (HSM). Both techniques can be used with only cross-sectional data. Using a Monte Carlo experiment, I compare the performance of instrumental-variable regression (IVR) and HSM to that of ordinary least squares (OLS) under conditions with treatment and unmeasured confounding both present and absent. I find HSM generally to outperform IVR with respect to mean-square-error of treatment estimates, as well as power for detecting either a treatment effect or unobserved confounding. However, both HSM and IVR require a large sample to be fully effective. The use of HSM and IVR in tandem with OLS to untangle unobserved confounding bias in cross-sectional data is further demonstrated with an empirical application. Using data from the 2006-2010 General Social Survey (National Opinion Research Center, 2014), I examine the association between being married and subjective well-being.
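
    A schematic two-step version of the Heckman correction using statsmodels (probit selection equation, then OLS augmented with the inverse Mills ratio), simplified relative to full HSM estimation; the simulated data and coefficient values are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(5)
n = 5_000
x = rng.normal(size=n)
z = rng.normal(size=n)                                    # exclusion-restriction variable
u, e = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n).T
selected = (0.5 - 0.5 * x + 1.0 * z + u) > 0              # selection equation
y = 1.0 + 2.0 * x + e                                     # outcome, observed only if selected

# Step 1: probit selection equation, then the inverse Mills ratio.
W = sm.add_constant(np.column_stack([x, z]))
probit = sm.Probit(selected.astype(float), W).fit(disp=False)
xb = W @ probit.params
mills = norm.pdf(xb) / norm.cdf(xb)

# Step 2: outcome OLS on the selected sample with the Mills ratio as a regressor.
X2 = sm.add_constant(np.column_stack([x[selected], mills[selected]]))
corrected = sm.OLS(y[selected], X2).fit()
naive = sm.OLS(y[selected], sm.add_constant(x[selected])).fit()
print("naive slope:", round(naive.params[1], 3),
      "corrected slope:", round(corrected.params[1], 3))   # corrected should be near 2
```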

  16. Estimation and Model Selection for Model-Based Clustering with the Conditional Classification Likelihood

    CERN Document Server

    Baudry, Jean-Patrick

    2012-01-01

    The Integrated Completed Likelihood (ICL) criterion was proposed by Biernacki et al. (2000) in the model-based clustering framework to select a relevant number of classes and has been used by statisticians in various application areas. A theoretical study of this criterion is proposed. A contrast related to the clustering objective is introduced: the conditional classification likelihood. This yields an estimator and a class of model selection criteria. The properties of these new procedures are studied, and ICL is shown to be an approximation of one of these criteria. We contrast these results with the current leading point of view about ICL, namely that it would not be consistent. Moreover, these results give insight into the class notion underlying ICL and feed a reflection on the class notion in clustering. General results on penalized minimum contrast criteria and on mixture models are derived, which are interesting in their own right.

  17. Voxel-based dose prediction with multi-patient atlas selection for automated radiotherapy treatment planning

    Science.gov (United States)

    McIntosh, Chris; Purdie, Thomas G.

    2017-01-01

    Automating the radiotherapy treatment planning process is a technically challenging problem. The majority of automated approaches have focused on customizing and inferring dose volume objectives to be used in plan optimization. In this work we outline a multi-patient atlas-based dose prediction approach that learns to predict the dose-per-voxel for a novel patient directly from the computed tomography planning scan without the requirement of specifying any objectives. Our method learns to automatically select the most effective atlases for a novel patient, and then map the dose from those atlases onto the novel patient. We extend our previous work to include a conditional random field for the optimization of a joint distribution prior that matches the complementary goals of an accurately spatially distributed dose distribution while still adhering to the desired dose volume histograms. The resulting distribution can then be used for inverse-planning with a new spatial dose objective, or to create typical dose volume objectives for the canonical optimization pipeline. We investigated six treatment sites (633 patients for training and 113 patients for testing) and evaluated the mean absolute difference in all DVHs for the clinical and predicted dose distribution. The results on average are favorable in comparison to our previous approach (1.91 versus 2.57). Comparing our method with and without atlas-selection further validates that atlas-selection improved dose prediction on average in whole breast (0.64 versus 1.59), prostate (2.13 versus 4.07), and rectum (1.46 versus 3.29) while it is less important in breast cavity (0.79 versus 0.92) and lung (1.33 versus 1.27) for which there is high conformity and minimal dose shaping. In CNS brain, atlas-selection has the potential to be impactful (3.65 versus 5.09), but selecting the ideal atlas is the most challenging.

  18. Comparison of multimedia system and conventional method in patients’ selecting prosthetic treatment

    Directory of Open Access Journals (Sweden)

    Baghai R

    2010-12-01

    Background and Aims: Selecting an appropriate treatment plan is one of the most critical aspects of dental treatments. The purpose of this study was to compare a multimedia system and the conventional method in patients' selection of prosthetic treatment and the time consumed. Materials and Methods: Ninety patients were randomly divided into three groups. Patients in group A were instructed once using the conventional method of the dental office and once using the multimedia system, and time was measured in seconds from the beginning of the instruction until the patient had come to a decision. The patients were asked about their satisfaction with the method used for them. In group B, patients were only instructed using the conventional method, whereas in group C they were only exposed to the software. The data were analyzed with the paired t-test (in group A) and the t-test and Mann-Whitney test (in groups B and C). Results: There was a significant difference between the multimedia system and the conventional method in group A and also between groups B and C (P<0.001). In group A and between groups B and C, patients' satisfaction with the multimedia system was better. However, in the comparison between groups B and C, the multimedia system did not have a significant effect on the treatment selection score (P=0.08). Conclusion: Using a multimedia system is recommended due to its high ability to answer a large number of patients' questions as well as in terms of marketing.

  19. Alemtuzumab in the treatment of multiple sclerosis: patient selection and special considerations

    Directory of Open Access Journals (Sweden)

    Dörr J

    2016-10-01

    the other are not yet available. Thus, the overall success of alemtuzumab treatment critically depends on patient selection. The aim of this article is therefore to characterize the significance of alemtuzumab in the treatment of MS with a focus on the selection of the optimal patient. Keywords: multiple sclerosis, treatment, safety, efficacy, selection, benefit risk relation

  20. Variable selection for distribution-free models for longitudinal zero-inflated count responses.

    Science.gov (United States)

    Chen, Tian; Wu, Pan; Tang, Wan; Zhang, Hui; Feng, Changyong; Kowalski, Jeanne; Tu, Xin M

    2016-07-20

    Zero-inflated count outcomes arise quite often in research and practice. Parametric models such as the zero-inflated Poisson and zero-inflated negative binomial are widely used to model such responses. Like most parametric models, they are quite sensitive to departures from assumed distributions. Recently, new approaches have been proposed to provide distribution-free, or semi-parametric, alternatives. These methods extend the generalized estimating equations to provide robust inference for population mixtures defined by zero-inflated count outcomes. In this paper, we propose methods to extend smoothly clipped absolute deviation (SCAD)-based variable selection methods to these new models. Variable selection has been gaining popularity in modern clinical research studies, as determining differential treatment effects of interventions for different subgroups has become the norm, rather than the exception, in the era of patient-centered outcome research. Such moderation analysis in general creates many explanatory variables in regression analysis, and the advantages of SCAD-based methods over their traditional counterparts render them a great choice for addressing this important and timely issue in clinical research. We illustrate the proposed approach with both simulated and real study data. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Model selection and assessment for multi­-species occupancy models

    Science.gov (United States)

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  2. Fuzzy Programming Models for Vendor Selection Problem in a Supply Chain

    Institute of Scientific and Technical Information of China (English)

    WANG Junyan; ZHAO Ruiqing; TANG Wansheng

    2008-01-01

    This paper characterizes quality, budget, and demand as fuzzy variables in a fuzzy vendor selection expected value model and a fuzzy vendor selection chance-constrained programming model, to maximize the total quality level. The two models have distinct advantages over existing methods for selecting vendors in fuzzy environments. A genetic algorithm based on fuzzy simulations is designed to solve these two models. Numerical examples show the effectiveness of the algorithm.

  3. Performance Measurement Model for the Supplier Selection Based on AHP

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-10-01

    The performance of the supplier is a crucial factor for the success or failure of any company. Rational and effective decision making in terms of the supplier selection process can help the organization to optimize cost and quality functions. The nature of supplier selection processes is generally complex, especially when the company has a large variety of products and vendors. Over the years, several solutions and methods have emerged for addressing the supplier selection problem (SSP). Experience and studies have shown that there is no best way for evaluating and selecting a specific supplier process, but that it varies from one organization to another. The aim of this research is to demonstrate how a multiple attribute decision making approach can be effectively applied for the supplier selection process.
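
    A minimal illustration of the AHP computation underlying such approaches: criteria weights from the principal eigenvector of a pairwise comparison matrix, plus the consistency ratio; the comparison matrix below is hypothetical.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
# (e.g., price, quality, delivery) on Saaty's 1-9 scale; A[i, j] = importance of i over j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                          # principal eigenvector -> criteria weights

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)                   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's random index
cr = ci / ri                                      # consistency ratio (< 0.1 is acceptable)

print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```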

  4. Continuous time limits of the utterance selection model

    Science.gov (United States)

    Michaud, Jérôme

    2017-02-01

    In this paper we derive alternative continuous time limits of the utterance selection model (USM) for language change [G. J. Baxter et al., Phys. Rev. E 73, 046118 (2006), 10.1103/PhysRevE.73.046118]. This is motivated by the fact that the Fokker-Planck continuous time limit derived in the original version of the USM is only valid for a small range of parameters. We investigate the consequences of relaxing these constraints on parameters. Using the normal approximation of the multinomial approximation, we derive a continuous time limit of the USM in the form of a weak-noise stochastic differential equation. We argue that this weak noise, not captured by the Kramers-Moyal expansion, cannot be neglected. We then propose a coarse-graining procedure, which takes the form of a stochastic version of the heterogeneous mean field approximation. This approximation groups the behavior of nodes of the same degree, reducing the complexity of the problem. With the help of this approximation, we study in detail two simple families of networks: the regular networks and the star-shaped networks. The analysis reveals and quantifies a finite-size effect of the dynamics. If we increase the size of the network by keeping all the other parameters constant, we transition from a state where conventions emerge to a state where no convention emerges. Furthermore, we show that the degree of a node acts as a time scale. For heterogeneous networks such as star-shaped networks, the time scale difference can become very large, leading to a noisier behavior of highly connected nodes.

  5. Helminths in horses : use of selective treatment for the control of strongyles

    Directory of Open Access Journals (Sweden)

    S. Matthee

    2004-06-01

    The current level of anthelmintic resistance in the horse-breeding industry is extremely high, and therefore more emphasis is being placed on studies that focus on the judicious use of anthelmintic products. The aims of the study were to: (1) establish whether there is variation in the egg excretion pattern of strongyles between the different age classes of Thoroughbred horses in the Western Cape Province (WCP); (2) test whether a selective treatment approach successfully reduces the number of anthelmintic treatments and maintains acceptably low helminth burdens in adult Thoroughbred horses; and (3) evaluate the efficacy of subsampling large horse herds for faecal egg counts (FECs) to monitor the strongyle burden. In 2001 the FECs of 4 adult mare, 5 yearling and 3 weanling herds from 8 different farms were compared in the WCP. Within the mare herds there were generally fewer egg-excreting individuals with lower mean FECs compared with the younger age classes. Individual faecal samples were collected every 3-4 weeks from 52 adult Thoroughbred mares from 1 farm in the WCP during a 12-month period (2002/2003). Animals with strongyle FECs > 100 eggs per gram (epg) were treated with an ivermectin-praziquantel combination drug (Equimax oral paste, Virbac). The mean monthly strongyle FEC for the entire group was < 300 epg throughout the study, and the number of treatments was reduced by 50%. Resampling methods showed that an asymptote to mean FEC was reached at 55 animals for each of the pooled weanling, yearling and mare egg counts. Resampling within 4 different mare herds recorded asymptotes of between 24 and 28 animals. Subsampling entire herds for FECs therefore provided an effective approach to treatment management. This study demonstrates that selective treatment is both a practical and an effective approach to the management of anthelmintic resistance.

  6. Patient selection and targeted treatment in the management of platinum-resistant ovarian cancer

    Directory of Open Access Journals (Sweden)

    Leamon PC

    2013-09-01

    Ovarian cancer (OC) has the highest mortality rate of any gynecologic cancer, and patients generally have a poor prognosis due to high chemotherapy resistance and late-stage disease diagnosis. Platinum-resistant OC can be treated with cytotoxic chemotherapy such as paclitaxel, topotecan, pegylated liposomal doxorubicin, and gemcitabine, but many patients eventually relapse upon treatment. Fortunately, there are currently a number of targeted therapies in development for these patients that have shown promising results in recent clinical trials. These treatments often target the vascular endothelial growth factor pathway (eg, bevacizumab and aflibercept), DNA repair mechanisms (eg, iniparib and olaparib), or folate-related pathways (eg, pemetrexed, farletuzumab, and vintafolide). As many targeted therapies are only effective in a subset of patients, there is an increasing need for the identification of response-predictive biomarkers. Selecting the right patients through biomarker screening will help tailor therapy to patients and decrease superfluous treatment of those who are biomarker negative; this approach should lead to improved clinical results and decreased toxicities. In this review, the current targeted therapies used for treating platinum-resistant OC are discussed. Furthermore, the use of prognostic and response-predictive biomarkers to define OC patient populations that may benefit from specific targeted therapies is also highlighted. Keywords: platinum-resistant ovarian cancer, targeted therapy, patient selection, folate receptor, VEGF, biomarkers

  7. Integrated basic treatment of activated carbon for enhanced CO{sub 2} selectivity

    Energy Technology Data Exchange (ETDEWEB)

    Adelodun, Adedeji Adebukola; Jo, Young-Min, E-mail: ymjo@khu.ac.kr

    2013-12-01

    We attempted the use of three chemical agents, viz. nitric acid (HN), calcium nitrate (CaN) and calcium ethanoate (CaEt), to achieve enhanced CO{sub 2}-selective adsorption by activated carbon (AC). In dry-phase treatment, microporous coconut shell-based carbon (CS) exhibits a higher CO{sub 2} capacity than coal-based carbon. However, upon wet-phase pre-treatment, modified CS samples showed lower CO{sub 2} adsorption efficiency. Surface characterization with X-ray photoelectron spectroscopy confirms the presence of calcium and amine species on the samples with integrated treatment (A-CaN). These samples recorded the highest low-level CO{sub 2} capture, although calcined CaEt-doped samples (C-CaEt) showed the highest values for pure and high-level CO{sub 2} adsorption capacities. The slope and linearity values of isobaric desorption were used to estimate the proportion of CO{sub 2} chemisorbed and the heterogeneity of the adsorbents' surfaces, respectively. Consequently, integrated basic impregnation provides the most efficient adsorbents for selective adsorption of both indoor and outdoor CO{sub 2} levels.

  8. Bazedoxifene acetate: a novel selective estrogen receptor modulator for the prevention and treatment of postmenopausal osteoporosis.

    Science.gov (United States)

    Chines, Arkadi A; Komm, Barry S

    2009-07-01

    Postmenopausal osteoporosis is an increasing worldwide health concern affecting an estimated 200 million individuals. Despite a wide range of available treatment options, many patients are not being treated or discontinue therapy. The ongoing need for new osteoporosis therapies has led to the development of new selective estrogen receptor modulators (SERMs) with an ideal tissue selectivity profile and beneficial effects on bone without undesirable effects on the endometrium and breast. Bazedoxifene acetate, a novel SERM in clinical development for the prevention and treatment of postmenopausal osteoporosis, resembles this ideal profile more closely than other currently available SERMs. Results from large prospective phase III trials showed that it increases bone mineral density, reduces bone turnover rate and decreases the risk for new vertebral fractures. Moreover, based on a post hoc analysis of a subgroup of women with a higher risk for fracture, bazedoxifene was demonstrated to significantly reduce the incidence of nonvertebral fractures compared with both raloxifene hydrochloride and placebo. Furthermore, it was reported to be well tolerated, with a favorable safety profile and no evidence of endometrial or breast tissue stimulation. Bazedoxifene represents an important new treatment option for women at risk for osteoporosis and fracture.

  9. Mean field theory for a balanced hypercolumn model of orientation selectivity in primary visual cortex

    CERN Document Server

    Lerchner, A; Hertz, J; Ahmadi, M

    2004-01-01

    We present a complete mean field theory for a balanced state of a simple model of an orientation hypercolumn. The theory is complemented by a description of a numerical procedure for solving the mean-field equations quantitatively. With our treatment, we can determine self-consistently both the firing rates and the firing correlations, without being restricted to specific neuron models. Here, we solve the analytically derived mean-field equations numerically for integrate-and-fire neurons. Several known key properties of orientation selective cortical neurons emerge naturally from the description: Irregular firing with statistics close to -- but not restricted to -- Poisson statistics; an almost linear gain function (firing frequency as a function of stimulus contrast) of the neurons within the network; and a contrast-invariant tuning width of the neuronal firing. We find that the irregularity in firing depends sensitively on synaptic strengths. If Fano factors are bigger than 1, then they are so for all stim...
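
    For readers unfamiliar with the numerical side, the toy sketch below shows the generic pattern of solving mean-field rate equations self-consistently by fixed-point iteration; the gain function, couplings and drives are invented for illustration and are not the balanced hypercolumn equations of the paper.

        # Toy self-consistent mean-field iteration for a two-population
        # (excitatory/inhibitory) rate model with a generic sigmoidal gain.
        # Purely illustrative parameters; not the paper's hypercolumn model.
        import math

        def gain(current):
            """Generic monotone gain: input current -> firing rate (Hz)."""
            return 50.0 / (1.0 + math.exp(-(current - 1.0) / 0.3))

        # Couplings (E->E, I->E, E->I, I->I) and external drives (arbitrary units).
        J_EE, J_EI, J_IE, J_II = 1.2, -1.0, 1.0, -0.8
        ext_E, ext_I = 1.1, 0.9

        r_E, r_I = 5.0, 5.0          # initial rate guesses (Hz)
        for _ in range(200):
            new_E = gain(J_EE * r_E / 50 + J_EI * r_I / 50 + ext_E)
            new_I = gain(J_IE * r_E / 50 + J_II * r_I / 50 + ext_I)
            # Damped update so the iteration converges to the fixed point.
            r_E += 0.2 * (new_E - r_E)
            r_I += 0.2 * (new_I - r_I)

        print(f"self-consistent rates: r_E = {r_E:.2f} Hz, r_I = {r_I:.2f} Hz")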

  10. Integrated modeling of ozonation for optimization of drinking water treatment

    NARCIS (Netherlands)

    van der Helm, A.W.C.

    2007-01-01

    Drinking water treatment plants automation becomes more sophisticated, more on-line monitoring systems become available and integration of modeling environments with control systems becomes easier. This gives possibilities for model-based optimization. In operation of drinking water treatment plants

  11. Integrated modeling of ozonation for optimization of drinking water treatment

    NARCIS (Netherlands)

    van der Helm, A.W.C.

    2007-01-01

    Drinking water treatment plants automation becomes more sophisticated, more on-line monitoring systems become available and integration of modeling environments with control systems becomes easier. This gives possibilities for model-based optimization. In operation of drinking water treatment plants

  12. Preliminary Investigation on Regularity of Selecting Acupoints for Treatment of Insomnia by Acupuncture

    Institute of Scientific and Technical Information of China (English)

    LIN Xue

    2005-01-01

    In the literature from 1994-2004, the points selected for treatment of insomnia were Shenmen (HT 7), Sanyinjiao (SP 6), Taichong (LR 3), Baihui (GV 20), Neiguan (PC 6), Sishencong (EX-HN 1), Taixi (KI 3) and Anmian, among others. Most of the key acupoints are yuan-source points of yin meridians or points located on the head, and back-shu points were selected as coordinative points.

  13. [Efficacy of selective serotonin reuptake inhibitor treatment in children and adolescents].

    Science.gov (United States)

    Bailly, Daniel

    2006-09-01

    Selective serotonin reuptake inhibitors (SSRIs) have been used increasingly since the early 1990s to treat anxiety disorders and depression in children and adolescents. Several recent reports, however, cast doubt on their efficacy and especially raise questions about their role in serious adverse effects (increase in suicidal ideation and suicide attempts as well as reactions involving irritability, hostility, self-harm and self-destructive actions). The efficacy of SSRIs (fluoxetine, sertraline, fluvoxamine, paroxetine) in the treatment of obsessive-compulsive disorders in this population is clear today, although their effects are overall relatively modest. SSRIs remain notably less effective than clomipramine for this indication, although a variety of factors (age, family history, and psychiatric comorbidity) are also likely to influence response to treatment. Only a few placebo-controlled studies suggest that the SSRIs (fluoxetine, sertraline and fluvoxamine) may have some utility in the treatment of anxiety disorders (generalized anxiety, separation anxiety, social phobias) in children and teens. The additional benefits from SSRIs for this indication nonetheless require confirmation. Imipramine and related tricyclic antidepressants are ineffective in the treatment of depressive disorders in children and adolescents. Among the SSRIs, only fluoxetine has proven its efficacy for this indication, although its effect here too appears relatively modest. The efficacy of sertraline and paroxetine cannot be considered more than probable, requiring confirmation, and that of citalopram has not been demonstrated. Moreover, because of the risk of suicidal behavior observed in some studies, SSRIs are inadvisable for the treatment of depressive disorders in this population. Overall, although the currently available data show SSRIs to be moderately effective and useful in treating anxiety disorders and depression in children and adolescents, future studies must focus on

  14. Effectiveness of Selected Stages of Wastewater Treatment in Elimination of Eggs of Intestinal Parasites

    Directory of Open Access Journals (Sweden)

    Zdybel Jolanta

    2015-04-01

    Full Text Available The objective of the study was to determine the degree of municipal wastewater contamination with intestinal parasite eggs of the genera Ascaris, Toxocara, and Trichuris at individual stages of treatment, and to indicate potentially weak points in the hygienisation of sewage sludge. The study was conducted in 17 municipal mechanical-biological wastewater treatment plants which differed slightly in the technological process of wastewater treatment and the method of hygienisation of sewage sludge. The selected treatment plants, located in seven regions, included five classified as large agglomerations (population equivalent, PE > 100 000), ten as medium-size (PE 15 000-100 000), and two as smaller plants (PE 5 000-10 000). The largest number of viable eggs of Ascaris spp., Toxocara spp., and Trichuris spp. was found in the sewage sludge collected from the primary settling tank. A slightly lower number of eggs was found in the samples of excess sludge, which indicates that the sedimentation process in the primary settling tank is not sufficiently long to effectively separate parasite eggs from the treated sewage. The number of eggs of Ascaris spp. and Toxocara spp. in the fermented sludge was nearly 3 times lower than that in the raw sludge. The effectiveness of hygienisation of dehydrated sewage sludge by means of quicklime was confirmed in two wastewater treatment plants with respect to Ascaris spp. eggs, in three plants with respect to Toxocara spp. eggs, and in one plant with respect to Trichuris spp. eggs; the mean reduction in the number of eggs was 65%, 61%, and 100%, respectively. In one wastewater treatment plant, composting the sludge also reduced the number of viable eggs of Ascaris and Trichuris species, by 85% and 75%, respectively. In the remaining treatment plants, no effect of hygienisation of sewage sludge on the contents of viable eggs of these nematodes was observed.

  15. Brivaracetam: Rationale for discovery and preclinical profile of a selective SV2A ligand for epilepsy treatment.

    Science.gov (United States)

    Klitgaard, Henrik; Matagne, Alain; Nicolas, Jean-Marie; Gillard, Michel; Lamberty, Yves; De Ryck, Marc; Kaminski, Rafal M; Leclercq, Karine; Niespodziany, Isabelle; Wolff, Christian; Wood, Martyn; Hannestad, Jonas; Kervyn, Sophie; Kenda, Benoit

    2016-04-01

    Despite availability of effective antiepileptic drugs (AEDs), many patients with epilepsy continue to experience refractory seizures and adverse events. Achievement of better seizure control and fewer side effects is key to improving quality of life. This review describes the rationale for the discovery and preclinical profile of brivaracetam (BRV), currently under regulatory review as adjunctive therapy for adults with partial-onset seizures. The discovery of BRV was triggered by the novel mechanism of action and atypical properties of levetiracetam (LEV) in preclinical seizure and epilepsy models. LEV is associated with several mechanisms that may contribute to its antiepileptic properties and adverse effect profile. Early findings observed a moderate affinity for a unique brain-specific LEV binding site (LBS) that correlated with anticonvulsant effects in animal models of epilepsy. This provided a promising molecular target and rationale for identifying selective, high-affinity ligands for LBS with potential for improved antiepileptic properties. The later discovery that synaptic vesicle protein 2A (SV2A) was the molecular correlate of LBS confirmed the novelty of the target. A drug discovery program resulted in the identification of anticonvulsants, comprising two distinct families of high-affinity SV2A ligands possessing different pharmacologic properties. Among these, BRV differed significantly from LEV by its selective, high affinity and differential interaction with SV2A as well as a higher lipophilicity, correlating with more potent and complete seizure suppression, as well as a more rapid brain penetration in preclinical models. Initial studies in animal models also revealed BRV had a greater antiepileptogenic potential than LEV. These properties of BRV highlight its promising potential as an AED that might provide broad-spectrum efficacy, associated with a promising tolerability profile and a fast onset of action. BRV represents the first selective SV2A

  16. Autotransplant tissue selection criteria with or without stereomicroscopy in parathyroidectomy for treatment of renal hyperparathyroidism

    Directory of Open Access Journals (Sweden)

    Monique Nakayama Ohe

    2014-07-01

    Full Text Available INTRODUCTION: Several methods have been proposed to improve operative success in renal hyperparathyroidism. OBJECTIVE: To evaluate stereomicroscopy in parathyroid tissue selection for total parathyroidectomy with autotransplantation in secondary (SHPT)/tertiary (THPT) hyperparathyroidism. METHODS: 118 renal patients underwent surgery from April 2000 to October 2009. They were divided into two groups: G1, 66 patients operated on from April 2000 to May 2005, with tissue selection based on macroscopic observation; G2, 52 patients operated on from March 2008 to October 2009, with stereomicroscopy used for tissue selection by searching for the presence of adipose cells. All surgeries were performed by the same surgeon. Patients presented with SHPT (on dialysis treatment) or THPT (renal-grafted). Follow-up was 12-36 months. Intra-operative parathyroid hormone (PTH) was measured in 100/118 (84.7%) patients. RESULTS: Data are presented as means. G1 included 66 patients (38 SHPT: 24 females/14 males, 40.0 years of age; 28 THPT: 14 females/14 males, 44 years of age). G2 included 52 patients (29 SHPT: 11 females/18 males, 50.7 years of age; 23 THPT: 13 females/10 males, 44.4 years of age). SHPT patients from G2 presented higher preoperative serum calcium than SHPT patients in G1 (p < 0.05), suggesting more severe disease. Definitive hypoparathyroidism was found in seven of 118 patients (5.9%). Graft-dependent recurrence occurred in four patients, two in each group; all occurred in dialysis patients. CONCLUSION: Stereomicroscopy in SHPT/THPT surgical treatment may be a useful tool to standardize parathyroid tissue selection.

  17. [The role of magnetic resonance imaging to select patients for preoperative treatment in rectal cancer].

    Science.gov (United States)

    Rödel, Claus; Sauer, Rolf; Fietkau, Rainer

    2009-08-01

    Traditionally, the decision to apply preoperative treatment for rectal cancer patients has been based on the T- and N-category. Recently, the radial distance of the tumor to the circumferential resection margin (CRM) has been identified as an important risk factor for local failure. By magnetic resonance imaging (MRI) this distance can be measured preoperatively with high reliability. Thus, selected groups have started to limit the indication for preoperative therapy to tumors extending to, or growing within 1 mm of, the mesorectal fascia (CRM+). Pros and cons of this selective approach to preoperative treatment and first clinical results are presented. Prerequisites are the availability of modern high-resolution thin-section MRI technology as well as strict quality control of MRI and of the surgical quality of total mesorectal excision (TME). By selecting patients with CRM-positive tumors on MRI for preoperative therapy, only approximately 35% of patients will require preoperative radiotherapy (RT) or radiochemotherapy (RCT). However, with histopathologic work-up of the resected specimen after primary surgery, the indication for postoperative RCT is given for a rather large percentage of patients, i.e., for pCRM+ (5-10%), intramesorectal or intramural excision (30-40%), and pN+ (30-40%). Postoperative RCT, however, is significantly less effective and more toxic than preoperative RCT. A further point of concern is the assertion that patients in whom a CRM-negative status is achieved by surgery alone do not benefit from additional RT; data from the Dutch TME trial and the British MRC (Medical Research Council) CR07 trial, however, suggest the reverse. Omitting preoperative RT/RCT for CRM-negative tumors on MRI needs to be further investigated in prospective clinical trials. The German guidelines for the treatment of colorectal cancer 2008 continue to indicate preoperative RT/RCT based on the T- and N-category.

  18. A selected controlled trial of supplementary vitamin E for treatment of muscle cramps in hemodialysis patients.

    Science.gov (United States)

    El-Hennawy, Adel S; Zaib, Salwat

    2010-01-01

    Muscle cramps are not uncommon complications of hemodialysis (HD) treatment; they lead to early termination of HD sessions and are therefore a significant cause of under-dialysis. The etiology of cramps in dialysis patients remains a matter of debate. Many reports have suggested that vitamin E (vit. E) may be effective for the prevention of HD-associated cramps. We therefore performed a selected controlled trial of supplementary vit. E for the treatment of patients on HD who experience frequent attacks during and between HD sessions. The goal was to compare the number of attacks of muscle cramps with the patient's baseline over a specific period of time. In this study, 19 HD patients of different age groups and ethnicities were randomly selected. Patients had to have experienced at least 60 attacks of muscle cramps during and between HD sessions over a 12-week period. All selected patients received vit. E at a dose of 400 international units daily for 12 weeks, and the number of attacks of muscle cramps was recorded. The frequency of muscle cramps decreased significantly during vit. E therapy, and, at the end of the trial, vit. E had led to a cramp reduction of 68.3%. The reduction in the number of attacks of muscle cramps had no significant correlation with age, sex, etiology of end-stage renal disease, serum electrolytes, or HD duration, and it showed a statistically significant positive correlation (P = 0.0001) with vit. E therapy. No vit. E-related adverse effects were encountered during the trial. Short-term treatment with vit. E is safe and effective in reducing the number of attacks of muscle cramps in HD patients, as shown in our study.

  19. Reaction selectivity studies on nanolithographically-fabricated platinum model catalyst arrays

    Energy Technology Data Exchange (ETDEWEB)

    Grunes, Jeffrey Benjamin

    2004-05-15

    In an effort to understand the molecular ingredients of catalytic activity and selectivity toward the end of tuning a catalyst for 100% selectivity, advanced nanolithography techniques were developed and utilized to fabricate well-ordered two-dimensional model catalyst arrays of metal nanostructures on an oxide support for the investigation of reaction selectivity. In-situ and ex-situ surface science techniques were coupled with catalytic reaction data to characterize the molecular structure of the catalyst systems and gain insight into hydrocarbon conversion in heterogeneous catalysis. Through systematic variation of catalyst parameters (size, spacing, structure, and oxide support) and catalytic reaction conditions (hydrocarbon chain length, temperature, pressures, and gas composition), the data presented in this dissertation demonstrate the ability to direct a reaction by rationally adjusting, through precise control, the design of the catalyst system. Electron beam lithography (EBL) was employed to create platinum nanoparticles on an alumina (Al{sub 2}O{sub 3}) support. The Pt nanoparticle spacing (100-150-nm interparticle distance) was varied in these samples, and they were characterized using x-ray photoelectron spectroscopy (XPS), transmission electron microscopy (TEM), scanning electron microscopy (SEM), and atomic force microscopy (AFM), both before and after reactions. The TEM studies showed the 28-nm Pt nanoparticles with 100 and 150-nm interparticle spacing on alumina to be polycrystalline in nature, with crystalline sizes of 3-5 nm. The nanoparticle crystallites increased significantly after heat treatment. The nanoparticles were still mostly polycrystalline in nature, with 2-3 domains. The 28-nm Pt nanoparticles deposited on alumina were removed by the AFM tip in contact mode with a normal force of approximately 30 nN. After heat treatment at 500 C in vacuum for 3 hours, the AFM tip, even at 4000 nN, could not remove the platinum nanoparticles. The

  20. Reaction selectivity studies on nanolithographically-fabricated platinum model catalyst arrays

    Energy Technology Data Exchange (ETDEWEB)

    Grunes, Jeffrey Benjamin [Univ. of California, Berkeley, CA (United States)

    2004-05-01

    In an effort to understand the molecular ingredients of catalytic activity and selectivity toward the end of tuning a catalyst for 100% selectivity, advanced nanolithography techniques were developed and utilized to fabricate well-ordered two-dimensional model catalyst arrays of metal nanostructures on an oxide support for the investigation of reaction selectivity. In-situ and ex-situ surface science techniques were coupled with catalytic reaction data to characterize the molecular structure of the catalyst systems and gain insight into hydrocarbon conversion in heterogeneous catalysis. Through systematic variation of catalyst parameters (size, spacing, structure, and oxide support) and catalytic reaction conditions (hydrocarbon chain length, temperature, pressures, and gas composition), the data presented in this dissertation demonstrate the ability to direct a reaction by rationally adjusting, through precise control, the design of the catalyst system. Electron beam lithography (EBL) was employed to create platinum nanoparticles on an alumina (Al2O3) support. The Pt nanoparticle spacing (100-150-nm interparticle distance) was varied in these samples, and they were characterized using x-ray photoelectron spectroscopy (XPS), transmission electron microscopy (TEM), scanning electron microscopy (SEM), and atomic force microscopy (AFM), both before and after reactions. The TEM studies showed the 28-nm Pt nanoparticles with 100 and 150-nm interparticle spacing on alumina to be polycrystalline in nature, with crystalline sizes of 3-5 nm. The nanoparticle crystallites increased significantly after heat treatment. The nanoparticles were still mostly polycrystalline in nature, with 2-3 domains. The 28-nm Pt nanoparticles deposited on alumina were removed by the AFM tip in contact mode with a normal force of approximately 30 nN. After heat treatment at 500 C in vacuum for 3 hours, the AFM tip, even at 4000 nN, could not remove the platinum

  1. Selecting representative climate models for climate change impact studies : An advanced envelope-based selection approach

    NARCIS (Netherlands)

    Lutz, Arthur F.; ter Maat, Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.|info:eu-repo/dai/nl/290472113

    2016-01-01

    Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise the number of climate models that can be included in a climate change

  2. Selecting representative climate models for climate change impact studies: an advanced envelope-based selection approach

    NARCIS (Netherlands)

    Lutz, Arthur F.; ter Maat, Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.

    2016-01-01

    Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise the number of climate models that can be included in a climate change

  3. Selecting representative climate models for climate change impact studies : An advanced envelope-based selection approach

    NARCIS (Netherlands)

    Lutz, Arthur F.; ter Maat, Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.

    2016-01-01

    Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise the number of climate models that can be included in a climate change impa

  4. Selecting representative climate models for climate change impact studies: an advanced envelope-based selection approach

    NARCIS (Netherlands)

    Lutz, Arthur F.; ter Maat, Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.

    2016-01-01

    Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise the number of climate models that can be included in a climate change impa

  5. Modelling and control of laser surface treatment

    NARCIS (Netherlands)

    Römer, Gerardus Richardus Benardus Engelina

    1999-01-01

    The results of laser surface treatment may vary significantly during laser surface processing. These variations arise from the sensitivity of the process to disturbances, such as varying absorptivity and the small dimensions of the work piece. To increase the reproducibility of the process, a real-t

  6. Modelling and control of laser surface treatment

    NARCIS (Netherlands)

    Römer, Gerardus Richardus Bernardus Engelina

    1999-01-01

    The results of laser surface treatment may vary significantly during laser surface processing. These variations arise from the sensitivity of the process to disturbances, such as varying absorptivity and the small dimensions of the work piece. To increase the reproducibility of the process, a

  7. Toward understanding the selective anticancer capacity of cold atmospheric plasma--a model based on aquaporins (Review).

    Science.gov (United States)

    Yan, Dayun; Talbot, Annie; Nourmohammadi, Niki; Sherman, Jonathan H; Cheng, Xiaoqian; Keidar, Michael

    2015-01-01

    Selectively treating tumor cells is the ongoing challenge of modern cancer therapy. Recently, cold atmospheric plasma (CAP), a near room-temperature ionized gas, has been demonstrated to exhibit selective anticancer behavior. However, the mechanism governing such selectivity is still largely unknown. In this review, the authors first summarize the progress that has been made applying CAP as a selective tool for cancer treatment. Then, the key role of aquaporins in the H2O2 transmembrane diffusion is discussed. Finally, a novel model, based on the expression of aquaporins, is proposed to explain why cancer cells respond to CAP treatment with a greater rise in reactive oxygen species than homologous normal cells. Cancer cells tend to express more aquaporins on their cytoplasmic membranes, which may cause the H2O2 uptake speed in cancer cells to be faster than in normal cells. As a result, CAP treatment kills cancer cells more easily than normal cells. Our preliminary observations indicated that glioblastoma cells consumed H2O2 much faster than did astrocytes in either the CAP-treated or H2O2-rich media, which supported the selective model based on aquaporins.

  8. Guidelines for Selecting Control and Treatment Options for Contaminated Dredged Material.

    Science.gov (United States)

    1986-09-01

    This level increases organics removal to 95 percent. Level IV is treatment to remove nutrients such as ammonia and phosphorus. Clinoptilolite and mordenite both show selective exchange of heavy metals. As with all ion exchangers... Equalization; sulfides >100 mg/l: precipitation or stripping with recovery; phenols >70-300 mg/l: extraction, adsorption, internal dilution; ammonia >1.6 g/l

  9. Bariatric surgery: the challenges with candidate selection, individualizing treatment and clinical outcomes

    Science.gov (United States)

    2013-01-01

    Obesity is recognized as a global health crisis. Bariatric surgery offers a treatment that can reduce weight, induce remission of obesity-related diseases, and improve the quality of life. In this article, we outline the different options in bariatric surgery and summarize the recommendations for selecting and assessing potential candidates before proceeding to surgery. We present current data on post-surgical outcomes and evaluate the psychosocial and economic effects of bariatric surgery. Finally, we evaluate the complication rates and present recommendations for post-operative care. PMID:23302153

  10. Lead Selection of a New Aminomethylphenol, JPC-3210, for Malaria Treatment and Prevention.

    Science.gov (United States)

    Chavchich, Marina; Birrell, Geoffrey W; Ager, Arba L; MacKenzie, Donna O; Heffernan, Gavin D; Schiehser, Guy A; Jacobus, Laura R; Shanks, G Dennis; Jacobus, David P; Edstein, Michael D

    2016-05-01

    Structure-activity relationship studies of trifluoromethyl-substituted pyridine and pyrimidine analogues of 2-aminomethylphenols (JPC-2997, JPC-3186, and JPC-3210) were conducted for preclinical development for malaria treatment and/or prevention. Of these compounds, JPC-3210 [4-(tert-butyl)-2-((tert-butylamino)methyl)-6-(5-fluoro-6-(trifluoromethyl)pyridin-3-yl)phenol] was selected as the lead compound due to superior in vitro antimalarial activity against multidrug-resistant Plasmodium falciparum lines, lower in vitro cytotoxicity in mammalian cell lines, longer plasma elimination half-life, and greater in vivo efficacy against murine malaria.

  11. Bariatric surgery: the challenges with candidate selection, individualizing treatment and clinical outcomes

    Directory of Open Access Journals (Sweden)

    Neff KJ

    2013-01-01

    Full Text Available Abstract Obesity is recognized as a global health crisis. Bariatric surgery offers a treatment that can reduce weight, induce remission of obesity-related diseases, and improve the quality of life. In this article, we outline the different options in bariatric surgery and summarize the recommendations for selecting and assessing potential candidates before proceeding to surgery. We present current data on post-surgical outcomes and evaluate the psychosocial and economic effects of bariatric surgery. Finally, we evaluate the complication rates and present recommendations for post-operative care.

  12. Treatments of Precipitation Inputs to Hydrologic Models

    Science.gov (United States)

    Hydrological models are used to assess many water resources problems, from agricultural use and water quality to engineering issues. The success of these models is dependent on correct parameterization; the most sensitive input is the rainfall time series. These records can come from land-based ...

  13. A data-driven model for maximization of methane production in a wastewater treatment plant.

    Science.gov (United States)

    Kusiak, Andrew; Wei, Xiupeng

    2012-01-01

    A data-driven approach for maximization of methane production in a wastewater treatment plant is presented. Industrial data collected on a daily basis was used to build the model. Temperature, total solids, volatile solids, detention time and pH value were selected as parameters for the model construction. First, a prediction model of methane production was built by a multi-layer perceptron neural network. Then a particle swarm optimization algorithm was used to maximize methane production based on the model developed in this research. The model resulted in a 5.5% increase in methane production.
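
    A compact sketch of the two-stage approach described above, pairing a neural-network prediction model with a particle swarm search over the controllable inputs; the data, variable ranges and PSO settings are illustrative assumptions rather than the plant's actual values or the authors' exact configuration.

        # Two-stage sketch: (1) fit an MLP predicting methane production from
        # process variables, (2) run particle swarm optimization (PSO) over the
        # controllable inputs to maximise the predicted output.
        # Synthetic data; bounds and hyperparameters are illustrative.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Columns: temperature (C), total solids (%), volatile solids (%),
        # detention time (days), pH.
        lo = np.array([30.0, 2.0, 1.0, 10.0, 6.5])
        hi = np.array([40.0, 8.0, 6.0, 30.0, 7.8])
        X = rng.uniform(lo, hi, size=(500, 5))
        # Synthetic "methane production" with an interior optimum plus noise.
        y = (-(X[:, 0] - 36) ** 2 - 50 * (X[:, 4] - 7.1) ** 2
             + 3 * X[:, 2] + 0.5 * X[:, 3] + rng.normal(0, 1, 500))

        model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=3000,
                             random_state=0).fit(X, y)

        # Simple PSO over the same variable bounds.
        n_particles, n_iter = 30, 100
        pos = rng.uniform(lo, hi, size=(n_particles, 5))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), model.predict(pos)
        gbest = pbest[np.argmax(pbest_val)]

        for _ in range(n_iter):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            val = model.predict(pos)
            improved = val > pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], val[improved]
            gbest = pbest[np.argmax(pbest_val)]

        print("suggested operating point:", np.round(gbest, 2))
        print("predicted methane production:",
              round(float(model.predict(gbest.reshape(1, -1))[0]), 2))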

  14. Clinical study of emergency treatment and selective closed reduction for the treatment of supracondylar humerus fracture in children

    Directory of Open Access Journals (Sweden)

    Wei Zhong

    2016-11-01

    Full Text Available Objective: To study the effect of emergency treatment and selective closed reduction combined with percutaneous Kirschner wire fixation on the treatment of Gartland type-II and type-III supracondylar humerus fractures. Methods: Children who sustained Gartland type-II and type-III supracondylar fractures of the humerus and were treated with selective closed reduction combined with percutaneous Kirschner wire fixation in our hospital from May 2012 to August 2015 were analyzed retrospectively. They were divided into group A (emergency operation group) and group B (selective operation group) according to operation timing. Perioperative situation, blood biochemical parameters, swelling degree and elbow joint function of the affected limb were compared between the two groups. Results: Operation time for patients of group A was significantly shorter than that of group B [(17.19 ± 2.85) vs. (21.43 ± 3.91) min], and the frequency of fluoroscopy during operation in group A was obviously less than that in group B [(6.03 ± 0.95) vs. (7.61 ± 0.92) times]. The swelling index of the affected limb in group A at 3 days, 5 days and 7 days after injury was significantly lower than that in group B [(1.20 ± 0.17) vs. (1.38 ± 0.14), (1.13 ± 0.13) vs. (1.30 ± 0.18), (1.02 ± 0.15) vs. (1.22 ± 0.15)]. Hospital for special surgery score at 1 week, 2 weeks, 3 and 4 weeks after removing the Kirschner wire had no significant difference between groups A and B (88.75 ± 10.18) vs. (89.14 ± 10.52), (94.22 ± 10.85) vs. (93.85 ± 11.08), (95.52 ± 11.27) vs. (95.92 ± 12.19), (95.43 ± 10.96) vs. (96.02 ± 11.38). Contents of serum alanine transaminase, aspartate aminotransferase, total protein, albumin and C-reactive protein in the perioperative period had no obvious difference between patients in groups A and B. Conclusions: Emergency closed reduction combined with percutaneous Kirschner wire fixation for Gartland type-II and type-III supracondylar humerus fracture in children has less trauma, low swelling degree

  15. Clinical study of emergency treatment and selective closed reduction for the treatment of supracondylar humerus fracture in children

    Institute of Scientific and Technical Information of China (English)

    Wei Zhong; Xue-Wen Wang

    2016-01-01

    Objective: To study the effect of emergency treatment, selective closed reduction combined with percutaneous Kirschner wire fixation on the treatment of Gartland type-II and type-III supracondylar humerus fracture. Methods: Children who sustained the Gartland type-II and type-III supracondylar fractures of humerus treated with selective closed reduction combined with percutaneous Kirschner wire fixation in our hospital from May 2012 to August 2015 were analyzed retrospectively. They were divided into group A (emergency operation group) and group B (selective operation group) according to different operation timing. Perioperative situation, blood biochemical parameters, swelling degree and elbow joint function of affected limb were compared between two groups. Results: Operation time for patients of group A was significantly shorter than that of group B [(17.19 ± 2.85) vs. (21.43 ± 3.91) min], and frequency of fluoroscopy during operation of group A was obviously less than that of group B [(6.03 ± 0.95) vs. (7.61 ± 0.92) times]. Swelling index of affected limb in group A at 3 days, 5 days and 7 days after injury was all significantly lower than that in group B [(1.20 ± 0.17) vs. (1.38 ± 0.14), (1.13 ± 0.13) vs. (1.30 ± 0.18), (1.02 ± 0.15) vs. (1.22 ± 0.15)]. Hospital for special surgery score at 1 week, 2 weeks, 3 and 4 weeks after removing Kirschner wire had no significant difference between group A and B (88.75 ± 10.18) vs. (89.14 ± 10.52), (94.22 ± 10.85) vs. (93.85 ± 11.08), (95.52 ± 11.27) vs. (95.92 ± 12.19), (95.43 ± 10.96) vs. (96.02 ± 11.38). Contents of serum alanine transaminase, aspartate aminotransferase, total protein, albumin and C-reactive protein in perioperative period had no obvious difference between patients in group A and B. Conclusions: Emergency closed reduction combined with percutaneous Kirschner wire fixation for Gartland type-II and type-III supracondylar humerus fracture in children has less trauma, low swelling

  16. Bioactive treatment promotes osteoblast differentiation on titanium materials fabricated by selective laser melting technology.

    Science.gov (United States)

    Tsukanaka, Masako; Fujibayashi, Shunsuke; Takemoto, Mitsuru; Matsushita, Tomiharu; Kokubo, Tadashi; Nakamura, Takashi; Sasaki, Kiyoyuki; Matsuda, Shuichi

    2016-01-01

    Selective laser melting (SLM) technology is useful for the fabrication of porous titanium implants with complex shapes and structures. The materials fabricated by SLM characteristically have a very rough surface (average surface roughness, Ra=24.58 µm). In this study, we evaluated morphologically and biochemically the specific effects of this very rough surface and the additional effects of a bioactive treatment on osteoblast proliferation and differentiation. Flat-rolled titanium materials (Ra=1.02 µm) were used as the controls. On the treated materials fabricated by SLM, we observed enhanced osteoblast differentiation compared with the flat-rolled materials and the untreated materials fabricated by SLM. No significant differences were observed between the flat-rolled materials and the untreated materials fabricated by SLM in their effects on osteoblast differentiation. We concluded that the very rough surface fabricated by SLM had to undergo a bioactive treatment to obtain a positive effect on osteoblast differentiation.

  17. Quantitative EEG Brain Mapping In Psychotropic Drug Development, Drug Treatment Selection, and Monitoring.

    Science.gov (United States)

    Itil, Turan M.; Itil, Kurt Z.

    1995-05-01

    Quantification of standard electroencephalogram (EEG) by digital computers [computer-analyzed EEG (CEEG)] has transformed the subjective analog EEG into an objective scientific method. Until a few years ago, CEEG was only used to assist in the development of psychotropic drugs by means of the quantitative pharmaco EEG. Thanks to the computer revolution and the accompanying reductions in cost of quantification, CEEG can now also be applied in psychiatric practice. CEEG can assist the physician in confirming clinical diagnoses, selecting psychotropic drugs for treatment, and drug treatment monitoring. Advancements in communications technology allow physicians and researchers to reduce the costs of acquiring a high-technology CEEG brain mapping system by utilizing the more economical telephonic services.

  18. Forecasting macroeconomic variables using neural network models and three automated model selection techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2016-01-01

    When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. To alleviate the problem, White (2006) presented a solution (QuickNet) that converts the specification and nonlinear estimation problem into a linear model selection and estimation problem. We shall compare its performance to that of two other procedures building on the linearization idea: the Marginal Bridge Estimator and Autometrics. Second, one must decide whether forecasting...
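
    The linearization idea mentioned above can be illustrated with a small sketch: generate many candidate hidden units with random weights and let a sparse linear fit decide which ones enter the model. LassoCV stands in here for a generic linear selection rule; this is a conceptual illustration only, not the QuickNet, Marginal Bridge Estimator or Autometrics algorithms themselves.

        # Turning neural-network specification into a linear model-selection
        # problem: random candidate hidden units, then a sparse linear fit
        # (LassoCV as a stand-in) selects which units enter the model.
        import numpy as np
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)

        # Synthetic data with a nonlinear data-generating process.
        x = rng.uniform(-3, 3, size=(400, 2))
        y = (np.tanh(1.5 * x[:, 0]) - 0.8 * np.tanh(x[:, 0] - x[:, 1])
             + rng.normal(0, 0.1, 400))

        # Candidate hidden units: logistic activations with random weights/biases.
        n_candidates = 100
        W = rng.normal(0, 2, size=(2, n_candidates))
        b = rng.normal(0, 2, size=n_candidates)
        H = 1.0 / (1.0 + np.exp(-(x @ W + b)))      # shape (400, n_candidates)

        # Linear selection/estimation step over the candidate units.
        fit = LassoCV(cv=5, random_state=0).fit(H, y)
        selected = np.flatnonzero(fit.coef_)
        print(f"{len(selected)} of {n_candidates} candidate hidden units selected")
        print("in-sample R^2:", round(fit.score(H, y), 3))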

  19. Impact of periodic selective mebendazole treatment on soil-transmitted helminth infections in Cuban schoolchildren.

    Science.gov (United States)

    van der Werff, Suzanne D; Vereecken, Kim; van der Laan, Kim; Campos Ponce, Maiza; Junco Díaz, Raquel; Núñez, Fidel A; Rojas Rivero, Lázara; Bonet Gorbea, Mariano; Polman, Katja

    2014-06-01

    To evaluate the impact of periodic selective treatment with 500 mg mebendazole on soil-transmitted helminth (STH) infections in Cuban schoolchildren. We followed up a cohort of 268 STH-positive schoolchildren, aged 5-14 years at baseline, at six-month intervals for two years and a final follow-up after three years. Kato-Katz stool examination was used to detect infections with Ascaris lumbricoides, Trichuris trichiura and hookworm. Common risk factors related to STHs were assessed by parental questionnaire. A significant reduction in the number of STH infections was obtained after three years with the highest reduction for T. trichiura (87.8%) and the lowest for hookworm (57.9%). After six months, cure rates (CRs) were 76.9% for A. lumbricoides, 67.4% for T. trichiura and 44.4% for hookworm. After two treatment rounds, more than 75% of all STH-positive children at baseline were cured, but with important differences between STH species (95.2% for A. lumbricoides, 80.5% for T. trichiura and 76.5% for hookworm). At the end of the study, these cumulative CRs were almost 100% for all three STHs. Risk factors for STHs were sex, sanitary disposal and habit of playing in the soil. Our results indicate that periodic selective treatment with 500 mg mebendazole is effective in reducing the number of STH infections in Cuban schoolchildren. Although important differences were found between helminth species, two rounds of treatment appeared sufficient to obtain substantial reductions. © 2014 John Wiley & Sons Ltd.

  20. Selective androgen receptor modulators for the prevention and treatment of muscle wasting associated with cancer.

    Science.gov (United States)

    Dalton, James T; Taylor, Ryan P; Mohler, Michael L; Steiner, Mitchell S

    2013-12-01

    This review highlights selective androgen receptor modulators (SARMs) as emerging agents in late-stage clinical development for the prevention and treatment of muscle wasting associated with cancer. Muscle wasting, including a loss of skeletal muscle, is a cancer-related symptom that begins early in the progression of cancer and affects a patient's quality of life, ability to tolerate chemotherapy, and survival. SARMs increase muscle mass and improve physical function in healthy and diseased individuals, and potentially may provide a new therapy for muscle wasting and cancer cachexia. SARMs modulate the same anabolic pathways targeted by classical steroidal androgens, but within the dose range in which the expected effects on muscle mass and function are seen, androgenic side effects on the prostate, skin, and hair have not been observed. Unlike testosterone, SARMs are orally active, nonaromatizable, nonvirilizing, and tissue-selective anabolic agents. Recent clinical efficacy data for LGD-4033, MK-0773, MK-3984, and enobosarm (GTx-024, ostarine, and S-22) are reviewed. Enobosarm, a nonsteroidal SARM, is the most well characterized clinically, and has consistently demonstrated increases in lean body mass and better physical function across several populations, along with a lower hazard ratio for survival in cancer patients. Completed in May 2013, the Phase III clinical trials entitled Prevention and treatment Of muscle Wasting in patiEnts with Cancer1 (POWER1) and POWER2, evaluating enobosarm for the prevention and treatment of muscle wasting in patients with nonsmall cell lung cancer, will report results soon and will potentially establish a SARM, enobosarm, as the first drug for the prevention and treatment of muscle wasting in cancer patients.