WorldWideScience

Sample records for regression approach electronic

  1. QUANTITATIVE ELECTRONIC STRUCTURE - ACTIVITY RELATIONSHIP OF ANTIMALARIAL COMPOUND OF ARTEMISININ DERIVATIVES USING PRINCIPAL COMPONENT REGRESSION APPROACH

    Directory of Open Access Journals (Sweden)

    Paul Robert Martin Werfette

    2010-06-01

Full Text Available Analysis of the quantitative structure - activity relationship (QSAR) for a series of antimalarial artemisinin derivatives has been done using principal component regression. The descriptors for the QSAR study were representations of electronic structure, i.e. atomic net charges of the artemisinin skeleton calculated by the AM1 semi-empirical method. The antimalarial activity of each compound was expressed as log 1/IC50, which is an experimental quantity. The main purpose of the principal component analysis approach is to transform the large data set of atomic net charges into a simpler data set known as latent variables. The best QSAR equation for log 1/IC50 can then be obtained by regression as a linear function of five latent variables, i.e. x1, x2, x3, x4 and x5, so the best QSAR model has the form log 1/IC50 = b0 + b1x1 + b2x2 + b3x3 + b4x4 + b5x5. Keywords: QSAR, antimalarial, artemisinin, principal component regression
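    A minimal sketch of principal component regression in the spirit of this record, assuming scikit-learn; the charge matrix, activities, and the choice of five components are fabricated for illustration and are not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# hypothetical data: rows = artemisinin derivatives, columns = atomic net charges
rng = np.random.default_rng(0)
charges = rng.normal(0.0, 0.2, size=(20, 15))          # 15 skeleton atoms
log_inv_ic50 = charges[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.normal(0, 0.1, 20)

# principal component regression: compress the charges into 5 latent
# variables, then fit a linear model on those latent variables
pcr = make_pipeline(PCA(n_components=5), LinearRegression())
pcr.fit(charges, log_inv_ic50)
print(pcr.score(charges, log_inv_ic50))                 # R^2 of the QSAR fit
```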

  2. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.

  3. Forecasting exchange rates: a robust regression approach

    OpenAIRE

    Preminger, Arie; Franck, Raphael

    2005-01-01

The least squares estimation method, like other ordinary estimation methods for regression models, can be severely affected by a small number of outliers, thus providing poor out-of-sample forecasts. This paper suggests a robust regression approach, based on the S-estimation method, to construct forecasting models that are less sensitive to data contamination by outliers. A robust linear autoregressive (RAR) model and a robust neural network (RNN) model are estimated to study the predictabil...
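    A hedged illustration of why robust estimation matters here, assuming statsmodels; since statsmodels does not ship an S-estimator, a Huber M-estimator stands in for the paper's method, and all data are synthetic.

```python
import numpy as np
import statsmodels.api as sm

# synthetic AR(1)-style series with a few gross outliers
rng = np.random.default_rng(7)
e = rng.normal(0, 1, 300)
e[::50] += 15                                  # contamination
r = np.zeros(300)
for t in range(1, 300):
    r[t] = 0.4 * r[t - 1] + e[t]

X = sm.add_constant(r[:-1])
ols_fit = sm.OLS(r[1:], X).fit()
# M-estimation with a Huber norm as a stand-in for the paper's S-estimator
rlm_fit = sm.RLM(r[1:], X, M=sm.robust.norms.HuberT()).fit()
print(ols_fit.params[1], rlm_fit.params[1])    # robust slope is less distorted
```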

  4. Crime Modeling using Spatial Regression Approach

    Science.gov (United States)

    Saleh Ahmar, Ansari; Adiatma; Kasim Aidid, M.

    2018-01-01

Criminal activity in Indonesia increases in both variety and quantity every year, with cases such as murder, rape, assault, vandalism, theft, fraud, and fencing making people feel unsafe. The risk of a society being exposed to crime is measured by the number of cases reported to the police: the more reports a police institution receives, the higher the crime level in the region. In this research, criminality in South Sulawesi, Indonesia is modeled with the society's exposure to the risk of crime as the dependent variable. Modeling follows an areal approach using the Spatial Autoregressive (SAR) and Spatial Error Model (SEM) methods, as sketched below. The independent variables are population density, the number of poor inhabitants, GDP per capita, unemployment, and the human development index (HDI). The spatial regression analysis shows no spatial dependencies, in either lag or errors, in South Sulawesi.
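    A minimal sketch of the SAR and SEM fits with the PySAL stack (libpysal and spreg, assumed installed) on a synthetic lattice; the covariate layout mirrors the abstract, but the data and grid are fabricated.

```python
import numpy as np
from libpysal.weights import lat2W
from spreg import ML_Lag, ML_Error

# synthetic lattice standing in for South Sulawesi districts (hypothetical data)
rng = np.random.default_rng(5)
w = lat2W(6, 4)                # 24 areal units on a 6x4 grid, rook contiguity
w.transform = "r"              # row-standardized weights
n = w.n
X = rng.normal(size=(n, 5))    # density, poverty, GDP per capita, unemployment, HDI
y = (1 + X @ np.array([0.4, 0.3, -0.2, 0.25, -0.3])
     + rng.normal(0, 1, n)).reshape(-1, 1)

sar = ML_Lag(y, X, w=w)        # spatial autoregressive (lag) model
sem = ML_Error(y, X, w=w)      # spatial error model
print(sar.rho, sem.lam)        # spatial dependence parameters; near zero here
```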

  5. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a...

  6. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
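    The recipe in this abstract translates directly into a small power routine. The sketch below, assuming SciPy, centres the two logits at the overall rate (which approximately preserves the expected number of events) and applies the usual normal approximation for comparing two proportions; the inputs are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def power_logistic(beta, sd_x, p_overall, n, alpha=0.05):
    """Approximate power for testing H0: beta = 0 in logistic regression via
    the equivalent two-sample problem: two samples of size n/2 whose logits
    differ by beta * 2 * sd_x, with the overall event rate held near p_overall."""
    logit0 = np.log(p_overall / (1 - p_overall))
    delta = beta * 2 * sd_x                      # logit difference between groups
    p1 = 1 / (1 + np.exp(-(logit0 - delta / 2)))
    p2 = 1 / (1 + np.exp(-(logit0 + delta / 2)))
    # normal-approximation power for comparing two proportions, n/2 per group
    pbar = (p1 + p2) / 2
    se0 = np.sqrt(2 * pbar * (1 - pbar) / (n / 2))
    se1 = np.sqrt(p1 * (1 - p1) / (n / 2) + p2 * (1 - p2) / (n / 2))
    z = norm.ppf(1 - alpha / 2)
    return norm.cdf((abs(p2 - p1) - z * se0) / se1)

print(power_logistic(beta=0.5, sd_x=1.0, p_overall=0.3, n=200))
```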

  7. Electronics a systems approach

    CERN Document Server

    Storey, Neil

    2017-01-01

    Electronics plays a central role in our everyday lives. It is at the heart of almost all of today's essential technology, from mobile phones to computers and from cars to power stations. As such, all engineers, scientists and technologists need to have a fundamental understanding of this exciting subject, and for many this will just be the beginning. Now in its sixth edition, Electronics: A Systems Approach provides an outstanding introduction to this fast-moving and important field. Comprehensively revised and updated to cover the latest developments in the world of electronics, the text continues to use Neil Storey's established and well-respected systems approach. It introduces the basic concepts first before progressing to a more advanced analysis, enabling you to contextualise what a system is designed to achieve before tackling the intricacies of designing or analysing its various components with confidence. This book is accompanied by a website which contains over 100 video tutorials to help explain ke...

  8. bayesQR: A Bayesian Approach to Quantile Regression

    Directory of Open Access Journals (Sweden)

    Dries F. Benoit

    2017-01-01

Full Text Available After its introduction by Koenker and Bassett (1978), quantile regression has become an important and popular tool to investigate the conditional response distribution in regression. The R package bayesQR contains a number of routines to estimate quantile regression parameters using a Bayesian approach based on the asymmetric Laplace distribution. The package contains functions for the typical quantile regression with continuous dependent variable, but also supports quantile regression for binary dependent variables. For both types of dependent variables, an approach to variable selection using the adaptive lasso approach is provided. For the binary quantile regression model, the package also contains a routine that calculates the fitted probabilities for each vector of predictors. In addition, functions for summarizing the results, creating traceplots, posterior histograms and drawing quantile plots are included. This paper starts with a brief overview of the theoretical background of the models used in the bayesQR package. The main part of this paper discusses the computational problems that arise in the implementation of the procedure and illustrates the usefulness of the package through selected examples.
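    bayesQR itself is an R package; to keep the examples in one language, the sketch below shows, in Python, the check-loss/asymmetric-Laplace connection its posterior is built on. This is a frequentist point estimate, not the package's Bayesian sampler, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    # the asymmetric "check" loss; its negative exponential is (up to scale)
    # an asymmetric Laplace likelihood, the basis of bayesQR's posterior
    return np.sum(u * (tau - (u < 0)))

def quantile_reg(X, y, tau):
    # point estimate only; a Bayesian treatment would sample the posterior
    res = minimize(lambda b: check_loss(y - X @ b, tau),
                   np.zeros(X.shape[1]), method="Nelder-Mead")
    return res.x

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(3, 300)
print(quantile_reg(X, y, 0.5), quantile_reg(X, y, 0.9))
```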

  9. Approaches to Low Fuel Regression Rate in Hybrid Rocket Engines

    Directory of Open Access Journals (Sweden)

    Dario Pastrone

    2012-01-01

Full Text Available Hybrid rocket engines are promising propulsion systems which present appealing features such as safety, low cost, and environmental friendliness. On the other hand, certain issues hamper the development hoped for. The present paper discusses approaches addressing improvements to one of the most important of these issues: low fuel regression rate. To highlight the consequences of this issue and to better understand the concepts proposed, fundamentals are summarized. Two approaches are presented (multiport grain and high mixture ratio) which aim at reducing negative effects without enhancing the regression rate. Furthermore, fuel material changes and nonconventional geometries of grain and/or injector are presented as methods to increase the fuel regression rate. Although most of these approaches are still at the laboratory or concept scale, many of them are promising.
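    The fundamentals the paper summarizes revolve around the empirical regression-rate law rdot = a * Gox^n; the sketch below evaluates it with placeholder constants of roughly HTPB-like magnitude (a and n are fuel/oxidizer specific and are not values from the paper).

```python
def regression_rate(g_ox, a=3.6e-5, n=0.68):
    """Empirical hybrid-fuel regression-rate law rdot = a * Gox**n.
    g_ox: oxidizer mass flux (kg/m^2/s); returns rdot in m/s.
    The constants a and n are illustrative placeholders only."""
    return a * g_ox ** n

# doubling the port mass flux raises the regression rate by only 2**n ~ 1.6,
# which is why low regression rate is hard to fix by operating conditions alone
print(regression_rate(100.0), regression_rate(200.0))
```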

  10. Analysing inequalities in Germany a structured additive distributional regression approach

    CERN Document Server

    Silbersdorff, Alexander

    2017-01-01

This book seeks new perspectives on the growing inequalities that our societies face, putting forward Structured Additive Distributional Regression as a means of statistical analysis that circumvents the common problem of analytical reduction to simple point estimators. This new approach allows the discrepancy between the individuals' realities and the abstract representation of those realities that arises when using the arithmetic mean alone to be explicitly taken into consideration. In turn, the method is applied to the question of economic inequality in Germany.

  11. Bayesian approach to errors-in-variables in regression models

    Science.gov (United States)

    Rozliman, Nur Aainaa; Ibrahim, Adriana Irawati Nur; Yunus, Rossita Mohammad

    2017-05-01

In many applications and experiments, data sets are often contaminated with error or mismeasured covariates. When at least one of the covariates in a model is measured with error, an Errors-in-Variables (EIV) model can be used. Measurement error, when not corrected, causes misleading statistical inference and analysis. Therefore, our goal is to examine the relationship of the outcome variable and the unobserved exposure variable, given the observed mismeasured surrogate, by applying the Bayesian formulation to the EIV model. We extend the flexible parametric method proposed by Hossain and Gustafson (2009) to another nonlinear regression model, the Poisson regression model. We then illustrate the application of this approach via a simulation study using Markov chain Monte Carlo sampling methods.
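    One way to realize a Bayesian EIV Poisson model of this kind is with PyMC (assumed available); the priors, measurement-error scale, and data below are invented for illustration and are not the authors' specification.

```python
import numpy as np
import pymc as pm

# synthetic data: true covariate observed only through a noisy surrogate
rng = np.random.default_rng(6)
n = 200
x_true_data = rng.normal(0, 1, n)
x_obs_data = x_true_data + rng.normal(0, 0.5, n)       # mismeasured surrogate
y_data = rng.poisson(np.exp(0.3 + 0.7 * x_true_data))

with pm.Model():
    b0 = pm.Normal("b0", 0, 5)
    b1 = pm.Normal("b1", 0, 5)
    x_true = pm.Normal("x_true", 0, 1, shape=n)        # exposure model
    pm.Normal("x_obs", mu=x_true, sigma=0.5, observed=x_obs_data)  # measurement model
    pm.Poisson("y", mu=pm.math.exp(b0 + b1 * x_true), observed=y_data)
    idata = pm.sample(1000, tune=1000, chains=2)       # MCMC posterior draws
```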

  12. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

Steed, Chad A [ORNL]; Swan II, J. Edward [Mississippi State University (MSU)]; Fitzpatrick, Patrick J. [Mississippi State University (MSU)]; Jankun-Kelly, T.J. [Mississippi State University (MSU)]

    2012-02-01

New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  13. A Practical pedestrian approach to parsimonious regression with inaccurate inputs

    Directory of Open Access Journals (Sweden)

    Seppo Karrila

    2014-04-01

Full Text Available A measurement result often dictates an interval containing the correct value. Interval data are also created by roundoff, truncation, and binning. We focus on such common interval uncertainty in data. Inaccuracy in model inputs is typically ignored in model fitting. We provide a practical approach for regression with inaccurate data: the mathematics is easy, and the linear programming formulations are simple to use, even in a spreadsheet. This self-contained elementary presentation introduces interval linear systems and requires only basic knowledge of algebra. Feature selection is automatic, but can be controlled to find only the few most relevant inputs, and joint feature selection is enabled for multiple modeled outputs. With more features than cases, a novel connection to compressed sensing emerges: robustness against interval errors-in-variables implies model parsimony, and the input inaccuracies determine the regularization term. A small numerical example highlights counterintuitive results and a dramatic difference from total least squares.
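    A minimal sketch, assuming SciPy, of one plausible LP formulation behind these claims: minimize the l1 norm of the coefficients subject to every fitted value landing in its observation interval, which yields parsimony exactly as the compressed-sensing connection suggests. The paper's actual formulations may differ in detail.

```python
import numpy as np
from scipy.optimize import linprog

def interval_regression_l1(X, lower, upper):
    """Parsimonious interval regression: minimize ||beta||_1 subject to
    lower_i <= x_i . beta <= upper_i for every observation (assumed form)."""
    n, p = X.shape
    # split beta = beta_pos - beta_neg, both nonnegative; objective = their sum
    c = np.ones(2 * p)
    A_ub = np.vstack([np.hstack([X, -X]),      #  X beta <= upper
                      np.hstack([-X, X])])     # -X beta <= -lower
    b_ub = np.concatenate([upper, -lower])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")
    return res.x[:p] - res.x[p:]

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 10))
beta_true = np.zeros(10)
beta_true[:3] = [1.5, -2.0, 0.5]               # sparse ground truth
y = X @ beta_true
print(np.round(interval_regression_l1(X, y - 0.5, y + 0.5), 2))
```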

  14. Does intense monitoring matter? A quantile regression approach

    Directory of Open Access Journals (Sweden)

    Fekri Ali Shawtari

    2017-06-01

Full Text Available Corporate governance has become a centre of attention in corporate management, at both micro and macro levels, due to the adverse consequences and repercussions of insufficient accountability. In this study, we use the Malaysian stock market as a sample to explore the impact of intense monitoring on the relationship between intellectual capital performance and market valuation. The objectives of the paper are threefold: (i) to investigate whether intense monitoring affects the intellectual capital performance of listed companies; (ii) to explore the impact of intense monitoring on firm value; (iii) to examine the extent to which directors serving on more than two board committees affect the linkage between intellectual capital performance and firms' value. We employ two approaches, namely Ordinary Least Squares (OLS) and the quantile regression approach, the purpose of the latter being to estimate and generate inference about conditional quantile functions. This method is useful when the conditional distribution does not have a standard shape, such as an asymmetric, fat-tailed, or truncated distribution. In terms of variables, intellectual capital is measured using the value added intellectual coefficient (VAIC), while market valuation is proxied by the firm's market capitalization. The findings of the quantile regression show that some of the results do not coincide with the results of OLS. We find that intensity of monitoring does not influence the intellectual capital of all firms. It is also evident that intensity of monitoring does not influence market valuation. However, to some extent, it moderates the relationship between intellectual capital performance and market valuation. This paper contributes to the existing literature as it presents new empirical evidence on the moderating effects of the intensity of monitoring of the board committees on the relationship between performance and intellectual capital.
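    The OLS-versus-quantile-regression comparison described here is easy to reproduce in miniature with statsmodels; the data below are fabricated stand-ins for VAIC scores and market capitalization, not the study's sample.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical data: firm market capitalization vs. a VAIC-style score,
# with noise that grows with the score so quantile slopes differ
rng = np.random.default_rng(0)
df = pd.DataFrame({"vaic": rng.normal(5, 2, 500)})
df["mcap"] = 2 + 0.8 * df["vaic"] + rng.standard_t(3, 500) * (1 + 0.3 * df["vaic"].abs())

ols_fit = smf.ols("mcap ~ vaic", df).fit()
for q in (0.1, 0.5, 0.9):
    qr_fit = smf.quantreg("mcap ~ vaic", df).fit(q=q)
    print(q, qr_fit.params["vaic"])    # slope varies across the distribution
print("OLS", ols_fit.params["vaic"])   # a single conditional-mean slope
```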

  15. Bayesian logistic regression approaches to predict incorrect DRG assignment.

    Science.gov (United States)

    Suleiman, Mani; Demirhan, Haydar; Boyd, Leanne; Girosi, Federico; Aksakalli, Vural

    2018-05-07

Episodes of care involving similar diagnoses and treatments and requiring similar levels of resource utilisation are grouped to the same Diagnosis-Related Group (DRG). In jurisdictions which implement DRG-based payment systems, DRGs are a major determinant of funding for inpatient care. Hence, service providers often dedicate auditing staff to the task of checking that episodes have been coded to the correct DRG. The use of statistical models to estimate an episode's probability of DRG error can significantly improve the efficiency of clinical coding audits. This study implements Bayesian logistic regression models with weakly informative prior distributions to estimate the likelihood that episodes require a DRG revision, comparing these models with each other and with classical maximum likelihood estimates. All Bayesian approaches had more stable model parameters than maximum likelihood. The best performing Bayesian model improved overall classification performance by 6% compared to maximum likelihood, a 34% gain compared to random classification. We found that the original DRG, the coder, and the day of coding all have a significant effect on the likelihood of DRG error. Use of Bayesian approaches has improved model parameter stability and classification accuracy. This method has already led to improved audit efficiency in an operational capacity.
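    A small sketch of the weakly informative prior idea, assuming scikit-learn: at the MAP estimate, a Gaussian prior on the coefficients is equivalent to L2-penalized logistic regression, so shrinking C mimics a tighter prior. A full posterior, as in the paper, would require a sampler; the audit data here are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# hypothetical audit data: episode features, label = DRG revision required
rng = np.random.default_rng(9)
X = rng.normal(size=(500, 6))
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] - 0.5))))

# a Gaussian prior on coefficients corresponds, at the MAP estimate, to an
# L2 penalty; a very large C approximates plain maximum likelihood
weakly_informative = LogisticRegression(C=1.0)     # moderate shrinkage
near_ml = LogisticRegression(C=1e6)                # ~ maximum likelihood
for model in (weakly_informative, near_ml):
    model.fit(X, y)
print(weakly_informative.coef_)                    # shrunken, more stable
print(near_ml.coef_)                               # larger, noisier
```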

  16. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking to a real software project and conclude with a glimpse at the challenges for the fu...

  17. Testing for Stock Market Contagion: A Quantile Regression Approach

    NARCIS (Netherlands)

    S.Y. Park (Sung); W. Wang (Wendun); N. Huang (Naijing)

    2015-01-01

Given the asymmetric and leptokurtic behavior of financial data, we propose a new contagion test in the quantile regression framework that is robust to model misspecification. Unlike conventional correlation-based tests, the proposed quantile contagion test...

  18. Identifying predictors of physics item difficulty: A linear regression approach

    Science.gov (United States)

    Mesic, Vanes; Muratovic, Hasnija

    2011-06-01

Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinarily difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. First, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal physics knowledge.

  19. Identifying predictors of physics item difficulty: A linear regression approach

    Directory of Open Access Journals (Sweden)

    Hasnija Muratovic

    2011-06-01

Full Text Available Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinarily difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. First, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal...

  20. The price sensitivity of Medicare beneficiaries: a regression discontinuity approach.

    Science.gov (United States)

    Buchmueller, Thomas C; Grazier, Kyle; Hirth, Richard A; Okeke, Edward N

    2013-01-01

    We use 4 years of data from the retiree health benefits program of the University of Michigan to estimate the effect of price on the health plan choices of Medicare beneficiaries. During the period of our analysis, changes in the University's premium contribution rules led to substantial price changes. A key feature of this 'natural experiment' is that individuals who had retired before a certain date were exempted from having to pay any premium contributions. This 'grandfathering' creates quasi-experimental variation that is ideal for estimating the effect of price. Using regression discontinuity methods, we compare the plan choices of individuals who retired just after the grandfathering cutoff date and were therefore exposed to significant price changes to the choices of a 'control group' of individuals who retired just before that date and therefore did not experience the price changes. The results indicate a statistically significant effect of price, with a $10 increase in monthly premium contributions leading to a 2 to 3 percentage point decrease in a plan's market share. Copyright © 2012 John Wiley & Sons, Ltd.
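    The design described above boils down to a local linear regression with a treatment jump at the cutoff. The sketch below, assuming statsmodels and wholly synthetic data (the study's data are not reproduced in this record), shows the standard estimating regression.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical running variable: retirement date relative to the
# grandfathering cutoff (days); outcome: indicator of switching plans
rng = np.random.default_rng(2)
days = rng.uniform(-365, 365, 2000)
exposed = (days > 0).astype(int)               # retired after the cutoff
switch = rng.binomial(1, 0.2 + 0.08 * exposed + 0.0001 * days)

df = pd.DataFrame({"switch": switch, "days": days, "exposed": exposed})
bw = 180                                        # local bandwidth around the cutoff
local = df[df["days"].abs() < bw]
# local linear regression with separate slopes on each side of the cutoff
fit = smf.ols("switch ~ exposed + days + exposed:days", local).fit()
print(fit.params["exposed"])    # jump at the cutoff = effect of price exposure
```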

  1. Efficient and robust cell detection: A structured regression approach.

    Science.gov (United States)

    Xie, Yuanpu; Xing, Fuyong; Shi, Xiaoshuang; Kong, Xiangfei; Su, Hai; Yang, Lin

    2018-02-01

Efficient and robust cell detection serves as a critical prerequisite for many subsequent biomedical image analysis methods and computer-aided diagnosis (CAD). It remains a challenging task due to touching cells, inhomogeneous background noise, and large variations in cell sizes and shapes. In addition, the ever-increasing amount of available datasets and the high resolution of whole-slide scanned images pose a further demand for efficient processing algorithms. In this paper, we present a novel structured regression model based on a proposed fully residual convolutional neural network for efficient cell detection. For each testing image, our model learns to produce a dense proximity map that exhibits higher responses at locations near cell centers. Our method only requires a few training images with weak annotations (just one dot indicating each cell centroid). We have extensively evaluated our method using four different datasets, covering different microscopy staining methods (e.g., H & E or Ki-67 staining) and image acquisition techniques (e.g., bright-field or phase contrast imaging). Experimental results demonstrate the superiority of our method over existing state-of-the-art methods in terms of both detection accuracy and running time. Copyright © 2017. Published by Elsevier B.V.
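    The regression target in this family of methods is built from the dot annotations; below is one plausible construction of such a proximity map in NumPy (the kernel shape and constants are assumptions, not the paper's exact definition).

```python
import numpy as np

def proximity_map(shape, centroids, alpha=3.0, radius=8):
    """Regression target from dot annotations: a map with high response near
    each annotated cell centre, decaying to zero beyond a small radius."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    out = np.zeros(shape)
    for cy, cx in centroids:
        d = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
        # exponential decay inside the support disk, zero outside
        bump = np.where(d <= radius, np.exp(alpha * (1 - d / radius)) - 1, 0.0)
        out = np.maximum(out, bump)
    return out / out.max() if out.max() > 0 else out

target = proximity_map((64, 64), [(20, 20), (40, 45)])   # two annotated cells
print(target.max(), target.mean())
```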

  2. Neighborhood Effects in Wind Farm Performance: A Regression Approach

    Directory of Open Access Journals (Sweden)

    Matthias Ritter

    2017-03-01

Full Text Available The optimization of turbine density in wind farms entails a trade-off between the usage of scarce, expensive land and power losses through turbine wake effects. A quantification and prediction of the wake effect, however, is challenging because of the complex aerodynamic nature of the interdependencies of turbines. In this paper, we propose a parsimonious data-driven regression wake model that can be used to predict production losses of existing and potential wind farms. Motivated by simple engineering wake models, the predicting variables are wind speed, the turbine alignment angle, and distance. By utilizing data from two wind farms in Germany, we show that our models can compete with the standard Jensen model in predicting wake effect losses. A scenario analysis reveals that the distance between turbines can be reduced to as little as three times the rotor size without entailing substantial production losses. In contrast, an unfavorable configuration of turbines with respect to the main wind direction can result in production losses that are much higher than in the optimal case.
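    The "standard Jensen model" used as the benchmark has a closed form; a minimal sketch follows, with a wake-decay constant of 0.075 as a common onshore assumption (not a value from the paper).

```python
import numpy as np

def jensen_deficit(ct, x, rotor_d, k=0.075):
    """Fractional wind-speed deficit a distance x (m) downstream of a turbine
    under the Jensen (top-hat) wake model; ct is the thrust coefficient and
    k the wake-decay constant (~0.075 is a typical onshore assumption)."""
    r0 = rotor_d / 2
    return (1 - np.sqrt(1 - ct)) / (1 + k * x / r0) ** 2

# deficit three rotor diameters downstream for a thrust coefficient of 0.8
print(jensen_deficit(ct=0.8, x=3 * 82, rotor_d=82))
```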

  3. Detection of Differential Item Functioning with Nonlinear Regression: A Non-IRT Approach Accounting for Guessing

    Science.gov (United States)

    Drabinová, Adéla; Martinková, Patrícia

    2017-01-01

In this article we present a general approach, not relying on item response theory models (non-IRT), to detect differential item functioning (DIF) in dichotomous items in the presence of guessing. The proposed nonlinear regression (NLR) procedure for DIF detection is an extension of the method based on logistic regression. As a non-IRT approach, NLR can…
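    A sketch of what such an NLR DIF model can look like: a logistic curve with a guessing asymptote and group-specific shifts, fit by ordinary nonlinear least squares with SciPy. The parameterization is illustrative; the authors' exact specification may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def irc(xg, a, b, a_dif, b_dif, c):
    """Item response curve with guessing floor c; a_dif and b_dif shift the
    slope and location for the focal group (g = 1), giving a non-IRT DIF test."""
    x, g = xg
    z = (a + a_dif * g) * (x - (b + b_dif * g))
    return c + (1 - c) / (1 + np.exp(-z))

# x: standardized total score, g: group indicator, y: 0/1 item response
rng = np.random.default_rng(3)
x = rng.normal(size=1000)
g = rng.binomial(1, 0.5, 1000)
y = rng.binomial(1, irc((x, g), 1.2, 0.0, 0.4, 0.3, 0.2))

popt, pcov = curve_fit(irc, (x, g), y, p0=[1, 0, 0, 0, 0.15], maxfev=10000)
print(popt)   # DIF is flagged when a_dif or b_dif differs significantly from 0
```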

  4. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

Contents: Introductory Statistical Inference and Regression Analysis; Elementary Statistical Inference; Regression Analysis; Experiments, the Completely Randomized Design (CRD): Classical and Regression Approaches; Experiments; Experiments to Compare Treatments; Some Basic Ideas; Requirements of a Good Experiment; One-Way Experimental Layout or the CRD: Design and Analysis; Analysis of Experimental Data (Fixed Effects Model); Expected Values for the Sums of Squares; The Analysis of Variance (ANOVA) Table; Follow-Up Analysis to Check fo...

  5. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    Science.gov (United States)

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
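    The modified Poisson plus GEE combination evaluated here is directly available in statsmodels; the sketch below fits it to synthetic clustered binary data (the cluster structure and effect sizes are invented).

```python
import numpy as np
import statsmodels.api as sm

# hypothetical clustered binary outcome data with a true relative risk of 1.5
rng = np.random.default_rng(1)
n_clusters, m = 100, 4
groups = np.repeat(np.arange(n_clusters), m)
x = rng.binomial(1, 0.5, n_clusters * m)
u = np.repeat(rng.normal(0, 0.3, n_clusters), m)          # cluster effect
p = np.clip(np.exp(-1.2 + np.log(1.5) * x + u), 0, 1)
y = rng.binomial(1, p)

X = sm.add_constant(x)
# log-Poisson model for a binary outcome, clustering handled by GEE;
# GEE reports robust (sandwich) standard errors by default
fit = sm.GEE(y, X, groups=groups, family=sm.families.Poisson(),
             cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(fit.params[1]))                              # estimated relative risk
```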

  6. Fuzzy multinomial logistic regression analysis: A multi-objective programming approach

    Science.gov (United States)

    Abdalla, Hesham A.; El-Sayed, Amany A.; Hamed, Ramadan

    2017-05-01

Parameter estimation for multinomial logistic regression is usually based on maximizing the likelihood function. For large, well-balanced datasets, Maximum Likelihood (ML) estimation is a satisfactory approach. Unfortunately, ML can fail completely, or at least produce poor results in terms of estimated probabilities and confidence intervals of parameters, especially for small datasets. In this study, a new approach based on fuzzy concepts is proposed to estimate the parameters of multinomial logistic regression. The study assumes that the parameters of multinomial logistic regression are fuzzy. Based on the extension principle stated by Zadeh and Bárdossy's proposition, a multi-objective programming approach is suggested to estimate these fuzzy parameters. A simulation study is used to evaluate the performance of the new approach versus the Maximum Likelihood (ML) approach. Results show that the proposed model outperforms ML for small datasets.

  7. Comparative analysis of neural network and regression based condition monitoring approaches for wind turbine fault detection

    DEFF Research Database (Denmark)

    Schlechtingen, Meik; Santos, Ilmar

    2011-01-01

This paper presents the research results of a comparison of three different model-based approaches for wind turbine fault detection in online SCADA data, by applying the developed models to five real measured faults and anomalies. The regression-based model, as the simplest approach to build a normal...

  8. Multiple regression approach to predict turbine-generator output for Chinshan nuclear power plant

    International Nuclear Information System (INIS)

    Chan, Yea-Kuang; Tsai, Yu-Ching

    2017-01-01

    The objective of this study is to develop a turbine cycle model using the multiple regression approach to estimate the turbine-generator output for the Chinshan Nuclear Power Plant (NPP). The plant operating data was verified using a linear regression model with a corresponding 95% confidence interval for the operating data. In this study, the key parameters were selected as inputs for the multiple regression based turbine cycle model. The proposed model was used to estimate the turbine-generator output. The effectiveness of the proposed turbine cycle model was demonstrated by using plant operating data obtained from the Chinshan NPP Unit 2. The results show that this multiple regression based turbine cycle model can be used to accurately estimate the turbine-generator output. In addition, this study also provides an alternative approach with simple and easy features to evaluate the thermal performance for nuclear power plants.

  9. Multiple regression approach to predict turbine-generator output for Chinshan nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Yea-Kuang; Tsai, Yu-Ching [Institute of Nuclear Energy Research, Taoyuan City, Taiwan (China). Nuclear Engineering Division

    2017-03-15

    The objective of this study is to develop a turbine cycle model using the multiple regression approach to estimate the turbine-generator output for the Chinshan Nuclear Power Plant (NPP). The plant operating data was verified using a linear regression model with a corresponding 95% confidence interval for the operating data. In this study, the key parameters were selected as inputs for the multiple regression based turbine cycle model. The proposed model was used to estimate the turbine-generator output. The effectiveness of the proposed turbine cycle model was demonstrated by using plant operating data obtained from the Chinshan NPP Unit 2. The results show that this multiple regression based turbine cycle model can be used to accurately estimate the turbine-generator output. In addition, this study also provides an alternative approach with simple and easy features to evaluate the thermal performance for nuclear power plants.

  10. An Ionospheric Index Model based on Linear Regression and Neural Network Approaches

    Science.gov (United States)

    Tshisaphungo, Mpho; McKinnell, Lee-Anne; Bosco Habarulema, John

    2017-04-01

The ionosphere is well known to reflect radio wave signals in the high frequency (HF) band due to the presence of electrons and ions within the region. To optimise the use of long-distance HF communications, it is important to understand the drivers of ionospheric storms and accurately predict the propagation conditions, especially during disturbed days. This paper presents the development of an ionospheric storm-time index over the South African region for the application of HF communication users. The model will result in a valuable tool to measure the complex ionospheric behaviour in an operational space weather monitoring and forecasting environment. The development of the ionospheric storm-time index is based on data from a single ionosonde station at Grahamstown (33.3°S, 26.5°E), South Africa. Critical frequency of the F2 layer (foF2) measurements for the period 1996-2014 were considered for this study. The model was developed based on linear regression and neural network approaches. In this talk, validation results for low, medium and high solar activity periods will be discussed to demonstrate the model's performance.

  11. Dual Regression

    OpenAIRE

    Spady, Richard; Stouli, Sami

    2012-01-01

    We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution f...

  12. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    Science.gov (United States)

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in the context of structural risk minimization and convex optimization techniques. In the second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategies, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparing the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare SVM-based survival models based on ranking constraints, based on regression constraints, and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models only based on ranking constraints. This work gives empirical evidence that SVM-based models using regression constraints perform significantly better than SVM-based models based on ranking constraints. Our experiments show a comparable performance for methods...
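    The concordance index used as the first performance measure has a simple definition; a reference implementation follows (a plain quadratic-time version, assuming higher prognostic scores mean shorter predicted survival).

```python
import numpy as np

def concordance_index(time, event, score):
    """Concordance index for survival data: among usable pairs (the earlier
    time is an observed event), the fraction where the prognostic score
    orders the pair correctly; ties in score count as half-concordant."""
    num, den = 0.0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            if event[i] == 1 and time[i] < time[j]:   # usable pair
                den += 1
                if score[i] > score[j]:
                    num += 1.0
                elif score[i] == score[j]:
                    num += 0.5
    return num / den

t = [5, 8, 3, 9]; e = [1, 0, 1, 1]; s = [0.9, 0.3, 1.2, 0.1]
print(concordance_index(t, e, s))   # 1.0 for a perfectly ordered toy example
```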

  13. A Quantile Regression Approach to Estimating the Distribution of Anesthetic Procedure Time during Induction.

    Directory of Open Access Journals (Sweden)

    Hsin-Lun Wu

Full Text Available Although procedure time analyses are important for operating room management, it is not easy to extract useful information from clinical procedure time data. A novel approach was proposed to analyze procedure time during anesthetic induction. A two-step regression analysis was performed to explore influential factors of anesthetic induction time (AIT). Linear regression with stepwise model selection was used to select significant correlates of AIT, and quantile regression was then employed to illustrate the dynamic relationships between AIT and the selected variables at distinct quantiles. A total of 1,060 patients were analyzed. First- and second-year residents (R1-R2) required longer AIT than third- and fourth-year residents and attending anesthesiologists (p = 0.006). Factors prolonging AIT included American Society of Anesthesiologists physical status ≧ III; arterial, central venous and epidural catheterization; and use of bronchoscopy. Presence of the surgeon before induction decreased AIT (p < 0.001). Type of surgery also had a significant influence on AIT. Quantile regression satisfactorily estimated the extra time needed to complete induction for each influential factor at distinct quantiles. Our analysis of AIT demonstrated the benefit of quantile regression analysis in providing a more comprehensive view of the relationships between procedure time and related factors. This novel two-step regression approach has potential applications to procedure time analysis in operating room management.

  14. Modeling Personalized Email Prioritization: Classification-based and Regression-based Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Yoo S.; Yang, Y.; Carbonell, J.

    2011-10-24

Email overload, even after spam filtering, presents a serious productivity challenge for busy professionals and executives. One solution is automated prioritization of incoming emails to ensure the most important are read and processed quickly, while others are processed later as/if time permits, in declining priority levels. This paper presents a study of machine learning approaches to email prioritization into discrete levels, comparing ordinal regression versus classifier cascades. Given the ordinal nature of discrete email priority levels, SVM ordinal regression would be expected to perform well, but surprisingly a cascade of SVM classifiers significantly outperforms ordinal regression for email prioritization. In contrast, SVM regression performs well -- better than classifiers -- on selected UCI data sets. This unexpected performance inversion is analyzed and results are presented, providing core functionality for email prioritization systems.

  15. Ordinary Least Squares and Quantile Regression: An Inquiry-Based Learning Approach to a Comparison of Regression Methods

    Science.gov (United States)

    Helmreich, James E.; Krog, K. Peter

    2018-01-01

    We present a short, inquiry-based learning course on concepts and methods underlying ordinary least squares (OLS), least absolute deviation (LAD), and quantile regression (QR). Students investigate squared, absolute, and weighted absolute distance functions (metrics) as location measures. Using differential calculus and properties of convex…
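    The course's central idea, that each distance function induces a location measure, can be stated compactly; the display below records the standard results (not quoted from the article): squared distance yields the mean, absolute distance the median, and the tilted absolute ("check") distance the tau-quantile.

```latex
\[
\operatorname*{arg\,min}_{c}\sum_i (y_i-c)^2=\bar y,
\qquad
\operatorname*{arg\,min}_{c}\sum_i \lvert y_i-c\rvert=\operatorname{median}(y),
\]
\[
\rho_\tau(u)=u\bigl(\tau-\mathbf{1}\{u<0\}\bigr),
\qquad
\operatorname*{arg\,min}_{c}\sum_i \rho_\tau(y_i-c)=\hat q_\tau(y).
\]
```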

  16. Time series modeling by a regression approach based on a latent process.

    Science.gov (United States)

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method, performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.

  17. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    International Nuclear Information System (INIS)

    Ruan, D; Yang, Y; Cao, M; Hu, P; Low, D

    2014-01-01

Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation to electron density has relegated it to an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual/bulk assignment or registration-based bone identification, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with an MR image patch, and the corresponding label is extracted (during training) from a simulation CT aligned to the UTE. Within each class (bone vs. nonbone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both the bone and nonbone dictionaries, and subsequently assigned to the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and has shown generally good performance, with a Dice similarity coefficient greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust to consistent foreign objects (e.g., headsets) and to artefacts related to Gibbs ringing and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine. The scheme...

  18. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ruan, D; Yang, Y; Cao, M; Hu, P; Low, D [UCLA, Los Angeles, CA (United States)

    2014-06-01

Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation to electron density has relegated it to an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual/bulk assignment or registration-based bone identification, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with an MR image patch, and the corresponding label is extracted (during training) from a simulation CT aligned to the UTE. Within each class (bone vs. nonbone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both the bone and nonbone dictionaries, and subsequently assigned to the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and has shown generally good performance, with a Dice similarity coefficient greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust to consistent foreign objects (e.g., headsets) and to artefacts related to Gibbs ringing and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine. The scheme...
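    A toy version of the residual rule in the testing step, assuming scikit-learn: each patch is sparsely coded against each class dictionary (orthogonal matching pursuit stands in for the paper's sparse regression) and assigned to the class with the smaller reconstruction residual. The dictionaries and the patch are random stand-ins.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def classify_patch(patch, d_bone, d_nonbone, k=5):
    """Assign a patch to the dictionary that sparsely reconstructs it with
    the smaller residual; k is the sparsity level of the coding."""
    best_label, best_res = None, np.inf
    for label, D in (("bone", d_bone), ("nonbone", d_nonbone)):
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k,
                                        fit_intercept=False).fit(D, patch)
        res = np.linalg.norm(patch - D @ omp.coef_)
        if res < best_res:
            best_label, best_res = label, res
    return best_label

# toy dictionaries: columns are atoms; patch = flattened image patch (hypothetical)
rng = np.random.default_rng(10)
d_bone = rng.normal(size=(64, 40))
d_nonbone = rng.normal(size=(64, 40))
patch = d_bone @ (rng.normal(size=40) * (rng.random(40) < 0.1))  # sparse in d_bone
print(classify_patch(patch, d_bone, d_nonbone))
```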

  19. A Hybrid Approach of Stepwise Regression, Logistic Regression, Support Vector Machine, and Decision Tree for Forecasting Fraudulent Financial Statements

    Directory of Open Access Journals (Sweden)

    Suduan Chen

    2014-01-01

Full Text Available As fraudulent financial statements become more serious with each passing day, establishing a valid model for forecasting fraudulent financial statements has become an important issue for academic research and financial practice. After screening the important variables using stepwise regression, the study matches logistic regression, support vector machine, and decision tree methods to construct classification models for comparison. The study adopts financial and nonfinancial variables to assist in establishing the forecasting model. The research objects are companies that issued fraudulent and nonfraudulent financial statements between 1998 and 2012. The findings are that financial and nonfinancial information can be used effectively to distinguish fraudulent financial statements, and that the C5.0 decision tree has the best classification accuracy, 85.71%.

  20. A hybrid approach of stepwise regression, logistic regression, support vector machine, and decision tree for forecasting fraudulent financial statements.

    Science.gov (United States)

    Chen, Suduan; Goo, Yeong-Jia James; Shen, Zone-De

    2014-01-01

As fraudulent financial statements become more serious with each passing day, establishing a valid model for forecasting fraudulent financial statements has become an important issue for academic research and financial practice. After screening the important variables using stepwise regression, the study matches logistic regression, support vector machine, and decision tree methods to construct classification models for comparison. The study adopts financial and nonfinancial variables to assist in establishing the forecasting model. The research objects are companies that issued fraudulent and nonfraudulent financial statements between 1998 and 2012. The findings are that financial and nonfinancial information can be used effectively to distinguish fraudulent financial statements, and that the C5.0 decision tree has the best classification accuracy, 85.71%.
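    A compact sketch of the two-stage pipeline with scikit-learn stand-ins: recursive feature elimination approximates the stepwise screening, and CART approximates C5.0, which scikit-learn does not provide; the data are synthetic.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for financial/nonfinancial ratios of fraud vs. non-fraud firms
X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                           random_state=0)

# stage 1: variable screening (RFE with a logistic base model approximates
# the paper's stepwise regression screening)
screen = RFE(LogisticRegression(max_iter=1000), n_features_to_select=6).fit(X, y)
X_sel = X[:, screen.support_]

# stage 2: compare classifiers on the screened variables
for name, clf in [("logit", LogisticRegression(max_iter=1000)),
                  ("svm", SVC()),
                  ("tree", DecisionTreeClassifier(max_depth=4, random_state=0))]:
    print(name, cross_val_score(clf, X_sel, y, cv=5).mean())
```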

  1. A Novel Imbalanced Data Classification Approach Based on Logistic Regression and Fisher Discriminant

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2015-01-01

Full Text Available We introduce an imbalanced data classification approach based on a logistic regression significance discriminant and the Fisher discriminant. First, a key-indicator extraction model based on the logistic regression significance discriminant and correlation analysis is derived to extract features for customer classification. Second, on the basis of linear weighting using the Fisher discriminant, a customer scoring model is established. Then a customer rating model, in which the number of customers in each rating class follows a normal distribution, is constructed. The performance of the proposed model and the classical SVM classification method are evaluated in terms of their ability to correctly classify consumers as default or nondefault customers. Empirical results using data on 2157 customers in financial engineering suggest that the proposed approach performs better than the SVM model in dealing with imbalanced data classification. Moreover, our approach helps locate qualified customers for banks and bond investors.

  2. A computational approach to compare regression modelling strategies in prediction research.

    Science.gov (United States)

    Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H

    2016-08-25

    It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9 to 94.9 %, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.

  3. A fuzzy regression with support vector machine approach to the estimation of horizontal global solar radiation

    International Nuclear Information System (INIS)

    Baser, Furkan; Demirhan, Haydar

    2017-01-01

    Accurate estimation of the amount of horizontal global solar radiation for a particular field is an important input for decision processes in solar radiation investments. In this article, we focus on the estimation of yearly mean daily horizontal global solar radiation by using an approach that utilizes fuzzy regression functions with support vector machine (FRF-SVM). This approach is not seriously affected by outlier observations and does not suffer from the over-fitting problem. To demonstrate the utility of the FRF-SVM approach in the estimation of horizontal global solar radiation, we conduct an empirical study over a dataset collected in Turkey and applied the FRF-SVM approach with several kernel functions. Then, we compare the estimation accuracy of the FRF-SVM approach to an adaptive neuro-fuzzy system and a coplot supported-genetic programming approach. We observe that the FRF-SVM approach with a Gaussian kernel function is not affected by both outliers and over-fitting problem and gives the most accurate estimates of horizontal global solar radiation among the applied approaches. Consequently, the use of hybrid fuzzy functions and support vector machine approaches is found beneficial in long-term forecasting of horizontal global solar radiation over a region with complex climatic and terrestrial characteristics. - Highlights: • A fuzzy regression functions with support vector machines approach is proposed. • The approach is robust against outlier observations and over-fitting problem. • Estimation accuracy of the model is superior to several existent alternatives. • A new solar radiation estimation model is proposed for the region of Turkey. • The model is useful under complex terrestrial and climatic conditions.

  4. Hydrodynamic approach to electronic transport in graphene

    Energy Technology Data Exchange (ETDEWEB)

    Narozhny, Boris N. [Institute for Theoretical Condensed Matter Physics, Karlsruhe Institute of Technology, Karlsruhe (Germany); National Research Nuclear University MEPhI (Moscow Engineering Physics Institute), Moscow (Russian Federation); Gornyi, Igor V. [Institute for Theoretical Condensed Matter Physics, Karlsruhe Institute of Technology, Karlsruhe (Germany); Institute of Nanotechnology, Karlsruhe Institute of Technology, Karlsruhe (Germany); Ioffe Physical Technical Institute, St. Petersburg (Russian Federation); Mirlin, Alexander D. [Institute for Theoretical Condensed Matter Physics, Karlsruhe Institute of Technology, Karlsruhe (Germany); Institute of Nanotechnology, Karlsruhe Institute of Technology, Karlsruhe (Germany); Petersburg Nuclear Physics Institute, St. Petersburg (Russian Federation); Schmalian, Joerg [Institute for Theoretical Condensed Matter Physics, Karlsruhe Institute of Technology, Karlsruhe (Germany); Institute for Solid State Physics, Karlsruhe Institute of Technology, Karlsruhe (Germany)

    2017-11-15

    The last few years have seen an explosion of interest in hydrodynamic effects in interacting electron systems in ultra-pure materials. In this paper we briefly review the recent advances, both theoretical and experimental, in the hydrodynamic approach to electronic transport in graphene, focusing on viscous phenomena, Coulomb drag, non-local transport measurements, and possibilities for observing nonlinear effects. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Corporate Social Responsibility and Financial Performance: A Two-Stage Least Squares Regression Approach

    Directory of Open Access Journals (Sweden)

    Alexander Olawumi Dabor

    2017-12-01

    The objective of this study is to investigate the causality between corporate social responsibility and firm financial performance. The study employed a two-stage least squares regression approach. Fifty-two firms were selected using a scientific sampling method. The findings revealed that corporate social responsibility and firm performance in the manufacturing sector are mutually related at the 5 % significance level. The study recommended that management of manufacturing companies in Nigeria should spend on CSR to boost profitability and corporate image.

  6. Modelling the return distribution of salmon farming companies : a quantile regression approach

    OpenAIRE

    Jacobsen, Fredrik

    2017-01-01

    The salmon farming industry has gained increased attention from investors, portfolio managers, financial analysts and other stakeholders in recent years. Despite this development, very little is known about the risk and return of salmon farming company stocks, and especially about how the relationship between risk and return varies under different market conditions, given the volatile nature of the salmon farming industry. We approach this problem by using quantile regression to examine the relati...

  7. A land use regression model for ambient ultrafine particles in Montreal, Canada: A comparison of linear regression and a machine learning approach.

    Science.gov (United States)

    Weichenthal, Scott; Ryswyk, Keith Van; Goldstein, Alon; Bagg, Scott; Shekkarizfard, Maryam; Hatzopoulou, Marianne

    2016-04-01

    Existing evidence suggests that ambient ultrafine particles (UFPs) may have adverse health effects. We developed a land use regression model for UFPs in Montreal, Canada using mobile monitoring data collected from 414 road segments during the summer and winter months between 2011 and 2012. Two different approaches were examined for model development: standard multivariable linear regression and a machine learning approach (kernel-based regularized least squares (KRLS)) that learns the functional form of covariate impacts on ambient UFP concentrations from the data. The final models included parameters for population density, ambient temperature and wind speed, land use parameters (park space and open space), length of local roads and rail, and estimated annual average NOx emissions from traffic. The final multivariable linear regression model explained 62% of the spatial variation in ambient UFP concentrations whereas the KRLS model explained 79% of the variance. The KRLS model performed slightly better than the linear regression model when evaluated using an external dataset (R² = 0.58 vs. 0.55) or a cross-validation procedure (R² = 0.67 vs. 0.60). In general, our findings suggest that the KRLS approach may offer modest improvements in predictive performance compared to standard multivariable linear regression models used to estimate spatial variations in ambient UFPs. However, differences in predictive performance were not statistically significant when evaluated using the cross-validation procedure. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
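
    KRLS is kernel ridge regression under another name, so the linear-versus-KRLS comparison can be reproduced in miniature. A sketch assuming scikit-learn and hypothetical road-segment covariates (the variable set is invented, not the study's):

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        # Hypothetical covariates (population density, temperature, wind speed,
        # NOx emissions, ...) for 414 road segments; log UFP as the response.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(414, 6))
        y = X[:, 0] + 0.5 * np.sin(X[:, 3]) + rng.normal(scale=0.3, size=414)

        for name, est in [("linear LUR", LinearRegression()),
                          ("KRLS", KernelRidge(kernel="rbf", alpha=1.0, gamma=0.2))]:
            # KRLS learns the nonlinearity in X[:, 3] that the linear model misses.
            r2 = cross_val_score(est, X, y, cv=10, scoring="r2").mean()
            print(f"{name}: CV R^2 = {r2:.2f}")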

  8. Poisson regression approach for modeling fatal injury rates amongst Malaysian workers

    International Nuclear Information System (INIS)

    Kamarulzaman Ibrahim; Heng Khai Theng

    2005-01-01

    Many safety studies are based on the analysis of injury surveillance data. The injury surveillance data gathered for the analysis include information on the number of employees at risk of injury in each of several strata, where the strata are defined in terms of a series of important predictor variables. Further insight into the relationship between fatal injury rates and predictor variables may be obtained by the Poisson regression approach. Poisson regression is widely used in analyzing count data. In this study, Poisson regression is used to model the relationship between fatal injury rates and predictor variables, which are year (1995-2002), gender, recording system and industry type. Data for the analysis were obtained from PERKESO and Jabatan Perangkaan Malaysia. It is found that the assumption that the data follow a Poisson distribution has been violated. After correction for the problem of overdispersion, the predictor variables that are found to be significant in the model are gender, system of recording, industry type, and two interaction effects (between recording system and industry type, and between year and industry type).
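
    The overdispersion correction mentioned above corresponds to a quasi-Poisson fit. A minimal sketch, assuming statsmodels and fabricated strata (the PERKESO data are not reproduced): the Pearson chi-square divided by the residual degrees of freedom flags overdispersion, and refitting with scale="X2" rescales the standard errors accordingly.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical strata: predictors plus the number of workers at risk.
        rng = np.random.default_rng(2)
        n = 200
        X = sm.add_constant(np.column_stack([
            rng.integers(0, 2, n),      # gender
            rng.integers(0, 2, n),      # recording system
            rng.integers(0, 4, n),      # industry type (coded)
        ]))
        workers = rng.integers(100, 5000, n)
        deaths = rng.poisson(np.exp(-6 + 0.4 * X[:, 1]) * workers)

        poisson = sm.GLM(deaths, X, family=sm.families.Poisson(),
                         exposure=workers).fit()
        # A ratio well above 1 signals overdispersion.
        print("dispersion:", poisson.pearson_chi2 / poisson.df_resid)

        # Quasi-Poisson: same coefficients, standard errors inflated by the
        # estimated dispersion.
        quasi = sm.GLM(deaths, X, family=sm.families.Poisson(),
                       exposure=workers).fit(scale="X2")
        print(quasi.bse / poisson.bse)   # inflation factor per coefficient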

  9. The quantile regression approach to efficiency measurement: insights from Monte Carlo simulations.

    Science.gov (United States)

    Liu, Chunping; Laporte, Audrey; Ferguson, Brian S

    2008-09-01

    In the health economics literature there is an ongoing debate over approaches used to estimate the efficiency of health systems at various levels, from the level of the individual hospital - or nursing home - up to that of the health system as a whole. The two most widely used approaches to evaluating the efficiency with which various units deliver care are non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Productivity researchers tend to have very strong preferences over which methodology to use for efficiency estimation. In this paper, we use Monte Carlo simulation to compare the performance of DEA and SFA in terms of their ability to accurately estimate efficiency. We also evaluate quantile regression as a potential alternative approach. A Cobb-Douglas production function, random error terms and a technical inefficiency term with different distributions are used to calculate the observed output. The results, based on these experiments, suggest that neither DEA nor SFA can be regarded as clearly dominant, and that, depending on the quantile estimated, the quantile regression approach may be a useful addition to the armamentarium of methods for estimating technical efficiency.

  10. A multi-scale relevance vector regression approach for daily urban water demand forecasting

    Science.gov (United States)

    Bai, Yun; Wang, Pu; Li, Chuan; Xie, Jingjing; Wang, Yin

    2014-09-01

    Water is one of the most important resources for economic and social development. Daily water demand forecasting is an effective measure for scheduling urban water facilities. This work proposes a multi-scale relevance vector regression (MSRVR) approach to forecast daily urban water demand. The approach uses the stationary wavelet transform to decompose historical time series of daily water supplies into different scales. At each scale, the wavelet coefficients are used to train a machine-learning model using the relevance vector regression (RVR) method. The estimated coefficients of the RVR outputs for all of the scales are employed to reconstruct the forecasting result through the inverse wavelet transform. To better facilitate the MSRVR forecasting, the chaos features of the daily water supply series are analyzed to determine the input variables of the RVR model. In addition, an adaptive chaos particle swarm optimization algorithm is used to find the optimal combination of the RVR model parameters. The MSRVR approach is evaluated using real data collected from two waterworks and is compared with recently reported methods. The results show that the proposed MSRVR method can forecast daily urban water demand much more precisely in terms of the normalized root-mean-square error, correlation coefficient, and mean absolute percentage error criteria.

  11. Partitioning of late gestation energy expenditure in ewes using indirect calorimetry and a linear regression approach

    DEFF Research Database (Denmark)

    Kiani, Alishir; Chwalibog, André; Nielsen, Mette O

    2007-01-01

    Late gestation energy expenditure (EE(gest)) originates from energy expenditure (EE) of development of the conceptus (EE(conceptus)) and EE of homeorhetic adaptation of metabolism (EE(homeorhetic)). Even though EE(gest) is relatively easy to quantify, its partitioning is problematic. In the present study, metabolizable energy (ME) intake ranges for twin-bearing ewes were 220-440, 350-700 and 350-900 kJ per metabolic body weight (W0.75) at weeks seven, five and two pre-partum, respectively. Indirect calorimetry and a linear regression approach were used to quantify EE(gest) and then partition it into EE(conceptus) and EE(homeorhetic). Energy expenditure of basal metabolism of the non-gravid tissues (EE(bmng)), derived from the intercept of the linear regression equation of retained energy [kJ/W0.75] on ME intake [kJ/W0.75], was 298 [kJ/W0.75]. Values of the intercepts of the regression equations at week seven...

  12. When homogeneity meets heterogeneity: the geographically weighted regression with spatial lag approach to prenatal care utilization

    Science.gov (United States)

    Shoff, Carla; Chen, Vivian Yi-Ju; Yang, Tse-Chuan

    2014-01-01

    Using geographically weighted regression (GWR), a recent study by Shoff and colleagues (2012) investigated the place-specific risk factors for prenatal care utilization in the US and found that most of the relationships between late or no prenatal care and its determinants are spatially heterogeneous. However, the GWR approach may be subject to the confounding effect of spatial homogeneity. The goal of this study is to address this concern by including both spatial homogeneity and heterogeneity in the analysis. Specifically, we employ an analytic framework in which a spatially lagged (SL) effect of the dependent variable is incorporated into the GWR model, called GWR-SL. Using this framework, we found evidence to argue that spatial homogeneity is neglected in the study by Shoff et al. (2012) and that the results change after considering the spatially lagged effect of prenatal care utilization. The GWR-SL approach allows us to gain a place-specific understanding of prenatal care utilization in US counties. In addition, we compared the GWR-SL results with the results of conventional approaches (i.e., OLS and spatial lag models) and found that GWR-SL is the preferred modeling approach. The new findings help us to better estimate how the predictors are associated with prenatal care utilization across space, and to determine whether and how the level of prenatal care utilization in neighboring counties matters. PMID:24893033

  13. A different approach to estimate nonlinear regression model using numerical methods

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This paper concerns computational methods, namely the Gauss-Newton method and gradient algorithm methods (the Newton-Raphson method, the steepest descent or steepest ascent algorithm, the method of scoring, and the method of quadratic hill-climbing), based on numerical analysis, for estimating the parameters of a nonlinear regression model. Principles of matrix calculus are used to discuss the gradient algorithm methods. Yonathan Bard [1] discussed a comparison of gradient methods for the solution of nonlinear parameter estimation problems; this article discusses an analytical approach to the gradient algorithm methods in a different way. The paper also describes an iterative technique, the Gauss-Newton method, which differs from the iterative technique proposed by Gordon K. Smyth [2]. Hans Georg Bock et al. [10] proposed numerical methods for parameter estimation in DAEs (differential algebraic equations). Isabel Reis Dos Santos et al. [11] introduced a weighted least squares procedure for estimating the unknown parameters of a nonlinear regression metamodel. For large-scale nonsmooth convex minimization, the Hager and Zhang (HZ) conjugate gradient method and the modified HZ (MHZ) method were presented by Gonglin Yuan et al. [12].
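
    The Gauss-Newton iteration itself is a few lines of linear algebra: each step solves a linear least squares problem in the Jacobian. A self-contained numpy sketch on an invented exponential-growth model:

        import numpy as np

        def gauss_newton(f, jac, beta0, y, x, tol=1e-8, max_iter=50):
            """Minimise sum((y - f(x, beta))**2) by Gauss-Newton steps."""
            beta = np.asarray(beta0, dtype=float)
            for _ in range(max_iter):
                r = y - f(x, beta)                    # residuals
                J = jac(x, beta)                      # Jacobian of f wrt beta
                step, *_ = np.linalg.lstsq(J, r, rcond=None)
                beta += step
                if np.linalg.norm(step) < tol:
                    break
            return beta

        # Example model: y = b0 * exp(b1 * x).
        f = lambda x, b: b[0] * np.exp(b[1] * x)
        jac = lambda x, b: np.column_stack([np.exp(b[1] * x),
                                            b[0] * x * np.exp(b[1] * x)])

        x = np.linspace(0.0, 1.0, 30)
        rng = np.random.default_rng(3)
        y = 2.0 * np.exp(1.5 * x) + rng.normal(scale=0.05, size=x.size)
        print(gauss_newton(f, jac, [1.0, 1.0], y, x))   # approx [2.0, 1.5]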

  14. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow-analyzing each step to reveal the gaps and problems-at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  15. A modified approach to estimating sample size for simple logistic regression with one continuous covariate.

    Science.gov (United States)

    Novikov, I; Fund, N; Freedman, L S

    2010-01-15

    Different methods for the calculation of sample size for simple logistic regression (LR) with one normally distributed continuous covariate give different results. Sometimes the difference can be large. Furthermore, some methods require the user to specify the prevalence of cases when the covariate equals its population mean, rather than the more natural population prevalence. We focus on two commonly used methods and show through simulations that the power for a given sample size may differ substantially from the nominal value for one method, especially when the covariate effect is large, while the other method performs poorly if the user provides the population prevalence instead of the required parameter. We propose a modification of the method of Hsieh et al. that requires specification of the population prevalence and that employs Schouten's sample size formula for a t-test with unequal variances and group sizes. This approach appears to increase the accuracy of the sample size estimates for LR with one continuous covariate.
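
    For orientation, the baseline Hsieh-style calculation that the paper modifies can be written in a few lines; the modified, Schouten-based formula itself is not reproduced here. A sketch assuming scipy, a standard-normal covariate, and that p1 denotes the event rate at the covariate mean:

        import math
        from scipy.stats import norm

        def lr_sample_size(p1, beta_star, alpha=0.05, power=0.80):
            """Approximate N for simple logistic regression with one
            standard-normal covariate; beta_star is the log odds ratio
            per 1 SD of the covariate (Hsieh-style formula)."""
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(power)
            return (z_a + z_b) ** 2 / (p1 * (1 - p1) * beta_star ** 2)

        # e.g. 10% prevalence at the covariate mean, odds ratio 1.5 per SD:
        print(round(lr_sample_size(0.10, math.log(1.5))))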

  16. Heterogeneous effects of oil shocks on exchange rates: evidence from a quantile regression approach.

    Science.gov (United States)

    Su, Xianfang; Zhu, Huiming; You, Wanhai; Ren, Yinghua

    2016-01-01

    The determinants of exchange rates have attracted considerable attention among researchers over the past several decades. Most studies, however, ignore the possibility that the impact of oil shocks on exchange rates could vary across the exchange rate returns distribution. We employ a quantile regression approach to address this issue. Our results indicate that the effect of oil shocks on exchange rates is heterogeneous across quantiles. A large US depreciation or appreciation tends to heighten the effects of oil shocks on exchange rate returns. Positive oil demand shocks lead to appreciation pressures in oil-exporting countries and this result is robust across lower and upper return distributions. These results offer rich and useful information for investors and decision-makers.
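
    Estimating the oil-shock coefficient at several quantiles of the return distribution is straightforward with statsmodels. A minimal sketch on fabricated series (the paper's shock decomposition is not reproduced):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical monthly data: an oil shock series and exchange rate
        # returns whose sensitivity to the shock grows in the tails.
        rng = np.random.default_rng(4)
        df = pd.DataFrame({"oil_shock": rng.normal(size=300)})
        df["fx_return"] = (0.1 * df["oil_shock"] * rng.normal(1.0, 0.5, 300)
                           + rng.normal(size=300))

        # One regression per quantile; heterogeneous betas indicate that oil
        # shocks matter differently under large depreciations/appreciations.
        for q in (0.1, 0.25, 0.5, 0.75, 0.9):
            fit = smf.quantreg("fx_return ~ oil_shock", df).fit(q=q)
            print(f"tau = {q:.2f}: beta_oil = {fit.params['oil_shock']:+.3f}")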

  17. A new approach to nuclear reactor design optimization using genetic algorithms and regression analysis

    International Nuclear Information System (INIS)

    Kumar, Akansha; Tsvetkov, Pavel V.

    2015-01-01

    ... desired power peaking limits, desired effective and infinite neutron multiplication factors, a high fast fission factor, high thermal efficiency in the conversion from thermal energy to electrical energy using the Brayton cycle, and high fuel burn-up. It is to be noted that we have kept the total mass of the fuel constant. In this work, we present a module-based (modular) approach to perform the optimization, wherein we have defined the following modules: single fuel pin cell, whole core, thermal–hydraulics, and energy conversion. In each of the modules we have defined a specific set of parameters and optimization objectives. The GA system (GAS) and RS together play the role of optimizing each of the individual modules and integrating the modules to determine the final nuclear reactor core. However, implementation of GA could lead to a local minimum or a non-unique set of parameters that meet the specific optimization objectives. The GA code is built using Java, neutronic analysis using MCNP6, thermal–hydraulics calculations using Java, and regression analysis using R.

  18. A Gaussian process regression based hybrid approach for short-term wind speed prediction

    International Nuclear Information System (INIS)

    Zhang, Chi; Wei, Haikun; Zhao, Xin; Liu, Tianhong; Zhang, Kanjian

    2016-01-01

    Highlights: • A novel hybrid approach is proposed for short-term wind speed prediction. • This method combines the parametric AR model with the non-parametric GPR model. • The relative importance of different inputs is considered. • Different types of covariance functions are considered and combined. • It can provide both accurate point forecasts and satisfactory prediction intervals. - Abstract: This paper proposes a hybrid model based on autoregressive (AR) model and Gaussian process regression (GPR) for probabilistic wind speed forecasting. In the proposed approach, the AR model is employed to capture the overall structure from wind speed series, and the GPR is adopted to extract the local structure. Additionally, automatic relevance determination (ARD) is used to take into account the relative importance of different inputs, and different types of covariance functions are combined to capture the characteristics of the data. The proposed hybrid model is compared with the persistence model, artificial neural network (ANN), and support vector machine (SVM) for one-step ahead forecasting, using wind speed data collected from three wind farms in China. The forecasting results indicate that the proposed method can not only improve point forecasts compared with other methods, but also generate satisfactory prediction intervals.
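
    The two-stage structure (AR for the overall pattern, GPR for what the AR misses) is easy to prototype. A sketch assuming statsmodels and scikit-learn, with a fabricated wind series; the RBF-plus-white-noise kernel is one plausible choice, not the paper's exact combination, and the ARD step is omitted:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel
        from statsmodels.tsa.ar_model import AutoReg

        rng = np.random.default_rng(5)
        wind = 8 + np.sin(np.arange(500) / 20) + rng.normal(scale=0.5, size=500)

        # Stage 1: the AR model captures the overall linear structure.
        ar = AutoReg(wind, lags=6).fit()
        resid = np.asarray(ar.resid)

        # Stage 2: GPR on lagged residuals captures the local structure.
        lags = 3
        Xr = np.column_stack([resid[i:len(resid) - lags + i] for i in range(lags)])
        yr = resid[lags:]
        gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(Xr, yr)

        # Final point forecast = AR forecast + GPR residual correction; the
        # predictive std from GPR supplies a prediction interval.
        mu, sd = gpr.predict(Xr[-1:], return_std=True)
        print(mu[0], sd[0])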

  19. A linear regression approach to evaluate the green supply chain management impact on industrial organizational performance.

    Science.gov (United States)

    Mumtaz, Ubaidullah; Ali, Yousaf; Petrillo, Antonella

    2018-05-15

    The increase in environmental pollution is one of the most important topics in today's world. In this context, industrial activities can pose a significant threat to the environment. To manage the problems associated with industrial activities, several methods, techniques and approaches have been developed. Green supply chain management (GSCM) is considered one of the most important "environmental management" approaches. In developing countries such as Pakistan, the implementation of GSCM practices is still in its initial stages; lack of knowledge about its effects on economic performance is the reason why industries fear implementing these practices. The aim of this research is to assess the effects of GSCM practices on organizational performance in Pakistan. The GSCM practices considered are: internal practices, external practices, investment recovery and eco-design, while the performance parameters considered are: environmental pollution, operational cost and organizational flexibility. A set of hypotheses proposes an effect of each GSCM practice on the performance parameters. Factor analysis and linear regression are used to analyze survey data from Pakistani industries in order to test these hypotheses. The findings of this research indicate a decrease in environmental pollution and operational cost with the implementation of GSCM practices, whereas organizational flexibility has not improved for Pakistani industries. These results aim to help managers decide whether to implement GSCM practices in the industrial sector of Pakistan. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Explaining the heterogeneous scrapie surveillance figures across Europe: a meta-regression approach

    Directory of Open Access Journals (Sweden)

    Ru Giuseppe

    2007-06-01

    Abstract Background Two annual surveys, the abattoir and the fallen stock, monitor the presence of scrapie across Europe. A simple comparison between the prevalence estimates in different countries reveals that, in 2003, the abattoir survey appears to detect more scrapie in some countries. This is contrary to evidence suggesting the greater ability of the fallen stock survey to detect the disease. We applied meta-analysis techniques to study this apparent heterogeneity in the behaviour of the surveys across Europe. Furthermore, we conducted a meta-regression analysis to assess the effect of country-specific characteristics on the variability. We chose the odds ratios between the two surveys to inform the underlying relationship between them and to allow comparisons between the countries under the meta-regression framework. Baseline risks, those of the slaughtered populations across Europe, and country-specific covariates, available from the European Commission Report, were input into the model to explain the heterogeneity. Results Our results show the presence of significant heterogeneity in the odds ratios between countries and no reduction in the variability after adjustment for the different risks in the baseline populations. Three countries contributed the most to the overall heterogeneity: Germany, Ireland and The Netherlands. The inclusion of country-specific covariates did not, in general, reduce the variability, except for one variable: the proportion of the total adult sheep population sampled as fallen stock by each country. A large residual heterogeneity remained in the model, indicating the presence of substantial effect variability between countries. Conclusion The meta-analysis approach was useful for assessing the level of heterogeneity in the implementation of the surveys and for exploring the reasons for the variation between countries.

  1. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  2. Asymmetrical Responses of Ecosystem Processes to Positive Versus Negative Precipitation Extremes: a Replicated Regression Experimental Approach

    Science.gov (United States)

    Felton, A. J.; Smith, M. D.

    2016-12-01

    Heightened climatic variability due to atmospheric warming is forecast to increase the frequency and severity of climate extremes. In particular, changes to interannual variability in precipitation, characterized by increases in extreme wet and dry years, are likely to impact virtually all terrestrial ecosystem processes. However, to date experimental approaches have yet to explicitly test how ecosystem processes respond to multiple levels of climatic extremity, limiting our understanding of how ecosystems will respond to forecast increases in the magnitude of climate extremes. Here we report the results of a replicated regression experimental approach, in which we imposed 9 and 11 levels of growing season precipitation amount and extremity in mesic grassland during 2015 and 2016, respectively. Each level corresponded to a specific percentile of the long-term record, which produced a large gradient of soil moisture conditions that ranged from extreme wet to extreme dry. In both 2015 and 2016, asymptotic responses to water availability were observed for soil respiration. This asymmetry was driven in part by transitions between soil moisture versus temperature constraints on respiration as conditions became increasingly dry versus increasingly wet. In 2015, aboveground net primary production (ANPP) exhibited asymmetric responses to precipitation that largely mirrored those of soil respiration. In total, our results suggest that in this mesic ecosystem, these two carbon cycle processes were more sensitive to extreme drought than to extreme wet years. Future work will assess ANPP responses for 2016, soil nutrient supply and physiological responses of the dominant plant species. Future efforts are needed to compare our findings across a diverse array of ecosystem types, and in particular how the timing and magnitude of precipitation events may modify the response of ecosystem processes to increasing magnitudes of precipitation extremes.

  3. A Vector Approach to Regression Analysis and Its Implications to Heavy-Duty Diesel Emissions

    Energy Technology Data Exchange (ETDEWEB)

    McAdams, H.T.

    2001-02-14

    An alternative approach is presented for the regression of response data on predictor variables that are not logically or physically separable. The methodology is demonstrated by its application to a data set of heavy-duty diesel emissions. Because of the covariance of fuel properties, it is found advantageous to redefine the predictor variables as vectors, in which the original fuel properties are components, rather than as scalars each involving only a single fuel property. The fuel property vectors are defined in such a way that they are mathematically independent and statistically uncorrelated. Because the available data set does not allow definitive separation of vehicle and fuel effects, and because test fuels used in several of the studies may be unrealistically contrived to break the association of fuel variables, the data set is not considered adequate for development of a full-fledged emission model. Nevertheless, the data clearly show that only a few basic patterns of fuel-property variation affect emissions and that the number of these patterns is considerably less than the number of variables initially thought to be involved. These basic patterns, referred to as "eigenfuels," may reflect blending practice in accordance with their relative weighting in specific circumstances. The methodology is believed to be widely applicable in a variety of contexts. It promises an end to the threat of collinearity and the frustration of attempting, often unrealistically, to separate variables that are inseparable.
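
    The eigenfuel construction is, in spirit, principal component regression: orthogonal, uncorrelated patterns of fuel-property variation replace the collinear raw properties. A sketch, assuming scikit-learn and an invented fuel-property matrix:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Five strongly covarying fuel properties driven by two latent factors,
        # with an emissions response; all numbers are fabricated.
        rng = np.random.default_rng(6)
        base = rng.normal(size=(60, 2))
        fuels = np.column_stack([base, base @ rng.normal(size=(2, 3))])
        nox = base[:, 0] - 0.5 * base[:, 1] + rng.normal(scale=0.1, size=60)

        # The principal components play the role of "eigenfuels".
        pcr = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
        pcr.fit(fuels, nox)
        print("variance captured by 2 eigenfuels:",
              pcr.named_steps["pca"].explained_variance_ratio_.sum().round(3))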

  4. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  5. A regression modeling approach for studying carbonate system variability in the northern Gulf of Alaska

    Science.gov (United States)

    Evans, Wiley; Mathis, Jeremy T.; Winsor, Peter; Statscewich, Hank; Whitledge, Terry E.

    2013-01-01

    The northern Gulf of Alaska (GOA) shelf experiences carbonate system variability on seasonal and annual time scales, but little information exists to resolve higher-frequency variability in this region. To resolve this variability using platforms of opportunity, we present multiple linear regression (MLR) models constructed from hydrographic data collected along the Northeast Pacific Global Ocean Ecosystems Dynamics (GLOBEC) Seward Line. The empirical algorithms predict dissolved inorganic carbon (DIC) and total alkalinity (TA) using observations of nitrate (NO3-), temperature, salinity and pressure from the surface to 500 m, with R² > 0.97 and RMSE values of 11 µmol kg⁻¹ for DIC and 9 µmol kg⁻¹ for TA. We applied these relationships to high-resolution NO3- data sets collected during a novel 20 h glider flight and a GLOBEC mesoscale SeaSoar survey. Results from the glider flight demonstrated time/space along-isopycnal variability of aragonite saturation (Ωarag) associated with a dichothermal layer (a cold near-surface layer found in high-latitude oceans) that rivaled changes seen vertically through the thermocline. The SeaSoar survey captured an uplift of the aragonite saturation horizon (the depth where Ωarag = 1), which shoaled to a previously unseen depth in the northern GOA. This work is similar to recent studies aimed at predicting the carbonate system in continental margin settings, but demonstrates that a NO3--based approach can be applied to high-latitude data collected from platforms capable of high-frequency measurements.
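
    The MLR algorithms themselves are ordinary least squares fits of DIC (or TA) on nitrate, temperature, salinity and pressure. A sketch with wholly synthetic hydrography (coefficients and noise level invented, loosely echoing the reported RMSE):

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(7)
        n = 300
        NO3 = rng.uniform(0, 30, n)          # nitrate, umol/kg
        T = rng.uniform(2, 14, n)            # temperature, deg C
        S = rng.uniform(28, 34, n)           # salinity
        P = rng.uniform(0, 500, n)           # pressure, dbar
        DIC = (1900 + 6 * NO3 - 4 * T + 8 * (S - 31) + 0.05 * P
               + rng.normal(0, 11, n))       # umol/kg

        X = np.column_stack([NO3, T, S, P])
        mlr = LinearRegression().fit(X, DIC)
        rmse = np.sqrt(mean_squared_error(DIC, mlr.predict(X)))
        print(f"R^2 = {mlr.score(X, DIC):.3f}, RMSE = {rmse:.1f} umol/kg")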

  6. A logistic regression approach to model the willingness of consumers to adopt renewable energy sources

    Science.gov (United States)

    Ulkhaq, M. M.; Widodo, A. K.; Yulianto, M. F. A.; Widhiyaningrum; Mustikasari, A.; Akshinta, P. Y.

    2018-03-01

    The implementation of renewable energy in this era of globalization is inevitable, since non-renewable energy leads to climate change and global warming and thus harms the environment and human life. However, in developing countries such as Indonesia, the implementation of renewable energy sources faces both technical and social problems. For the latter, implementation is only effective if the public is aware of its benefits. This research tried to identify the determinants that influence consumers' intention to adopt renewable energy sources. In addition, it tried to predict which consumers are willing to apply renewable energy sources in their houses, using a logistic regression approach. A case study was conducted in Semarang, Indonesia. The results showed that only eight of the fifteen variables are statistically significant: educational background, employment status, income per month, average electricity cost per month, certainty about the efficiency of the renewable energy project, relatives' influence on adopting renewable energy sources, energy tax deductions, and the price of non-renewable energy sources. The findings of this study could be used as a basis for the government to set up a policy towards the implementation of renewable energy sources.
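
    The model itself is a standard binary logit on survey covariates. A sketch with simulated respondents, assuming statsmodels; the variable names follow the significant determinants listed above, but the data are fabricated:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(8)
        n = 400
        df = pd.DataFrame({
            "education": rng.integers(1, 5, n),   # coded education level
            "income": rng.normal(5, 2, n),        # income per month (scaled)
            "elec_cost": rng.normal(3, 1, n),     # electricity cost per month
            "relatives": rng.integers(0, 2, n),   # relatives' influence (0/1)
        })
        logit_p = -3 + 0.5 * df.education + 0.2 * df.income + 0.4 * df.relatives
        df["adopt"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        fit = smf.logit("adopt ~ education + income + elec_cost + relatives", df).fit()
        print(np.exp(fit.params))   # odds ratios for willingness to adopt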

  7. Detection of Differential Item Functioning with Nonlinear Regression: A Non-IRT Approach Accounting for Guessing

    Czech Academy of Sciences Publication Activity Database

    Drabinová, Adéla; Martinková, Patrícia

    2017-01-01

    Vol. 54, No. 4 (2017), pp. 498-517 ISSN 0022-0655 R&D Projects: GA ČR GJ15-15856Y Institutional support: RVO:67985807 Keywords: differential item functioning * non-linear regression * logistic regression * item response theory Subject RIV: AM - Education OBOR OECD: Statistics and probability Impact factor: 0.979, year: 2016

  8. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration ranges frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using nonparametric statistical tests: the one-sample rank test and the Wilcoxon signed rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models, because only slight differences in the error (presented through the relative residuals, RR) were obtained. Estimation of the significance of the differences in the RR was achieved using the Wilcoxon signed rank test, where the linear and quadratic regression models were treated as two independent groups. The application of this simple nonparametric statistical test provides statistical confirmation of the choice of an adequate regression model.
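
    The weighting schemes under comparison are easy to reproduce with weighted least squares. A sketch, assuming statsmodels and a fabricated calibration curve with concentration-proportional noise; the model with the smallest relative residuals (RR) would be preferred:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(9)
        conc = np.linspace(1, 100, 12).repeat(3)               # calibration levels
        resp = 0.5 * conc + 2 + rng.normal(scale=0.02 * conc)  # heteroscedastic

        for wname, w in [("1/x", 1 / conc), ("1/x^2", 1 / conc**2)]:
            for deg, mname in [(1, "linear"), (2, "quadratic")]:
                X = np.vander(conc, deg + 1)                   # polynomial design
                fit = sm.WLS(resp, X, weights=w).fit()
                rr = (resp - fit.fittedvalues) / resp          # relative residuals
                print(f"{mname}, w = {wname}: mean |RR| = {np.abs(rr).mean():.4f}")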

  9. Demographic and socioeconomic disparity in nutrition: application of a novel Correlated Component Regression approach

    Science.gov (United States)

    Alkerwi, Ala'a; Vernier, Céderic; Sauvageot, Nicolas; Crichton, Georgina E; Elias, Merrill F

    2015-01-01

    Objectives This study aimed to examine the most important demographic and socioeconomic factors associated with diet quality, evaluated in terms of compliance with national dietary recommendations, selection of healthy and unhealthy food choices, energy density and food variety. We hypothesised that different demographic and socioeconomic factors may show disparate associations with diet quality. Study design A nationwide, cross-sectional, population-based study. Participants A total of 1352 apparently healthy and non-institutionalised subjects, aged 18–69 years, participated in the Observation of Cardiovascular Risk Factors in Luxembourg (ORISCAV-LUX) study in 2007–2008. The participants attended the nearest study centre after a telephone appointment, and were interviewed by trained research staff. Outcome measures Diet quality as measured by 5 dietary indicators, namely, recommendation compliance index (RCI), recommended foods score (RFS), non-recommended foods score (non-RFS), energy density score (EDS), and dietary diversity score (DDS). The novel Correlated Component Regression (CCR) technique was used to determine the importance and magnitude of the association of each socioeconomic factor with diet quality, in a global analytic approach. Results Increasing age, being male and living below the poverty threshold were predominant factors associated with eating a high energy density diet. Education level was an important factor associated with healthy and adequate food choices, whereas economic resources were predominant factors associated with food diversity and energy density. Conclusions Multiple demographic and socioeconomic circumstances were associated with different diet quality indicators. Efforts to improve diet quality for high-risk groups need an important public health focus. PMID:25967988

  10. Chronic subdural hematoma: Surgical management and outcome in 986 cases: A classification and regression tree approach

    Science.gov (United States)

    Rovlias, Aristedis; Theodoropoulos, Spyridon; Papoutsakis, Dimitrios

    2015-01-01

    Background: Chronic subdural hematoma (CSDH) is one of the most common clinical entities in daily neurosurgical practice and carries a most favorable prognosis. However, because of the advanced age and medical problems of patients, surgical therapy is frequently associated with various complications. This study evaluated the clinical features, radiological findings, and neurological outcome in a large series of patients with CSDH. Methods: A classification and regression tree (CART) technique was employed in the analysis of data from 986 patients who were operated on at Asclepeion General Hospital of Athens from January 1986 to December 2011. Burr hole evacuation with closed system drainage has been the operative technique of first choice at our institution for 29 consecutive years. A total of 27 prognostic factors were examined to predict the outcome at 3 months postoperatively. Results: Our results indicated that neurological status on admission was the best predictor of outcome. With regard to the other data, age, brain atrophy, thickness and density of hematoma, subdural accumulation of air, and antiplatelet and anticoagulant therapy were found to correlate significantly with prognosis. The overall cross-validated predictive accuracy of the CART model was 85.34%, with a cross-validated relative error of 0.326. Conclusions: Methodologically, the CART technique is quite different from the more commonly used methods, with the primary benefit of illustrating the important prognostic variables as related to outcome. Since the ideal therapy for the treatment of CSDH is still under debate, this technique may prove useful in developing new therapeutic strategies and approaches for patients with CSDH. PMID:26257985
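
    A CART of this kind can be grown and inspected in a handful of lines. A sketch assuming scikit-learn, with invented predictors standing in for the 27 prognostic factors; export_text prints the decision rules that make CART attractive clinically:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(10)
        n = 986
        X = np.column_stack([
            rng.integers(0, 5, n),     # neurological grade on admission
            rng.integers(40, 95, n),   # age
            rng.integers(0, 2, n),     # anticoagulant therapy
            rng.normal(15, 5, n),      # hematoma thickness (mm)
        ])
        y = (X[:, 0] + rng.normal(size=n) > 2.5).astype(int)  # 3-month outcome

        cart = DecisionTreeClassifier(max_depth=3, min_samples_leaf=30,
                                      random_state=0)
        print("CV accuracy:", cross_val_score(cart, X, y, cv=10).mean().round(3))
        cart.fit(X, y)
        print(export_text(cart, feature_names=["grade", "age", "anticoag",
                                               "thickness"]))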

  11. A regression approach for Zircaloy-2 in-reactor creep constitutive equations

    International Nuclear Information System (INIS)

    Yung Liu, Y.; Bement, A.L.

    1977-01-01

    In this paper the methodology of multiple regression as applied to Zircaloy-2 in-reactor creep data analysis and the construction of a constitutive equation is illustrated. While the resulting constitutive equation can be used in creep analysis of in-reactor Zircaloy structural components, the methodology itself is entirely general and can be applied to any creep data analysis. The promising aspects of multiple regression creep data analysis are briefly outlined as follows: (1) When more than one variable is involved, there is no need to assume that each variable affects the response independently. No separate normalizations are required either, and the estimation of parameters is obtained by solving many simultaneous equations; the number of simultaneous equations is equal to the number of data sets. (2) Regression statistics such as R²- and F-statistics provide measures of the significance of the regression creep equation in correlating the overall data. The relative weights of each variable on the response can also be obtained. (3) Special regression techniques such as step-wise, ridge, and robust regressions and residual plots, etc., provide diagnostic tools for model selection. Multiple regression analysis performed on a set of carefully selected Zircaloy-2 in-reactor creep data leads to a model which provides excellent correlations for the data. (Auth.)

  12. A robust ridge regression approach in the presence of both multicollinearity and outliers in the data

    Science.gov (United States)

    Shariff, Nurul Sima Mohamad; Ferdaos, Nur Aqilah

    2017-08-01

    Multicollinearity often leads to inconsistent and unreliable parameter estimates in regression analysis. The situation is more severe in the presence of outliers, which cause fatter tails in the error distributions than the normal distribution. A well-known procedure that is robust to the multicollinearity problem is the ridge regression method. This method, however, is expected to be affected by the presence of outliers, owing to assumptions imposed in the modeling procedure. Thus, a robust version of the existing ridge method, with modifications to the inverse matrix and the estimated response value, is introduced. The performance of the proposed method is discussed, and comparisons are made with several existing estimators, namely ordinary least squares (OLS), ridge regression, and robust ridge regression based on GM-estimates. The proposed method is found to produce reliable parameter estimates in the presence of both multicollinearity and outliers in the data.
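
    One readily available flavour of "robust ridge" combines a bounded-influence loss with an L2 penalty; scikit-learn's HuberRegressor does exactly that, though it is not the estimator proposed above. A sketch contrasting it with plain ridge on near-collinear data contaminated by outliers:

        import numpy as np
        from sklearn.linear_model import HuberRegressor, Ridge

        rng = np.random.default_rng(11)
        x1 = rng.normal(size=100)
        X = np.column_stack([x1, x1 + rng.normal(scale=0.01, size=100)])  # collinear
        y = 3 * x1 + rng.normal(scale=0.1, size=100)
        y[:5] += 20                                    # gross outliers

        ridge = Ridge(alpha=1.0).fit(X, y)
        robust = HuberRegressor(alpha=1.0).fit(X, y)   # Huber loss + L2 penalty
        print("ridge:       ", ridge.coef_.round(2))
        print("robust ridge:", robust.coef_.round(2))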

  13. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  14. HYBRID DATA APPROACH FOR SELECTING EFFECTIVE TEST CASES DURING THE REGRESSION TESTING

    OpenAIRE

    Mohan, M.; Shrimali, Tarun

    2017-01-01

    In the software industry, software testing has become increasingly important across the entire software development life cycle. Software testing is one of the fundamental components of software quality assurance. The Software Testing Life Cycle (STLC) is a process involved in testing the complete software, which includes Regression Testing, Unit Testing, Smoke Testing, Integration Testing, Interface Testing, System Testing, etc. In the regression testing phase of the STLC, test case selection is one of the most importan...

  15. Effective approaches for managing electronic records and archives

    CERN Document Server

    Dearstyne, Bruce W

    2006-01-01

    This is a book of fresh insights, perspectives, strategies, and approaches for managing electronic records and archives. The authors draw on first-hand experience to present practical solutions, including recommendations for building and sustaining strong electronic records programs.

  16. Bias in logistic regression due to imperfect diagnostic test results and practical correction approaches.

    Science.gov (United States)

    Valle, Denis; Lima, Joanna M Tucker; Millar, Justin; Amratia, Punam; Haque, Ubydul

    2015-11-04

    Logistic regression is a statistical model widely used in cross-sectional and cohort studies to identify and quantify the effects of potential disease risk factors. However, the impact of imperfect tests on adjusted odds ratios (and thus on the identification of risk factors) is under-appreciated. The purpose of this article is to draw attention to the problem associated with modelling imperfect diagnostic tests, and propose simple Bayesian models to adequately address this issue. A systematic literature review was conducted to determine the proportion of malaria studies that appropriately accounted for false-negatives/false-positives in a logistic regression setting. Inference from the standard logistic regression was also compared with that from three proposed Bayesian models using simulations and malaria data from the western Brazilian Amazon. A systematic literature review suggests that malaria epidemiologists are largely unaware of the problem of using logistic regression to model imperfect diagnostic test results. Simulation results reveal that statistical inference can be substantially improved when using the proposed Bayesian models versus the standard logistic regression. Finally, analysis of original malaria data with one of the proposed Bayesian models reveals that microscopy sensitivity is strongly influenced by how long people have lived in the study region, and an important risk factor (i.e., participation in forest extractivism) is identified that would have been missed by standard logistic regression. Given the numerous diagnostic methods employed by malaria researchers and the ubiquitous use of logistic regression to model the results of these diagnostic tests, this paper provides critical guidelines to improve data analysis practice in the presence of misclassification error. Easy-to-use code that can be readily adapted to WinBUGS is provided, enabling straightforward implementation of the proposed Bayesian models.

  17. Mortality risk prediction in burn injury: Comparison of logistic regression with machine learning approaches.

    Science.gov (United States)

    Stylianou, Neophytos; Akbarov, Artur; Kontopantelis, Evangelos; Buchan, Iain; Dunn, Ken W

    2015-08-01

    Predicting mortality from burn injury has traditionally employed logistic regression models. Alternative machine learning methods have been introduced in some areas of clinical prediction as the necessary software and computational facilities have become accessible. Here we compare logistic regression and machine learning predictions of mortality from burn. An established logistic mortality model was compared to machine learning methods (artificial neural network, support vector machine, random forests and naïve Bayes) using a population-based (England & Wales) case-cohort registry. Predictive evaluation used: area under the receiver operating characteristic curve; sensitivity; specificity; positive predictive value and Youden's index. All methods had comparable discriminatory abilities, similar sensitivities, specificities and positive predictive values. Although some machine learning methods performed marginally better than logistic regression the differences were seldom statistically significant and clinically insubstantial. Random forests were marginally better for high positive predictive value and reasonable sensitivity. Neural networks yielded slightly better prediction overall. Logistic regression gives an optimal mix of performance and interpretability. The established logistic regression model of burn mortality performs well against more complex alternatives. Clinical prediction with a small set of strong, stable, independent predictors is unlikely to gain much from machine learning outside specialist research contexts. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  18. A regression approach for zircaloy-2 in-reactor creep constitutive equations

    International Nuclear Information System (INIS)

    Yung Liu, Y.; Bement, A.L.

    1977-01-01

    In this paper the methodology of multiple regression as applied to Zircaloy-2 in-reactor creep data analysis and the construction of a constitutive equation is illustrated. While the resulting constitutive equation can be used in creep analysis of in-reactor Zircaloy structural components, the methodology itself is entirely general and can be applied to any creep data analysis. From the data analysis and model development points of view, both the assumption of independence and prior commitment to specific model forms are unacceptable. One would desire means which can not only estimate the required parameters directly from data but also provide a basis for model selection, viz., testing one model against others. A basic understanding of the physics of deformation is important in choosing the forms of the starting physical model equations, but the justifications must rely on their ability to correlate the overall data. The promising aspects of multiple regression creep data analysis are briefly outlined as follows: (1) When more than one variable is involved, there is no need to assume that each variable affects the response independently. No separate normalizations are required either, and the estimation of parameters is obtained by solving many simultaneous equations; the number of simultaneous equations is equal to the number of data sets. (2) Regression statistics such as R²- and F-statistics provide measures of the significance of the regression creep equation in correlating the overall data. The relative weights of each variable on the response can also be obtained. (3) Special regression techniques such as step-wise, ridge, and robust regressions and residual plots, etc., provide diagnostic tools for model selection

  19. An analytical approach to characterize morbidity profile dissimilarity between distinct cohorts using electronic medical records

    OpenAIRE

    Schildcrout, Jonathan S.; Basford, Melissa A.; Pulley, Jill M.; Masys, Daniel R.; Roden, Dan M.; Wang, Deede; Chute, Christopher G.; Kullo, Iftikhar J.; Carrell, David; Peissig, Peggy; Kho, Abel; Denny, Joshua C.

    2010-01-01

    We describe a two-stage analytical approach for characterizing morbidity profile dissimilarity among patient cohorts using electronic medical records. We capture morbidities using the International Statistical Classification of Diseases and Related Health Problems (ICD-9) codes. In the first stage of the approach separate logistic regression analyses for ICD-9 sections (e.g., “hypertensive disease” or “appendicitis”) are conducted, and the odds ratios that describe adjusted differences in pre...

  20. A nonparametric approach to calculate critical micelle concentrations: the local polynomial regression method

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Fontan, J.L.; Costa, J.; Ruso, J.M.; Prieto, G. [Dept. of Applied Physics, Univ. of Santiago de Compostela, Santiago de Compostela (Spain); Sarmiento, F. [Dept. of Mathematics, Faculty of Informatics, Univ. of A Coruna, A Coruna (Spain)

    2004-02-01

    The application of a statistical method, the local polynomial regression method (LPRM), based on a nonparametric estimation of the regression function, to determine the critical micelle concentration (cmc) is presented. The method is extremely flexible because it does not impose any parametric model on the underlying structure of the data but rather allows the data to speak for themselves. Good concordance of cmc values with those obtained by other methods was found for systems in which the variation of a measured physical property with concentration showed an abrupt change. When this variation was slow, discrepancies between the values obtained by LPRM and other methods were found. (orig.)
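
    A closely related nonparametric smoother, lowess (local linear regression), illustrates the idea: smooth the property-versus-concentration curve without a parametric model, then locate the cmc at the point of sharpest change. A sketch with fabricated conductivity-style data, assuming statsmodels; this is not the paper's exact LPRM:

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(12)
        c = np.linspace(0.1, 10, 80)                   # surfactant concentration
        cmc_true = 4.0
        prop = np.where(c < cmc_true, 5 * c, 5 * cmc_true + 2 * (c - cmc_true))
        prop += rng.normal(scale=0.3, size=c.size)     # slope change + noise

        smooth = lowess(prop, c, frac=0.3)             # returns sorted (x, yhat)
        # Simple cmc estimate: where the curvature of the smooth is largest.
        d1 = np.gradient(smooth[:, 1], smooth[:, 0])
        d2 = np.gradient(d1, smooth[:, 0])
        print("estimated cmc ~", round(smooth[np.argmax(np.abs(d2)), 0], 2))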

  1. Gender Gaps in Mathematics, Science and Reading Achievements in Muslim Countries: A Quantile Regression Approach

    Science.gov (United States)

    Shafiq, M. Najeeb

    2013-01-01

    Using quantile regression analyses, this study examines gender gaps in mathematics, science, and reading in Azerbaijan, Indonesia, Jordan, the Kyrgyz Republic, Qatar, Tunisia, and Turkey among 15-year-old students. The analyses show that girls in Azerbaijan achieve as well as boys in mathematics and science and overachieve in reading. In Jordan,…

  2. INTRODUCTION TO A COMBINED MULTIPLE LINEAR REGRESSION AND ARMA MODELING APPROACH FOR BEACH BACTERIA PREDICTION

    Science.gov (United States)

    Due to the complexity of the processes contributing to beach bacteria concentrations, many researchers rely on statistical modeling, among which multiple linear regression (MLR) modeling is most widely used. Despite its ease of use and interpretation, there may be time dependence...
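
    Combining an MLR mean structure with ARMA errors fits in one call via statsmodels' SARIMAX with exogenous regressors. A sketch on fabricated beach data (the covariate names are illustrative, not the EPA study's):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(13)
        n = 200
        exog = pd.DataFrame({"rain": rng.exponential(1.0, n),
                             "turbidity": rng.normal(5.0, 1.0, n)})
        err = np.zeros(n)
        for t in range(1, n):                          # AR(1) errors
            err[t] = 0.6 * err[t - 1] + rng.normal(scale=0.3)
        y = pd.Series(1 + 0.8 * exog["rain"] + 0.2 * exog["turbidity"] + err)

        # Linear regression with ARMA(1, 1) errors.
        fit = sm.tsa.SARIMAX(y, exog=exog, order=(1, 0, 1)).fit(disp=False)
        print(fit.params.round(2))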

  3. Modeling geochemical datasets for source apportionment: Comparison of least square regression and inversion approaches.

    Digital Repository Service at National Institute of Oceanography (India)

    Tripathy, G.R.; Das, Anirban.

    used methods, the Least Square Regression (LSR) and Inverse Modeling (IM), to determine the contributions of (i) solutes from different sources to global river water, and (ii) various rocks to a glacial till. The purpose of this exercise is to compare...

  4. Replicating Experimental Impact Estimates Using a Regression Discontinuity Approach. NCEE 2012-4025

    Science.gov (United States)

    Gleason, Philip M.; Resch, Alexandra M.; Berk, Jillian A.

    2012-01-01

    This NCEE Technical Methods Paper compares the estimated impacts of an educational intervention using experimental and regression discontinuity (RD) study designs. The analysis used data from two large-scale randomized controlled trials--the Education Technology Evaluation and the Teach for America Study--to provide evidence on the performance of…

  5. On the Usefulness of a Multilevel Logistic Regression Approach to Person-Fit Analysis

    Science.gov (United States)

    Conijn, Judith M.; Emons, Wilco H. M.; van Assen, Marcel A. L. M.; Sijtsma, Klaas

    2011-01-01

    The logistic person response function (PRF) models the probability of a correct response as a function of the item locations. Reise (2000) proposed to use the slope parameter of the logistic PRF as a person-fit measure. He reformulated the logistic PRF model as a multilevel logistic regression model and estimated the PRF parameters from this…

  6. Financial Aid and First-Year Collegiate GPA: A Regression Discontinuity Approach

    Science.gov (United States)

    Curs, Bradley R.; Harper, Casandra E.

    2012-01-01

    Using a regression discontinuity design, we investigate whether a merit-based financial aid program has a causal effect on the first-year grade point average of first-time out-of-state freshmen at the University of Oregon. Our results indicate that merit-based financial aid has a positive and significant effect on first-year collegiate grade point…

  7. Predicting 30-day Hospital Readmission with Publicly Available Administrative Database. A Conditional Logistic Regression Modeling Approach.

    Science.gov (United States)

    Zhu, K; Lou, Z; Zhou, J; Ballester, N; Kong, N; Parikh, P

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". Hospital readmissions raise healthcare costs and cause significant distress to providers and patients. It is, therefore, of great interest to healthcare organizations to predict which patients are at risk of being readmitted to their hospitals. However, current logistic regression based risk prediction models have limited prediction power when applied to hospital administrative data. Meanwhile, although decision trees and random forests have been applied, they tend to be too complex for hospital practitioners to understand. We explore the use of conditional logistic regression to increase prediction accuracy. We analyzed an HCUP statewide inpatient discharge record dataset, which includes patient demographics, clinical and care utilization data from California. We extracted records of heart failure Medicare beneficiaries who had inpatient experience during an 11-month period. We corrected the data imbalance issue with under-sampling. In our study, we first applied standard logistic regression and decision trees to obtain influential variables and derive practically meaningful decision rules. We then stratified the original data set accordingly and applied logistic regression to each data stratum. We further explored the effect of interacting variables in the logistic regression modeling. We conducted cross-validation to assess the overall prediction performance of conditional logistic regression (CLR) and compared it with standard classification models. The developed CLR models outperformed several standard classification models (e.g., straightforward logistic regression, stepwise logistic regression, random forest, support vector machine). For example, the best CLR model improved classification accuracy by nearly 20% over the straightforward logistic regression model. Furthermore, the developed CLR models tend to achieve better sensitivity of...

  8. Single-electron multiplication statistics as a combination of Poissonian pulse height distributions using constraint regression methods

    International Nuclear Information System (INIS)

    Ballini, J.-P.; Cazes, P.; Turpin, P.-Y.

    1976-01-01

    Analysing the histogram of anode pulse amplitudes allows a discussion of the hypothesis that has been proposed to account for the statistical processes of secondary multiplication in a photomultiplier. In an earlier work, good agreement was obtained between experimental and reconstructed spectra, assuming a first-dynode distribution composed of two Poisson distributions with distinct mean values. This first approximation led to a search for a method which could give the weights of several Poisson distributions with distinct mean values. Three methods are briefly described: classical linear regression, constraint regression (d'Esopo's method), and regression on variables subject to error. The use of these methods gives an approximation of the frequency function which represents the dispersion of the punctual mean gain around the whole first-dynode mean gain value. Comparison between this function and the one employed in the Polya distribution supports the statement that the latter is inadequate to describe the statistical process of secondary multiplication. Numerous spectra obtained with two kinds of photomultiplier working under different physical conditions have been analysed. Two points are then discussed: does the frequency function represent the dynode structure and the interdynode collection process, and is the model (that the multiplication process of all dynodes but the first is Poissonian) valid whatever the photomultiplier and the operating conditions. (Auth.)
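
    The constraint-regression step (non-negative weights of several Poisson components) can be reproduced with non-negative least squares. A sketch assuming scipy and an idealized two-component pulse-height histogram:

        import numpy as np
        from scipy.optimize import nnls
        from scipy.stats import poisson

        # Idealized histogram: two Poisson components, mean gains 2 and 6,
        # weights 0.3 and 0.7 (all values fabricated).
        k = np.arange(0, 30)
        observed = 0.3 * poisson.pmf(k, 2) + 0.7 * poisson.pmf(k, 6)

        # One column per candidate mean gain; NNLS enforces the physical
        # constraint that mixture weights are non-negative.
        means = np.arange(1, 11)
        A = np.column_stack([poisson.pmf(k, m) for m in means])
        weights, _ = nnls(A, observed)
        print({m: round(w, 2) for m, w in zip(means, weights) if w > 0})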

  9. Instructional Approach to Molecular Electronic Structure Theory

    Science.gov (United States)

    Dykstra, Clifford E.; Schaefer, Henry F.

    1977-01-01

    Describes a graduate quantum mechanics project in which students write a computer program that performs ab initio calculations on the electronic structure of a simple molecule. Theoretical potential energy curves are produced. (MLH)
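
    For readers who want to reproduce such a project today, a minimal sketch using the open-source PySCF package (an assumption; any quantum chemistry library would do) computes a Hartree-Fock potential energy curve for H2:

```python
# Requires PySCF (pip install pyscf).
import numpy as np
from pyscf import gto, scf

# Scan the H2 bond length and compute the restricted Hartree-Fock energy
# at each geometry, producing a theoretical potential energy curve.
bond_lengths = np.linspace(0.5, 3.0, 11)      # in Angstrom
for r in bond_lengths:
    mol = gto.M(atom=f"H 0 0 0; H 0 0 {r}", basis="sto-3g", verbose=0)
    e = scf.RHF(mol).kernel()                  # total SCF energy in Hartree
    print(f"R = {r:4.2f} A   E = {e:.6f} Eh")
```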

  10. A linear algebraic approach to electron-molecule collisions

    International Nuclear Information System (INIS)

    Collins, L.A.; Schnieder, B.I.

    1982-01-01

    The linear algebraic approach to electron-molecule collisions is examined by first deriving the general set of coupled integrodifferential equations that describe electron collisional processes and then describing the linear algebraic approach for obtaining a solution to the coupled equations. Application of the linear algebraic method to the static-exchange, separable-exchange, and effective optical potential approximations is examined. (U.K.)
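
    A purely illustrative sketch of the linear algebraic idea on a one-channel toy problem, not the paper's coupled multichannel equations: the s-wave scattering integral equation is discretized on a quadrature grid and reduced to a matrix equation (the model potential and grid parameters are invented for illustration):

```python
import numpy as np

# One-channel model: the s-wave integral equation
#   psi(r) = sin(kr) + ∫ G(r,r') U(r') psi(r') dr'
# is discretized on a grid and solved as a linear system.
k = 1.0
r = np.linspace(1e-4, 12.0, 600)
h = r[1] - r[0]
w = np.full_like(r, h); w[0] = w[-1] = h / 2     # trapezoid quadrature weights
U = -2.0 * np.exp(-r)                             # toy short-range potential (a.u.)

r_lt = np.minimum.outer(r, r)
r_gt = np.maximum.outer(r, r)
G = -np.sin(k * r_lt) * np.cos(k * r_gt) / k      # standing-wave Green's function

A = np.eye(r.size) - G * (U * w)[None, :]         # (I - G U w) psi = psi0
psi = np.linalg.solve(A, np.sin(k * r))

# Asymptotically psi ~ sin(kr) + tan(delta) cos(kr); extract the phase shift.
tan_delta = -np.sum(np.sin(k * r) * U * psi * w) / k
print(f"s-wave phase shift: {np.arctan(tan_delta):.4f} rad")
```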

  11. Comparing Kriging and Regression Approaches for Mapping Soil Clay Content in a diverse Danish Landscape

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Bou Kheir, Rania; Greve, Mette Balslev

    2013-01-01

    Information on the spatial variability of soil texture, including soil clay content, in a landscape is very important for agricultural and environmental use. Different prediction techniques are available to assess and map the spatial variability of soil properties, but selecting the most suitable technique at a given site has always been a major issue in all soil mapping applications. We studied the prediction performance of ordinary kriging (OK), stratified OK (OKst), regression trees (RT), and rule-based regression kriging (RKrr) for digital mapping of soil clay content at 30.4-m grid size using 6... Stratification improved the prediction in OKst compared with that in OK, whereas RT showed the lowest performance of all (R2 = 0.52; RMSE = 0.52; and RPD = 1.17). We found RKrr to be an effective prediction method and recommend this method for any future soil mapping activities in Denmark.

  12. Healthcare Expenditures Associated with Depression Among Individuals with Osteoarthritis: Post-Regression Linear Decomposition Approach.

    Science.gov (United States)

    Agarwal, Parul; Sambamoorthi, Usha

    2015-12-01

    Depression is common among individuals with osteoarthritis and leads to increased healthcare burden. The objective of this study was to examine excess total healthcare expenditures associated with depression among individuals with osteoarthritis in the US. Adults with self-reported osteoarthritis (n = 1881) were identified using data from the 2010 Medical Expenditure Panel Survey (MEPS). Among those with osteoarthritis, chi-square tests and ordinary least squares (OLS) regressions were used to examine differences in healthcare expenditures between those with and without depression. A post-regression linear decomposition technique was used to estimate the relative contribution of the different constructs of Andersen's behavioral model, i.e., predisposing, enabling, need, personal healthcare practices, and external environment factors, to the excess expenditures associated with depression among individuals with osteoarthritis. All analyses accounted for the complex survey design of MEPS. Depression coexisted in 20.6 % of adults with osteoarthritis. The average total healthcare expenditures were $13,684 among adults with depression compared to $9284 among those without depression. Multivariable OLS regression revealed that adults with depression had 38.8 % higher healthcare expenditures (p < 0.05). The post-regression linear decomposition analysis indicated that 50 % of the difference in expenditures between adults with and without depression can be explained by differences in need factors. Among individuals with coexisting osteoarthritis and depression, excess healthcare expenditures associated with depression were mainly due to comorbid anxiety, chronic conditions and poor health status. These expenditures may potentially be reduced by providing timely intervention for need factors or by providing care under a collaborative care model.
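
    A hedged sketch of the post-regression linear decomposition idea, in the form of an Oaxaca-Blinder-type two-fold decomposition (the variables and data below are synthetic stand-ins, not the MEPS variables):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
depressed = rng.random(n) < 0.2
# Hypothetical covariates standing in for "need factors" etc.
comorbidity = rng.poisson(2 + depressed)          # depressed group: more comorbidity
age = rng.normal(65, 8, n)
expend = 5000 + 1500 * comorbidity + 30 * age + 2000 * depressed + rng.normal(0, 1500, n)

X = sm.add_constant(np.column_stack([comorbidity, age]))
b1 = sm.OLS(expend[depressed], X[depressed]).fit().params
b0 = sm.OLS(expend[~depressed], X[~depressed]).fit().params

xbar1, xbar0 = X[depressed].mean(axis=0), X[~depressed].mean(axis=0)
gap = expend[depressed].mean() - expend[~depressed].mean()
explained = (xbar1 - xbar0) @ b0          # gap due to covariate differences
unexplained = xbar1 @ (b1 - b0)           # gap due to coefficient differences
print(f"gap={gap:.0f}, explained={explained:.0f} ({explained/gap:.0%}), "
      f"unexplained={unexplained:.0f}")
```

    The identity gap = (x̄1 − x̄0)'b0 + x̄1'(b1 − b0) holds exactly because OLS with an intercept passes through the group means, which is what makes the "share explained by need factors" reading possible.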

  13. The effect of foreign aid on corruption: A quantile regression approach

    OpenAIRE

    Okada, Keisuke; Samreth, Sovannroeun

    2011-01-01

    This paper investigates the effect of foreign aid on corruption using a quantile regression method. Our estimation results illustrate that foreign aid generally lessens corruption and, in particular, its reduction effect is larger in countries with low levels of corruption. In addition, considering foreign aid by donors, our analysis indicates that while multilateral aid has a larger reduction impact on corruption, bilateral aid from the world’s leading donors, such as France, the United King...
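
    As an illustration of the estimation technique (not of the paper's data), quantile regression can be fit at several quantiles with statsmodels; the covariate's coefficient is allowed to differ across the conditional distribution, which is exactly what lets the aid effect vary with the level of corruption:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cross-country data: a 'corruption' index regressed on 'aid'.
rng = np.random.default_rng(2)
n = 300
aid = rng.gamma(2.0, 1.0, n)
corruption = 5 - 0.3 * aid - 0.2 * aid * rng.random(n) + rng.normal(0, 1, n)
df = pd.DataFrame({"corruption": corruption, "aid": aid})

model = smf.quantreg("corruption ~ aid", df)
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    res = model.fit(q=q)
    print(f"q={q:.2f}  aid coefficient = {res.params['aid']:+.3f}")
```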

  14. Performance and separation occurrence of binary probit regression estimators using the maximum likelihood method and Firth's approach under different sample sizes

    Science.gov (United States)

    Lusiana, Evellin Dewi

    2017-12-01

    The parameters of a binary probit regression model are commonly estimated by the maximum likelihood estimation (MLE) method. However, MLE has a limitation when the binary data contain separation. Separation is the condition where one or several independent variables exactly separate the categories of the binary response. It causes the MLE estimators to be non-convergent, so that they cannot be used in modeling. One way to resolve the separation problem is to use Firth's approach instead. This research has two aims: first, to compare the chance of separation occurring in the binary probit regression model between the MLE method and Firth's approach; second, to compare the performance of the binary probit regression estimators obtained by the MLE method and Firth's approach using the RMSE criterion. Both comparisons are performed using simulation, under different sample sizes. The results showed that the chance of separation occurring with the MLE method is higher than with Firth's approach for small sample sizes; for larger sample sizes, the probability decreases and is roughly identical for the two methods. Meanwhile, Firth's estimators have smaller RMSEs than the MLEs, especially for smaller sample sizes, whereas for larger sample sizes the RMSEs are not much different. This means that Firth's estimators outperform the MLE estimators.
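
    A minimal sketch of Firth's bias-reduced estimation, written for the logistic link, where the Jeffreys-prior score adjustment has a simple closed form (the paper studies the probit link; the toy data below are completely separated on purpose):

```python
import numpy as np

def firth_logistic(X, y, n_iter=100, tol=1e-8):
    """Bias-reduced (Firth) logistic regression.

    The usual score U(b) = X'(y - p) is replaced by the penalized score
    U*(b) = X'(y - p + h*(0.5 - p)), where h is the diagonal of the hat
    matrix W^{1/2} X (X'WX)^{-1} X' W^{1/2}. Estimates stay finite even
    under complete separation.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        info = X.T @ (X * W[:, None])             # Fisher information X'WX
        info_inv = np.linalg.inv(info)
        Xw = X * np.sqrt(W)[:, None]
        h = np.einsum("ij,jk,ik->i", Xw, info_inv, Xw)   # hat-matrix diagonal
        step = info_inv @ (X.T @ (y - p + h * (0.5 - p)))
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Completely separated toy data: x > 0 perfectly predicts y = 1, so plain
# MLE diverges, while the Firth estimate remains finite.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
X = np.column_stack([np.ones_like(x), x])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
print(firth_logistic(X, y))
```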

  15. AucPR: an AUC-based approach using penalized regression for disease prediction with high-dimensional omics data.

    Science.gov (United States)

    Yu, Wenbao; Park, Taesung

    2014-01-01

    It is common to seek an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works based on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of the AUC, with no work using a parametric AUC-based approach for high-dimensional data. We propose an AUC-based approach using penalized regression (AucPR), which is a parametric method for obtaining a linear combination that maximizes the AUC. To obtain the AUC maximizer in a high-dimensional context, we transform a classical parametric AUC maximizer, which is used in a low-dimensional context, into a regression framework and thus apply the penalized regression approach directly. Two kinds of penalization, lasso and elastic net, are considered. The parametric approach can avoid some of the difficulties of a conventional non-parametric AUC-based approach, such as the lack of an appropriate concave objective function and the prudent choice of a smoothing parameter. We apply the proposed AucPR to gene selection and classification using four real microarray datasets and synthetic data. Through numerical studies, AucPR is shown to perform better than penalized logistic regression and the non-parametric AUC-based method, in the sense of AUC and sensitivity for a given specificity, particularly when there are many correlated genes. We propose a powerful, parametric and easily implementable linear classifier, AucPR, for gene selection and disease prediction with high-dimensional data. AucPR is recommended for its good prediction performance. Besides gene expression microarray data, AucPR can be applied to other types of high-dimensional omics data, such as miRNA and protein data.
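
    AucPR itself maximizes the AUC directly; the sketch below only illustrates the elastic-net-penalized logistic baseline it is compared against, with a held-out AUC as the ranking score (synthetic high-dimensional data, scikit-learn assumed):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# High-dimensional toy data: many correlated, mostly uninformative "genes".
X, y = make_classification(n_samples=200, n_features=1000, n_informative=20,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Elastic-net penalized logistic regression as a sparse linear scoring
# function; AUC on held-out data measures its ranking performance.
clf = LogisticRegression(penalty="elasticnet", solver="saga", l1_ratio=0.5,
                         C=0.1, max_iter=5000).fit(X_tr, y_tr)
score = clf.decision_function(X_te)       # linear marker combination
print("test AUC:", round(roc_auc_score(y_te, score), 3))
print("selected markers:", int(np.sum(clf.coef_ != 0)))
```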

  16. Analysis of sparse data in logistic regression in medical research: A newer approach

    Directory of Open Access Journals (Sweden)

    S Devika

    2016-01-01

    Full Text Available Background and Objective: In the analysis of a dichotomous response variable, logistic regression is usually used. However, the performance of logistic regression in the presence of sparse data is questionable. In such a situation, a common problem is the presence of high odds ratios (ORs) with very wide 95% confidence intervals (CIs), e.g., OR > 999.999 with an unbounded CI. In this paper, we addressed this issue by using the penalized logistic regression (PLR) method. Materials and Methods: Data from a case-control study on hyponatremia and hiccups conducted in Christian Medical College, Vellore, Tamil Nadu, India were used. The outcome variable was the presence/absence of hiccups and the main exposure variable was hyponatremia status. Simulation datasets were created with different sample sizes and different numbers of covariates. Results: A total of 23 cases and 50 controls were used for the analysis with the ordinary and PLR methods. The main exposure variable, hyponatremia, was present in nine (39.13%) of the cases and in four (8.0%) of the controls. Of the 23 hiccup cases, all were males, and among the controls, 46 (92.0%) were males. Thus, the complete separation between gender and the disease group led to an infinite OR with an unbounded 95% CI under ordinary logistic regression, whereas there was a finite and consistent regression coefficient for gender (OR: 5.35; 95% CI: 0.42, 816.48) using PLR. After adjusting for all the confounding variables, hyponatremia entailed 7.9 (95% CI: 2.06, 38.86) times higher risk for the development of hiccups as found using PLR, whereas the risk was overestimated (OR: 10.76; 95% CI: 2.17, 53.41) using the conventional method. Simulation experiments show that the estimated coverage probability of this method is near the nominal level of 95% even for small sample sizes and for a large number of covariates. Conclusions: PLR is almost equal to ordinary logistic regression when the sample size is large and is superior in small cell

  17. Electronics lab instructors' approaches to troubleshooting instruction

    Science.gov (United States)

    Dounas-Frazer, Dimitri R.; Lewandowski, H. J.

    2017-06-01

    In this exploratory qualitative study, we describe instructors' self-reported practices for teaching and assessing students' ability to troubleshoot in electronics lab courses. We collected audio data from interviews with 20 electronics instructors from 18 institutions that varied by size, selectivity, and other factors. In addition to describing participants' instructional practices, we characterize their perceptions about the role of troubleshooting in electronics, the importance of the ability to troubleshoot more generally, and what it means for students to be competent troubleshooters. One major finding of this work is that, while almost all instructors in our study said that troubleshooting is an important learning outcome for students in electronics lab courses, only half of instructors said they directly assessed students' ability to troubleshoot. Based on our findings, we argue that there is a need for research-based instructional materials that attend to both cognitive and noncognitive aspects of troubleshooting proficiency. We also identify several areas for future investigation related to troubleshooting instruction in electronics lab courses.

  18. Using the mean approach in pooling cross-section and time series data for regression modelling

    International Nuclear Information System (INIS)

    Nuamah, N.N.N.N.

    1989-12-01

    The mean approach is one of the methods for pooling cross section and time series data for mathematical-statistical modelling. Though a simple approach, its results are sometimes paradoxical in nature. However, researchers still continue using it for its simplicity. Here, the paper investigates the nature and source of such unwanted phenomena. (author). 7 refs

  19. Antibiotic Resistances in Livestock: A Comparative Approach to Identify an Appropriate Regression Model for Count Data

    Directory of Open Access Journals (Sweden)

    Anke Hüls

    2017-05-01

    Full Text Available Antimicrobial resistance in livestock is a matter of general concern. To develop hygiene measures and methods for resistance prevention and control, epidemiological studies on a population level are needed to detect factors associated with antimicrobial resistance in livestock holdings. In general, regression models are used to describe these relationships between environmental factors and resistance outcomes. Besides the study design, the correlation structures of the different outcomes of antibiotic resistance and structural zero measurements on the resistance outcome as well as on the exposure side are challenges for the epidemiological model building process. The use of appropriate regression models that acknowledge these complexities is essential to assure valid epidemiological interpretations. The aims of this paper are (i) to explain the model building process by comparing several competing models for count data (negative binomial model, quasi-Poisson model, zero-inflated model, and hurdle model) and (ii) to compare these models using data from a cross-sectional study on antibiotic resistance in animal husbandry. These goals are essential for evaluating which model is most suitable to identify potential prevention measures. The dataset used as an example in our analyses was generated initially to study the prevalence of, and factors associated with, the appearance of cefotaxime-resistant Escherichia coli in 48 German fattening pig farms. For each farm, the outcome was the count of samples with resistant bacteria. There was almost no overdispersion and only moderate evidence of excess zeros in the data. Our analyses show that it is essential to evaluate regression models in studies analyzing the relationship between environmental factors and antibiotic resistance in livestock. After model comparison based on evaluation of model predictions, the Akaike information criterion, and Pearson residuals, the hurdle model was judged to be the most appropriate
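
    A hedged sketch of such a model comparison with statsmodels on synthetic farm-level counts, comparing Poisson, negative binomial, and zero-inflated Poisson fits by AIC (a hurdle model, the paper's eventual choice, can be fit analogously with HurdleCountModel in recent statsmodels versions):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(3)
n = 48                                        # e.g. one row per farm
x = rng.normal(size=n)
X = sm.add_constant(x)
mu = np.exp(0.5 + 0.8 * x)
y = rng.poisson(mu) * (rng.random(n) > 0.2)   # Poisson counts with excess zeros

fits = {
    "Poisson": sm.Poisson(y, X).fit(disp=0),
    "NegBin": sm.NegativeBinomial(y, X).fit(disp=0),
    "ZIP": ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=0),
}
for name, res in fits.items():
    print(f"{name:8s} AIC = {res.aic:8.2f}")   # lower AIC = preferred model
```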

  20. Identifying the Safety Factors over Traffic Signs in State Roads using a Panel Quantile Regression Approach.

    Science.gov (United States)

    Šarić, Željko; Xu, Xuecai; Duan, Li; Babić, Darko

    2018-06-20

    This study intended to investigate the interactions between accident rate and traffic signs on state roads located in Croatia, and to accommodate the heterogeneity attributed to unobserved factors. Data from 130 state roads between 2012 and 2016 were collected from the Traffic Accident Database System maintained by the Republic of Croatia Ministry of the Interior. To address the heterogeneity, a panel quantile regression model was proposed, in which the quantile regression model offers a more complete view and a highly comprehensive analysis of the relationship between accident rate and traffic signs, while the panel data model accommodates the heterogeneity attributed to unobserved factors. Results revealed that (1) low visibility increased the rate of accidents with material damage (MD) and death or injury (DI); (2) the number of mandatory signs and the number of warning signs were more likely to reduce the accident rate; (3) average speed limit and the number of invalid traffic signs per km were associated with a high accident rate. To our knowledge, this is the first attempt to analyze the interactions between accident consequences and traffic signs by employing a panel quantile regression model. By involving visibility, the present study demonstrates that low visibility causes a relatively higher risk of MD and DI accidents. It is also noteworthy that average speed limit corresponds positively with the accident rate, that the numbers of mandatory and warning signs are more likely to reduce the accident rate, and that the number of invalid traffic signs per km is significant for the accident rate; thus regular maintenance should be kept up for a safer roadway environment.

  1. The N-shaped environmental Kuznets curve: an empirical evaluation using a panel quantile regression approach.

    Science.gov (United States)

    Allard, Alexandra; Takman, Johanna; Uddin, Gazi Salah; Ahmed, Ali

    2018-02-01

    We evaluate the N-shaped environmental Kuznets curve (EKC) using panel quantile regression analysis. We investigate the relationship between CO2 emissions and GDP per capita for 74 countries over the period 1994-2012. We include additional explanatory variables, such as renewable energy consumption, technological development, trade, and institutional quality. We find evidence for the N-shaped EKC in all income groups, except for the upper-middle-income countries. Heterogeneous characteristics are, however, observed over the N-shaped EKC. Finally, we find a negative relationship between renewable energy consumption and CO2 emissions, which highlights the importance of promoting greener energy in order to combat global warming.

  2. Regional trends in short-duration precipitation extremes: a flexible multivariate monotone quantile regression approach

    Science.gov (United States)

    Cannon, Alex

    2017-04-01

    Estimating historical trends in short-duration rainfall extremes at regional and local scales is challenging due to low signal-to-noise ratios and the limited availability of homogenized observational data. In addition to being of scientific interest, trends in rainfall extremes are of practical importance, as their presence calls into question the stationarity assumptions that underpin traditional engineering and infrastructure design practice. Even with these fundamental challenges, increasingly complex questions are being asked about time series of extremes. For instance, users may not only want to know whether or not rainfall extremes have changed over time, they may also want information on the modulation of trends by large-scale climate modes or on the nonstationarity of trends (e.g., identifying hiatus periods or periods of accelerating positive trends). Efforts have thus been devoted to the development and application of more robust and powerful statistical estimators for regional and local scale trends. While a standard nonparametric method like the regional Mann-Kendall test, which tests for the presence of monotonic trends (i.e., strictly non-decreasing or non-increasing changes), makes fewer assumptions than parametric methods and pools information from stations within a region, it is not designed to visualize detected trends, include information from covariates, or answer questions about the rate of change in trends. As a remedy, monotone quantile regression (MQR) has been developed as a nonparametric alternative that can be used to estimate a common monotonic trend in extremes at multiple stations. Quantile regression makes efficient use of data by directly estimating conditional quantiles based on information from all rainfall data in a region, i.e., without having to precompute the sample quantiles. The MQR method is also flexible and can be used to visualize and analyze the nonlinearity of the detected trend. However, it is fundamentally a

  3. THE GENDER PAY GAP IN VIETNAM, 1993-2002: A QUANTILE REGRESSION APPROACH

    OpenAIRE

    Pham, Hung T; Reilly, Barry

    2007-01-01

    This paper uses mean and quantile regression analysis to investigate the gender pay gap for the wage employed in Vietnam over the period 1993 to 2002. It finds that the Doi moi reforms appear to have been associated with a sharp reduction in gender pay gap disparities for the wage employed. The average gender pay gap in this sector halved between 1993 and 2002 with most of the contraction evident by 1998. There has also been a narrowing in the gender pay gap at most selected points of the con...

  4. The Gender Pay Gap In Vietnam, 1993-2002: A Quantile Regression Approach

    OpenAIRE

    Barry Reilly & T. Hung Pham

    2006-01-01

    This paper uses mean and quantile regression analysis to investigate the gender pay gap for the wage employed in Vietnam over the period 1993 to 2002. It finds that the Doi moi reforms have been associated with a sharp reduction in gender wage disparities for the wage employed. The average gender pay gap in this sector halved between 1993 and 2002 with most of the contraction evident by 1998. There has also been a contraction in the gender pay gap at most selected points of the conditional wage d...

  5. Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2010-08-01

    Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model that consists of different equations; it uses numerical, deterministic equations that follow the rules of physics. GCMs are a main tool for predicting climate and weather, and are also used as a primary information source for reviewing climate change effects. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM with the small scale (the study area). GCM data are spatial and temporal data in which spatial correlation between different data on the grid in a single domain is likely to occur. Multicollinearity problems require pre-processing of the predictor data X. Continuum Regression (CR) with Principal Component Analysis (PCA) pre-processing is an alternative for SD modelling. CR is a method developed by Stone and Brooks (1990); it is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, used to overcome multicollinearity problems. Data processing for the stations in Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that the RMSEP and predictive R2 values in the 8x8 and 12x12 domains obtained with the CR method are better than those obtained with PCR and PLS.
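
    Continuum regression is not available in standard Python libraries, but its PCR special case, with PCA as the pre-processing step that removes the multicollinearity, can be sketched with scikit-learn (synthetic collinear predictors; PLSRegression from sklearn.cross_decomposition would cover the PLS end of the continuum):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# GCM-like grid predictors: neighbouring grid cells are highly correlated.
rng = np.random.default_rng(4)
base = rng.normal(size=(200, 1))
X = base + 0.1 * rng.normal(size=(200, 64))   # 64 strongly correlated "grid cells"
y = 2 * base[:, 0] + rng.normal(0, 0.5, 200)

# PCA pre-processing removes the multicollinearity before the regression step.
pcr = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
print("CV R^2:", cross_val_score(pcr, X, y, cv=5).round(2))
```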

  6. A Gaussian mixture copula model based localized Gaussian process regression approach for long-term wind speed prediction

    International Nuclear Information System (INIS)

    Yu, Jie; Chen, Kuilin; Mori, Junichi; Rashid, Mudassir M.

    2013-01-01

    Optimizing wind power generation and controlling the operation of wind turbines to efficiently harness the renewable wind energy is a challenging task due to the intermittency and unpredictable nature of wind speed, which has significant influence on wind power production. A new approach for long-term wind speed forecasting is developed in this study by integrating GMCM (Gaussian mixture copula model) and localized GPR (Gaussian process regression). The time series of wind speed is first classified into multiple non-Gaussian components through the Gaussian mixture copula model and then Bayesian inference strategy is employed to incorporate the various non-Gaussian components using the posterior probabilities. Further, the localized Gaussian process regression models corresponding to different non-Gaussian components are built to characterize the stochastic uncertainty and non-stationary seasonality of the wind speed data. The various localized GPR models are integrated through the posterior probabilities as the weightings so that a global predictive model is developed for the prediction of wind speed. The proposed GMCM–GPR approach is demonstrated using wind speed data from various wind farm locations and compared against the GMCM-based ARIMA (auto-regressive integrated moving average) and SVR (support vector regression) methods. In contrast to GMCM–ARIMA and GMCM–SVR methods, the proposed GMCM–GPR model is able to well characterize the multi-seasonality and uncertainty of wind speed series for accurate long-term prediction. - Highlights: • A novel predictive modeling method is proposed for long-term wind speed forecasting. • Gaussian mixture copula model is estimated to characterize the multi-seasonality. • Localized Gaussian process regression models can deal with the random uncertainty. • Multiple GPR models are integrated through Bayesian inference strategy. • The proposed approach shows higher prediction accuracy and reliability
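
    A hedged sketch of the localized-GPR idea with scikit-learn, substituting a plain Gaussian mixture for the paper's Gaussian mixture *copula* (toy seasonal data; the component count and kernels are illustrative):

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
t = np.linspace(0, 20, 400)[:, None]
wind = 8 + 3 * np.sin(t[:, 0]) + rng.normal(0, 0.8, 400)   # toy wind series

# Step 1: soft-partition the series into components (GMM stands in for GMCM).
gmm = GaussianMixture(n_components=2, random_state=0).fit(wind[:, None])
post = gmm.predict_proba(wind[:, None])           # posterior responsibilities

# Step 2: one localized GPR per component, fit on the points it "owns".
gprs = []
for c in range(gmm.n_components):
    idx = post[:, c] > 0.5
    gpr = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(0.5))
    gprs.append(gpr.fit(t[idx], wind[idx]))

# Step 3: global prediction = posterior-weighted combination of local GPRs.
pred = sum(post[:, c] * gprs[c].predict(t) for c in range(gmm.n_components))
print("in-sample RMSE:", np.sqrt(np.mean((pred - wind) ** 2)).round(3))
```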

  7. NASA and COTS Electronics: Past Approach and Successes - Future Considerations

    Science.gov (United States)

    LaBel, Kenneth A.

    2018-01-01

    NASA has a long history of using commercial grade electronics in space. In this talk, a brief history of NASA's trends and approaches to commercial grade electronics focusing on processing and memory systems will be presented. This will include providing summary information on the space hazards to electronics as well as NASA mission trade space. We will also discuss developing recommendations for risk management approaches to Electrical, Electronic and Electromechanical (EEE) parts and reliability in space. The final portion of the talk will discuss emerging aerospace trends and the future for Commercial Off The Shelf (COTS) usage.

  8. Regression models for categorical, count, and related variables an applied approach

    CERN Document Server

    Hoffmann, John P

    2016-01-01

    Social science and behavioral science students and researchers are often confronted with data that are categorical, count a phenomenon, or have been collected over time. Sociologists examining the likelihood of interracial marriage, political scientists studying voting behavior, criminologists counting the number of offenses people commit, health scientists studying the number of suicides across neighborhoods, and psychologists modeling mental health treatment success are all interested in outcomes that are not continuous. Instead, they must measure and analyze these events and phenomena in a discrete manner.   This book provides an introduction and overview of several statistical models designed for these types of outcomes--all presented with the assumption that the reader has only a good working knowledge of elementary algebra and has taken introductory statistics and linear regression analysis.   Numerous examples from the social sciences demonstrate the practical applications of these models. The chapte...

  9. Key factors contributing to accident severity rate in construction industry in Iran: a regression modelling approach.

    Science.gov (United States)

    Soltanzadeh, Ahmad; Mohammadfam, Iraj; Moghimbeigi, Abbas; Ghiasvand, Reza

    2016-03-01

    Construction industry involves the highest risk of occupational accidents and bodily injuries, which range from mild to very severe. The aim of this cross-sectional study was to identify the factors associated with accident severity rate (ASR) in the largest Iranian construction companies based on data about 500 occupational accidents recorded from 2009 to 2013. We also gathered data on safety and health risk management and training systems. Data were analysed using Pearson's chi-squared coefficient and multiple regression analysis. Median ASR (and the interquartile range) was 107.50 (57.24-381.25). Fourteen of the 24 studied factors stood out as most affecting construction accident severity (p < 0.05). These findings can be applied in the design and implementation of a comprehensive safety and health risk management system to reduce ASR.

  10. Flow modeling in a porous cylinder with regressing walls using semi analytical approach

    Directory of Open Access Journals (Sweden)

    M Azimi

    2016-10-01

    Full Text Available In this paper, the mathematical modeling of the flow in a porous cylinder with a focus on applications to solid rocket motors is presented. As usual, the cylindrical propellant grain of a solid rocket motor is modeled as a long tube with one end closed at the headwall, while the other remains open. The cylindrical wall is assumed to be permeable so as to simulate propellant burning and normal gas injection. First, the problem description and formulation are considered. The Navier-Stokes equations for the viscous flow in a porous cylinder with regressing walls are reduced to a nonlinear ODE by using a similarity transformation in time and space. The Differential Transformation Method (DTM) has then been successfully applied as an approximate analytical method. Finally, the results are presented for various cases.

  11. In search of a corrected prescription drug elasticity estimate: a meta-regression approach.

    Science.gov (United States)

    Gemmill, Marin C; Costa-Font, Joan; McGuire, Alistair

    2007-06-01

    An understanding of the relationship between cost sharing and drug consumption depends on consistent and unbiased price elasticity estimates. However, there is wide heterogeneity among studies, which constrains the applicability of elasticity estimates for empirical purposes and policy simulation. This paper attempts to provide a corrected measure of the drug price elasticity by employing meta-regression analysis (MRA). The results indicate that the elasticity estimates are significantly different from zero, and the corrected elasticity is -0.209 when the results are made robust to heteroskedasticity and clustering of observations. Elasticity values are higher when the study was published in an economic journal, when the study employed a greater number of observations, and when the study used aggregate data. Elasticity estimates are lower when the institutional setting was a tax-based health insurance system.
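
    A minimal sketch of the meta-regression machinery: study-level elasticity estimates regressed on study characteristics with inverse-variance weights, the standard MRA weighting (all variables and values are invented, not the paper's dataset):

```python
import numpy as np
import statsmodels.api as sm

# Toy study-level data: elasticity estimates with standard errors and
# study characteristics (hypothetical moderators).
rng = np.random.default_rng(6)
k = 40
econ_journal = rng.integers(0, 2, k)          # published in an economics journal?
log_n_obs = rng.normal(8, 1, k)               # log number of observations
se = rng.uniform(0.02, 0.15, k)               # reported standard errors
elasticity = -0.21 - 0.05 * econ_journal - 0.02 * (log_n_obs - 8) + rng.normal(0, se)

X = sm.add_constant(np.column_stack([econ_journal, log_n_obs]))
# Weight each estimate by its inverse variance; robust SEs guard against
# residual heteroskedasticity across studies.
mra = sm.WLS(elasticity, X, weights=1.0 / se**2).fit(cov_type="HC1")
print(mra.params)    # a "corrected" elasticity is read off the fitted coefficients
```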

  12. Casemix funding for a specialist paediatrics hospital: a hedonic regression approach.

    Science.gov (United States)

    Bridges, J F; Hanson, R M

    2000-01-01

    This paper inquires into the effects that Diagnosis Related Groups (DRGs) have had on the ability to explain patient-level costs in a specialist paediatrics hospital. Two hedonic models are estimated using 1996/97 New Children's Hospital (NCH) patient level cost data, one with and one without a casemix index (CMI). The results show that the inclusion of a casemix index as an explanatory variable leads to a better accounting of cost. The full hedonic model is then used to simulate a funding model for the 1997/98 NCH cost data. These costs are highly correlated with the actual costs reported for that year. In addition, univariate regression indicates that there has been inflation in costs in the order of 4.8% between the two years. In conclusion, hedonic analysis can provide valuable evidence for the design of funding models that account for casemix.

  13. How efficient are referral hospitals in Uganda? A data envelopment analysis and tobit regression approach.

    Science.gov (United States)

    Mujasi, Paschal N; Asbu, Eyob Z; Puig-Junoy, Jaume

    2016-07-08

    Hospitals represent a significant proportion of health expenditures in Uganda, accounting for about 26 % of total health expenditure. Improving the technical efficiency of hospitals in Uganda can result in large savings which can be devoted to expanding access to services and improving quality of care. This paper explores the technical efficiency of referral hospitals in Uganda during the 2012/2013 financial year. This was a cross-sectional study using secondary data. Input and output data were obtained from the Uganda Ministry of Health annual health sector performance report for the period July 1, 2012 to June 30, 2013 for the 14 public sector regional referral hospitals and 4 large private not-for-profit hospitals. We assumed an output-oriented model with variable returns to scale to estimate the efficiency score for each hospital using Data Envelopment Analysis (DEA) with STATA13. Using a Tobit model, the DEA efficiency scores were regressed against selected institutional and contextual/environmental factors to estimate their impacts on efficiency. The average variable returns to scale (pure) technical efficiency score was 91.4 %, the average scale efficiency score was 87.1 %, and the average constant returns to scale technical efficiency score was 79.4 %. Technically inefficient hospitals could have become more efficient by increasing outpatient department visits by 45,943 and inpatient days by 31,425 without changing the total number of inputs. Alternatively, they could have achieved efficiency by, for example, transferring the excess 216 medical staff and 454 beds to other levels of the health system without changing the total number of outputs. Tobit regression indicates that hospital size is a significant factor in explaining hospital efficiency.
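
    Output-oriented VRS DEA reduces to one linear program per hospital, which can be sketched with scipy (toy inputs/outputs below, not the Ugandan data):

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_vrs(X, Y):
    """Output-oriented DEA efficiency under variable returns to scale.

    X: (n, m) inputs, Y: (n, s) outputs. Returns phi >= 1 per unit;
    phi = 1 means technically efficient, phi = 1.2 means all outputs
    could expand by 20% with no extra inputs.
    """
    n, m = X.shape
    s = Y.shape[1]
    phis = np.empty(n)
    for o in range(n):                               # one LP per hospital o
        c = np.zeros(1 + n); c[0] = -1.0             # maximize phi
        A_ub, b_ub = [], []
        for i in range(m):                           # sum_j lam_j x_ij <= x_io
            A_ub.append(np.r_[0.0, X[:, i]]); b_ub.append(X[o, i])
        for r in range(s):                           # phi*y_ro <= sum_j lam_j y_rj
            A_ub.append(np.r_[Y[o, r], -Y[:, r]]); b_ub.append(0.0)
        A_eq = [np.r_[0.0, np.ones(n)]]              # VRS: sum_j lam_j = 1
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      A_eq=np.array(A_eq), b_eq=[1.0],
                      bounds=[(1.0, None)] + [(0.0, None)] * n,
                      method="highs")
        phis[o] = res.x[0]
    return phis

# Toy: inputs = [staff, beds], outputs = [outpatient visits, inpatient days]
X = np.array([[120, 200], [150, 260], [90, 150], [200, 300]], float)
Y = np.array([[50000, 30000], [52000, 31000], [40000, 26000], [60000, 33000]], float)
print(dea_output_vrs(X, Y).round(3))
```

    The second-stage Tobit regression of the resulting scores on environmental factors is a separate, standard censored-regression step.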

  14. Identifying multiple outliers in linear regression: robust fit and clustering approach

    International Nuclear Information System (INIS)

    Robiah Adnan; Mohd Nor Mohamad; Halim Setan

    2001-01-01

    This research provides a clustering-based approach for determining potential candidates for outliers. It is a modification of the method proposed by Sebert et al. (1998), and is based on using the single linkage clustering algorithm to group the standardized predicted and residual values of the data set fitted by least trimmed squares (LTS). (Author)
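
    A hedged sketch of the method's flow with off-the-shelf tools: scikit-learn has no LTS estimator, so RANSAC stands in for the robust fit, followed by single-linkage clustering of the standardized predicted values and residuals (the cluster-cut height is illustrative, not the rule from the paper):

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor, LinearRegression
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 2))
y = 3 + X @ np.array([2.0, -1.0]) + rng.normal(0, 0.5, 100)
y[:6] += 8                                   # plant a group of outliers

# Robust fit (RANSAC here; the original method uses least trimmed squares).
robust = RANSACRegressor(LinearRegression(), random_state=0).fit(X, y)
pred = robust.predict(X)
resid = y - pred

# Standardize (prediction, residual) pairs and single-linkage cluster them.
feats = np.column_stack([(pred - pred.mean()) / pred.std(),
                         (resid - resid.mean()) / resid.std()])
link = linkage(feats, method="single")
labels = fcluster(link, t=1.0, criterion="distance")

# Points outside the dominant cluster are the outlier candidates.
main = np.bincount(labels).argmax()
print("flagged observations:", np.flatnonzero(labels != main))
```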

  15. Electronic waste management approaches: an overview.

    Science.gov (United States)

    Kiddee, Peeranart; Naidu, Ravi; Wong, Ming H

    2013-05-01

    Electronic waste (e-waste) is one of the fastest-growing pollution problems worldwide given the presence of a variety of toxic substances which can contaminate the environment and threaten human health, if disposal protocols are not meticulously managed. This paper presents an overview of toxic substances present in e-waste, their potential environmental and human health impacts together with management strategies currently being used in certain countries. Several tools including life cycle assessment (LCA), material flow analysis (MFA), multi criteria analysis (MCA) and extended producer responsibility (EPR) have been developed to manage e-wastes especially in developed countries. The key to success in terms of e-waste management is to develop eco-design devices, properly collect e-waste, recover and recycle material by safe methods, dispose of e-waste by suitable techniques, forbid the transfer of used electronic devices to developing countries, and raise awareness of the impact of e-waste. No single tool is adequate but together they can complement each other to solve this issue. A national scheme such as EPR is a good policy in solving the growing e-waste problems. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  16. City housing atmospheric pollutant impact on emergency visit for asthma: A classification and regression tree approach.

    Science.gov (United States)

    Mazenq, Julie; Dubus, Jean-Christophe; Gaudart, Jean; Charpin, Denis; Viudes, Gilles; Noel, Guilhem

    2017-11-01

    Particulate matter, nitrogen dioxide (NO2) and ozone are recognized as the three pollutants that most significantly affect human health. Asthma is a multifactorial disease. However, the place of residence has rarely been investigated. We compared the impact of air pollution, measured near patients' homes, on emergency department (ED) visits for asthma or trauma (controls) within the Provence-Alpes-Côte-d'Azur region. Variables were selected using classification and regression trees on the asthmatic and control populations, aged 3-99 years, visiting the ED from January 1 to December 31, 2013. Then, in a nested case-control study, randomization was based on the day of the ED visit and on defined age groups. Pollution, meteorological, pollen and viral data measured that day were linked to the patient's ZIP code. A total of 794,884 visits were reported, including 6250 for asthma and 278,192 for trauma. Factors associated with an excess risk of an emergency visit for asthma included short-term exposure to NO2, female gender, high viral load and a combination of low temperature and high humidity. Short-term exposures to high NO2 concentrations, as assessed close to the homes of the patients, were significantly associated with asthma-related ED visits in children and adults. Copyright © 2017 Elsevier Ltd. All rights reserved.
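
    A minimal CART sketch with scikit-learn on synthetic visit-level data; the tree's printed rules are the kind of threshold-based splits such a study reports (variable names and effect sizes are invented, not the study's):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy visit-level data: pollutant exposure and weather near the patient's
# home, with a binary outcome (asthma visit vs trauma control).
rng = np.random.default_rng(8)
n = 4000
no2 = rng.gamma(4, 8, n)                      # short-term NO2, ug/m3
temp = rng.normal(12, 7, n)
humidity = rng.uniform(30, 100, n)
risk = 1 / (1 + np.exp(-(0.03 * (no2 - 40) + 0.02 * (humidity - 70) - 0.05 * temp)))
asthma = (rng.random(n) < risk).astype(int)

X = np.column_stack([no2, temp, humidity])
cart = DecisionTreeClassifier(max_depth=3, min_samples_leaf=100).fit(X, asthma)
print(export_text(cart, feature_names=["NO2", "temp", "humidity"]))
```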

  17. Relative Age in School and Suicide among Young Individuals in Japan: A Regression Discontinuity Approach.

    Directory of Open Access Journals (Sweden)

    Tetsuya Matsubayashi

    Full Text Available Evidence collected in many parts of the world suggests that, compared to older students, students who are relatively younger at school entry tend to have worse academic performance and lower levels of income. This study examined how relative age in a grade affects suicide rates of adolescents and young adults between 15 and 25 years of age using data from Japan.We examined individual death records in the Vital Statistics of Japan from 1989 to 2010. In contrast to other countries, late entry to primary school is not allowed in Japan. We took advantage of the school entry cutoff date to implement a regression discontinuity (RD) design, assuming that the timing of births around the school entry cutoff date was randomly determined and therefore that individuals who were born just before and after the cutoff date have similar baseline characteristics.We found that those who were born right before the school cutoff day and thus youngest in their cohort have higher mortality rates by suicide, compared to their peers who were born right after the cutoff date and thus older. We also found that those with relative age disadvantage tend to follow a different career path than those with relative age advantage, which may explain their higher suicide mortality rates.Relative age effects have broader consequences than was previously supposed. This study suggests that policy intervention that alleviates the relative age effect can be important.
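
    A minimal sketch of the sharp RD estimate: local linear fits with separate slopes on each side of the cutoff, with the jump at zero as the effect of interest (synthetic data; the bandwidth choice is illustrative):

```python
import numpy as np
import statsmodels.api as sm

# Sharp RD sketch: birth date relative to the school-entry cutoff (running
# variable, in days) and an outcome; the jump at zero is the RD estimate.
rng = np.random.default_rng(9)
days = rng.uniform(-180, 180, 5000)          # birth timing around the cutoff
young = days < 0                             # born just before cutoff: youngest in cohort
outcome = 10 + 0.005 * days + 1.5 * young + rng.normal(0, 2, 5000)

h = 60                                       # bandwidth in days
inside = np.abs(days) < h
d, o, g = days[inside], outcome[inside], young[inside]
# Local linear fit with separate slopes on each side of the cutoff.
X = sm.add_constant(np.column_stack([g, d, g * d]))
fit = sm.OLS(o, X).fit(cov_type="HC1")
print("estimated jump at cutoff:", fit.params[1].round(3))
```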

  18. Global approaches to regulating electronic cigarettes

    OpenAIRE

    Kennedy, Ryan David; Awopegba, Ayodeji; De León, Elaine; Cohen, Joanna E

    2016-01-01

    Objectives Classify and describe the policy approaches used by countries to regulate e-cigarettes. Methods National policies regulating e-cigarettes were identified by (1) conducting web searches on Ministry of Health websites, and (2) broad web searches. The mechanisms used to regulate e-cigarettes were classified as new/amended laws, or existing laws. The policy domains identified include restrictions or prohibitions on product: sale, manufacturing, importation, distribution, use, product d...

  19. A kernel regression approach to gene-gene interaction detection for case-control studies.

    Science.gov (United States)

    Larson, Nicholas B; Schaid, Daniel J

    2013-11-01

    Gene-gene interactions are increasingly being addressed as a potentially important contributor to the variability of complex traits. Consequently, attention has moved beyond single-locus analysis of association to more complex genetic models. Although several single-marker approaches toward interaction analysis have been developed, such methods suffer from very high testing dimensionality and do not take advantage of existing information, notably the definition of genes as functional units. Here, we propose a comprehensive family of gene-level score tests for identifying genetic elements of disease risk, in particular pairwise gene-gene interactions. Using kernel machine methods, we devise score-based variance component tests under a generalized linear mixed model framework. We conducted simulations based upon coalescent genetic models to evaluate the performance of our approach under a variety of disease models. These simulations indicate that our methods are generally higher powered than alternative gene-level approaches and at worst competitive with exhaustive SNP-level (where SNP is single-nucleotide polymorphism) analyses. Furthermore, we observe that simulated epistatic effects resulted in significant marginal testing results for the involved genes regardless of whether or not true main effects were present. We detail the benefits of our methods and discuss potential genome-wide analysis strategies for gene-gene interaction analysis in a case-control study design. © 2013 WILEY PERIODICALS, INC.
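
    The paper's score-based variance-component tests are not reproduced here; the toy comparison below only illustrates the kernel idea, with a degree-2 polynomial kernel capturing gene-gene products that a linear (additive) kernel misses (scikit-learn, synthetic SNP dosages):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

# Two "genes", each a block of SNP dosages (0/1/2); the trait depends on a
# pure gene-gene interaction with no marginal main effects.
rng = np.random.default_rng(14)
n = 500
geneA = rng.integers(0, 3, size=(n, 10)).astype(float)
geneB = rng.integers(0, 3, size=(n, 10)).astype(float)
y = (geneA[:, 0] - 1) * (geneB[:, 0] - 1) + rng.normal(0, 0.5, n)

X = np.hstack([geneA, geneB])
additive = KernelRidge(kernel="linear", alpha=1.0)          # main effects only
interact = KernelRidge(kernel="poly", degree=2, alpha=1.0)  # pairwise products too

for name, model in [("linear kernel", additive), ("degree-2 poly", interact)]:
    r2 = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:14s} CV R^2 = {r2:+.2f}")
```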

  20. Electronic waste management approaches: An overview

    Energy Technology Data Exchange (ETDEWEB)

    Kiddee, Peeranart [Centre for Environmental Risk Assessment and Remediation, University of South Australia, Mawson Lakes Campus, Adelaide, SA 5095 (Australia); Cooperative Research Centre for Contamination Assessment and Remediation of the Environment, Mawson Lakes Campus, Adelaide, SA 5095 (Australia); Naidu, Ravi, E-mail: ravi.naidu@crccare.com [Centre for Environmental Risk Assessment and Remediation, University of South Australia, Mawson Lakes Campus, Adelaide, SA 5095 (Australia); Cooperative Research Centre for Contamination Assessment and Remediation of the Environment, Mawson Lakes Campus, Adelaide, SA 5095 (Australia); Wong, Ming H. [Croucher Institute for Environmental Sciences, Department of Biology, Hong Kong Baptist University, Kowloon Tong (China)

    2013-05-15

    Highlights: ► Human toxicity of hazardous substances in e-waste. ► Environmental impacts of e-waste from disposal processes. ► Life Cycle Assessment (LCA), Material Flow Analysis (MFA), Multi Criteria Analysis (MCA) and Extended Producer Responsibility (EPR) to manage and solve e-waste problems. ► Key issues relating to tools managing e-waste for sustainable e-waste management. - Abstract: Electronic waste (e-waste) is one of the fastest-growing pollution problems worldwide given the presence of a variety of toxic substances which can contaminate the environment and threaten human health, if disposal protocols are not meticulously managed. This paper presents an overview of toxic substances present in e-waste, their potential environmental and human health impacts together with management strategies currently being used in certain countries. Several tools including Life Cycle Assessment (LCA), Material Flow Analysis (MFA), Multi Criteria Analysis (MCA) and Extended Producer Responsibility (EPR) have been developed to manage e-wastes especially in developed countries. The key to success in terms of e-waste management is to develop eco-design devices, properly collect e-waste, recover and recycle material by safe methods, dispose of e-waste by suitable techniques, forbid the transfer of used electronic devices to developing countries, and raise awareness of the impact of e-waste. No single tool is adequate but together they can complement each other to solve this issue. A national scheme such as EPR is a good policy in solving the growing e-waste problems.

  1. Electronic waste management approaches: An overview

    International Nuclear Information System (INIS)

    Kiddee, Peeranart; Naidu, Ravi; Wong, Ming H.

    2013-01-01

    Highlights: ► Human toxicity of hazardous substances in e-waste. ► Environmental impacts of e-waste from disposal processes. ► Life Cycle Assessment (LCA), Material Flow Analysis (MFA), Multi Criteria Analysis (MCA) and Extended Producer Responsibility (EPR) to manage and solve e-waste problems. ► Key issues relating to tools managing e-waste for sustainable e-waste management. - Abstract: Electronic waste (e-waste) is one of the fastest-growing pollution problems worldwide given the presence of a variety of toxic substances which can contaminate the environment and threaten human health, if disposal protocols are not meticulously managed. This paper presents an overview of toxic substances present in e-waste, their potential environmental and human health impacts together with management strategies currently being used in certain countries. Several tools including Life Cycle Assessment (LCA), Material Flow Analysis (MFA), Multi Criteria Analysis (MCA) and Extended Producer Responsibility (EPR) have been developed to manage e-wastes especially in developed countries. The key to success in terms of e-waste management is to develop eco-design devices, properly collect e-waste, recover and recycle material by safe methods, dispose of e-waste by suitable techniques, forbid the transfer of used electronic devices to developing countries, and raise awareness of the impact of e-waste. No single tool is adequate but together they can complement each other to solve this issue. A national scheme such as EPR is a good policy in solving the growing e-waste problems.

  2. Direct energy rebound effect for road passenger transport in China: A dynamic panel quantile regression approach

    International Nuclear Information System (INIS)

    Zhang, Yue-Jun; Peng, Hua-Rong; Liu, Zhao; Tan, Weiping

    2015-01-01

    The transport sector appears a main energy consumer in China and plays a significant role in energy conservation. Improving energy efficiency proves an effective way to reduce energy consumption in transport sector, whereas its effectiveness may be affected by the rebound effect. This paper proposes a dynamic panel quantile regression model to estimate the direct energy rebound effect for road passenger transport in the whole country, eastern, central and western China, respectively, based on the data of 30 provinces from 2003 to 2012. The empirical results reveal that, first of all, the direct rebound effect does exist for road passenger transport and on the whole country, the short-term and long-term direct rebound effects are 25.53% and 26.56% on average, respectively. Second, the direct rebound effect for road passenger transport in central and eastern China tends to decrease, increase and then decrease again, whereas that in western China decreases and then increases, with the increasing passenger kilometers. Finally, when implementing energy efficiency policy in road passenger transport sector, the effectiveness of energy conservation in western China proves much better than that in central China overall, while the effectiveness in central China is relatively better than that in eastern China. - Highlights: • The direct rebound effect (RE) for road passenger transport in China is estimated. • The direct RE in the whole country, eastern, central, and western China is analyzed. • The short and long-term direct REs are 25.53% and 26.56% within the sample period. • Western China has better energy-saving performance than central and eastern China.

  3. Consistency analysis of subspace identification methods based on a linear regression approach

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2001-01-01

    In the literature, results can be found which claim consistency for the subspace method under certain quite weak assumptions. Unfortunately, a new result gives a counterexample showing inconsistency under these assumptions and then gives new, more strict sufficient assumptions which, however, do not include important model structures such as Box-Jenkins. Based on a simple least squares approach, this paper shows the possible inconsistency under the weak assumptions and develops only slightly stricter assumptions sufficient for consistency which include any model structure...

  4. Short-term wind speed prediction using an unscented Kalman filter based state-space support vector regression approach

    International Nuclear Information System (INIS)

    Chen, Kuilin; Yu, Jie

    2014-01-01

    Highlights: • A novel hybrid modeling method is proposed for short-term wind speed forecasting. • Support vector regression model is constructed to formulate nonlinear state-space framework. • Unscented Kalman filter is adopted to recursively update states under random uncertainty. • The new SVR–UKF approach is compared to several conventional methods for short-term wind speed prediction. • The proposed method demonstrates higher prediction accuracy and reliability. - Abstract: Accurate wind speed forecasting is becoming increasingly important to improve and optimize renewable wind power generation. Particularly, reliable short-term wind speed prediction can enable model predictive control of wind turbines and real-time optimization of wind farm operation. However, this task remains challenging due to the strong stochastic nature and dynamic uncertainty of wind speed. In this study, unscented Kalman filter (UKF) is integrated with support vector regression (SVR) based state-space model in order to precisely update the short-term estimation of wind speed sequence. In the proposed SVR–UKF approach, support vector regression is first employed to formulate a nonlinear state-space model and then unscented Kalman filter is adopted to perform dynamic state estimation recursively on wind sequence with stochastic uncertainty. The novel SVR–UKF method is compared with artificial neural networks (ANNs), SVR, autoregressive (AR) and autoregressive integrated with Kalman filter (AR-Kalman) approaches for predicting short-term wind speed sequences collected from three sites in Massachusetts, USA. The forecasting results indicate that the proposed method has much better performance in both one-step-ahead and multi-step-ahead wind speed predictions than the other approaches across all the locations
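
    A hedged sketch of the SVR building block alone, one-step-ahead prediction from lagged wind observations, without the unscented Kalman filter update stage of the paper's hybrid (synthetic series, scikit-learn assumed):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(10)
t = np.arange(2000)
wind = 8 + 3 * np.sin(2 * np.pi * t / 144) + rng.normal(0, 0.6, 2000)

# Embed the series: predict wind[t] from the previous `lags` observations,
# which is the state vector the SVR state-space model regresses on.
lags = 12
Xlag = np.column_stack([wind[i:len(wind) - lags + i] for i in range(lags)])
target = wind[lags:]

split = 1500                                  # train on the past, test on the future
svr = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
svr.fit(Xlag[:split], target[:split])
pred = svr.predict(Xlag[split:])
print("one-step RMSE:", np.sqrt(np.mean((pred - target[split:]) ** 2)).round(3))
```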

  5. Comparison of beta-binomial regression model approaches to analyze health-related quality of life data.

    Science.gov (United States)

    Najera-Zuloaga, Josu; Lee, Dae-Jin; Arostegui, Inmaculada

    2017-01-01

    Health-related quality of life has become an increasingly important indicator of health status in clinical trials and epidemiological research. Moreover, the study of the relationship of health-related quality of life with patient and disease characteristics has become one of the primary aims of many health-related quality of life studies. Health-related quality of life scores are usually assumed to be distributed as binomial random variables and are often highly skewed. The use of the beta-binomial distribution in the regression context has been proposed to model such data; however, beta-binomial regression has been performed by means of two different approaches in the literature: (i) a beta-binomial distribution with a logistic link; and (ii) hierarchical generalized linear models. None of the existing literature on the analysis of health-related quality of life survey data has compared both approaches in terms of adequacy and the interpretation of the regression parameters. This paper is motivated by the analysis of a real data application of health-related quality of life outcomes in patients with Chronic Obstructive Pulmonary Disease, where the use of the two approaches yields contradictory results in terms of the significance of covariate effects and, consequently, the interpretation of the most relevant factors in health-related quality of life. We present an explanation of the results of both methodologies through a simulation study and address the need to apply the proper approach in the analysis of health-related quality of life survey data for practitioners, providing an R package.
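
    A minimal sketch of approach (i), a beta-binomial model with a logistic link on the mean, fit by direct maximum likelihood with scipy (synthetic data; the mean/dispersion parameterization used here is one common choice):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom
from scipy.special import expit

# HRQoL-style data: a score out of m = 20 items, with one covariate x.
rng = np.random.default_rng(11)
n, m = 300, 20
x = rng.normal(size=n)
mu_true = expit(0.3 + 0.8 * x)
phi_true = 5.0
y = betabinom.rvs(m, mu_true * phi_true, (1 - mu_true) * phi_true, random_state=12)

def negloglik(theta):
    b0, b1, log_phi = theta
    mu = expit(b0 + b1 * x)                   # logistic link on the mean
    phi = np.exp(log_phi)                     # dispersion parameter
    return -betabinom.logpmf(y, m, mu * phi, (1 - mu) * phi).sum()

fit = minimize(negloglik, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
print("b0, b1, phi:", fit.x[0].round(2), fit.x[1].round(2), np.exp(fit.x[2]).round(2))
```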

  6. Global approaches to regulating electronic cigarettes

    Science.gov (United States)

    Kennedy, Ryan David; Awopegba, Ayodeji; De León, Elaine; Cohen, Joanna E

    2017-01-01

    Objectives Classify and describe the policy approaches used by countries to regulate e-cigarettes. Methods National policies regulating e-cigarettes were identified by (1) conducting web searches on Ministry of Health websites, and (2) broad web searches. The mechanisms used to regulate e-cigarettes were classified as new/amended laws, or existing laws. The policy domains identified include restrictions or prohibitions on product: sale, manufacturing, importation, distribution, use, product design including e-liquid ingredients, advertising/promotion/sponsorship, trademarks, and regulation requiring: taxation, health warning labels and child-safety standards. The classification of the policy was reviewed by a country expert. Results The search identified 68 countries that regulate e-cigarettes: 22 countries regulate e-cigarettes using existing regulations; 25 countries enacted new policies to regulate e-cigarettes; 7 countries made amendments to existing legislation; 14 countries use a combination of new/amended and existing regulation. Common policies include a minimum-age-of-purchase, indoor-use (vape-free public places) bans and marketing restrictions. Few countries are applying a tax to e-cigarettes. Conclusions A range of regulatory approaches are being applied to e-cigarettes globally; many countries regulate e-cigarettes using legislation not written for e-cigarettes. PMID:27903958

  7. Education-Based Gaps in eHealth: A Weighted Logistic Regression Approach.

    Science.gov (United States)

    Amo, Laura

    2016-10-12

    Persons with a college degree are more likely to engage in eHealth behaviors than persons without a college degree, compounding the health disadvantages of undereducated groups in the United States. However, the extent to which quality of recent eHealth experience reduces the education-based eHealth gap is unexplored. The goal of this study was to examine how eHealth information search experience moderates the relationship between college education and eHealth behaviors. Based on a nationally representative sample of adults who reported using the Internet to conduct the most recent health information search (n=1458), I evaluated eHealth search experience in relation to the likelihood of engaging in different eHealth behaviors. I examined whether Internet health information search experience reduces the eHealth behavior gaps among college-educated and noncollege-educated adults. Weighted logistic regression models were used to estimate the probability of different eHealth behaviors. College education was significantly positively related to the likelihood of 4 eHealth behaviors. In general, eHealth search experience was negatively associated with health care behaviors, health information-seeking behaviors, and user-generated or content sharing behaviors after accounting for other covariates. Whereas Internet health information search experience has narrowed the education gap in terms of likelihood of using email or Internet to communicate with a doctor or health care provider and likelihood of using a website to manage diet, weight, or health, it has widened the education gap in the instances of searching for health information for oneself, searching for health information for someone else, and downloading health information on a mobile device. The relationship between college education and eHealth behaviors is moderated by Internet health information search experience in different ways depending on the type of eHealth behavior. After controlling for college
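
    A hedged sketch of survey-weighted logistic regression with statsmodels, including an education-by-experience interaction of the kind the study tests (all variables are synthetic; passing sampling weights as freq_weights reproduces weighted point estimates but is only an approximation to full survey-design standard errors):

```python
import numpy as np
import statsmodels.api as sm

# Survey-weighted logistic regression: each respondent carries a sampling
# weight, as in nationally representative data (all names hypothetical).
rng = np.random.default_rng(15)
n = 1458
college = rng.integers(0, 2, n)
search_exp = rng.normal(0, 1, n)             # quality of recent eHealth search
w = rng.uniform(0.5, 3.0, n)                 # survey weights
p = 1 / (1 + np.exp(-(-0.5 + 0.8 * college + 0.3 * search_exp
                      - 0.4 * college * search_exp)))
used_email = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([college, search_exp, college * search_exp]))
glm = sm.GLM(used_email, X, family=sm.families.Binomial(), freq_weights=w)
res = glm.fit()
print(res.params)   # the interaction term captures the moderation effect
```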

  8. Analysis of the impact of recreational trail usage for prioritising management decisions: a regression tree approach

    Science.gov (United States)

    Tomczyk, Aleksandra; Ewertowski, Marek; White, Piran; Kasprzak, Leszek

    2016-04-01

    The dual role of many Protected Natural Areas in providing benefits for both conservation and recreation poses challenges for management. Although recreation-based damage to ecosystems can occur very quickly, restoration can take many years. The protection of conservation interests at the same as providing for recreation requires decisions to be made about how to prioritise and direct management actions. Trails are commonly used to divert visitors from the most important areas of a site, but high visitor pressure can lead to increases in trail width and a concomitant increase in soil erosion. Here we use detailed field data on condition of recreational trails in Gorce National Park, Poland, as the basis for a regression tree analysis to determine the factors influencing trail deterioration, and link specific trail impacts with environmental, use related and managerial factors. We distinguished 12 types of trails, characterised by four levels of degradation: (1) trails with an acceptable level of degradation; (2) threatened trails; (3) damaged trails; and (4) heavily damaged trails. Damaged trails were the most vulnerable of all trails and should be prioritised for appropriate conservation and restoration. We also proposed five types of monitoring of recreational trail conditions: (1) rapid inventory of negative impacts; (2) monitoring visitor numbers and variation in type of use; (3) change-oriented monitoring focusing on sections of trail which were subjected to changes in type or level of use or subjected to extreme weather events; (4) monitoring of dynamics of trail conditions; and (5) full assessment of trail conditions, to be carried out every 10-15 years. The application of the proposed framework can enhance the ability of Park managers to prioritise their trail management activities, enhancing trail conditions and visitor safety, while minimising adverse impacts on the conservation value of the ecosystem. A.M.T. was supported by the Polish Ministry of

  9. Straight line fitting and predictions: On a marginal likelihood approach to linear regression and errors-in-variables models

    Science.gov (United States)

    Christiansen, Bo

    2015-04-01

    Linear regression methods are without doubt the most used approaches to describe and predict data in the physical sciences. They are often good first order approximations and they are in general easier to apply and interpret than more advanced methods. However, even the properties of univariate regression can lead to debate over the appropriateness of various models, as witnessed by the recent discussion about climate reconstruction methods. Before linear regression is applied, important choices have to be made regarding the origins of the noise terms and regarding which of the two variables under consideration should be treated as the independent variable. These decisions are often not easy to make but they may have a considerable impact on the results. We seek to give a unified probabilistic - Bayesian with flat priors - treatment of univariate linear regression and prediction by taking, as starting point, the general errors-in-variables model (Christiansen, J. Clim., 27, 2014-2031, 2014). Other versions of linear regression can be obtained as limits of this model. We derive the likelihood of the model parameters and predictands of the general errors-in-variables model by marginalizing over the nuisance parameters. The resulting likelihood is relatively simple and easy to analyze and calculate. The well-known unidentifiability of the errors-in-variables model is manifested as the absence of a well-defined maximum in the likelihood. However, this does not mean that probabilistic inference cannot be made; the marginal likelihoods of model parameters and the predictands have, in general, well-defined maxima. We also include a probabilistic version of classical calibration and show how it is related to the errors-in-variables model. The results are illustrated by an example from the coupling between the lower stratosphere and the troposphere in the Northern Hemisphere winter.
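
    A minimal numerical illustration of why the choice of independent variable matters when both variables carry noise, under an assumed equal-error-variance setting: ordinary regression of y on x attenuates the slope, the inverse regression inflates it, and an errors-in-variables (Deming) fit recovers it. This is a sketch of the general issue, not the paper's marginal-likelihood machinery.

        import numpy as np

        rng = np.random.default_rng(1)
        x_true = rng.normal(0, 1, 500)
        x = x_true + rng.normal(0, 0.5, 500)      # predictor observed with noise
        y = 2.0 * x_true + rng.normal(0, 0.5, 500)

        sxx, syy, sxy = np.var(x), np.var(y), np.cov(x, y, ddof=0)[0, 1]
        b_yx = sxy / sxx                          # OLS y on x: attenuated (~1.6)
        b_xy_inv = syy / sxy                      # inverse of OLS x on y: inflated (~2.1)
        # Deming regression with assumed equal error variances (delta = 1)
        b_deming = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
        print(b_yx, b_xy_inv, b_deming)           # b_deming close to the true slope 2.0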

  10. Semiparametric approach for non-monotone missing covariates in a parametric regression model

    KAUST Repository

    Sinha, Samiran

    2014-02-26

    Missing covariate data often arise in biomedical studies, and analysis of such data that ignores subjects with incomplete information may lead to inefficient and possibly biased estimates. A great deal of attention has been paid to handling a single missing covariate or a monotone pattern of missing data when the missingness mechanism is missing at random. In this article, we propose a semiparametric method for handling non-monotone patterns of missing data. The proposed method relies on the assumption that the missingness mechanism of a variable does not depend on the missing variable itself but may depend on the other missing variables. This mechanism is somewhat less general than the completely non-ignorable mechanism but is sometimes more flexible than the missing at random mechanism, where the missingness mechanism is allowed to depend only on the completely observed variables. The proposed approach is robust to misspecification of the distribution of the missing covariates, and the proposed mechanism helps to nullify (or reduce) the problems due to non-identifiability that result from the non-ignorable missingness mechanism. The asymptotic properties of the proposed estimator are derived. Finite sample performance is assessed through simulation studies. Finally, for the purpose of illustration we analyze an endometrial cancer dataset and a hip fracture dataset.

  11. Regressive Prediction Approach to Vertical Handover in Fourth Generation Wireless Networks

    Directory of Open Access Journals (Sweden)

    Abubakar M. Miyim

    2014-11-01

    Full Text Available The ever-increasing demand for deployment of wireless access networks has forced wireless mobile devices to face many challenges in choosing the best suitable network from a set of available access networks. Some of the weighty issues in 4G wireless networks are the speed and seamlessness of the handover process. This paper therefore proposes a handover technique based on movement prediction in wireless mobile (WiMAX and LTE-A) environments. The technique enables the system to predict signal quality between the UE and Radio Base Stations (RBS)/Access Points (APs) in two different networks. Prediction is achieved by employing the Markov Decision Process Model (MDPM), where the movement of the UE is dynamically estimated and averaged to keep track of the signal strength of mobile users. With the help of the prediction, layer-3 handover activities are able to occur prior to layer-2 handover, and therefore total handover latency can be reduced. The performance of various handover approaches influenced by different metrics (mobility velocities) was evaluated. The results presented demonstrate the good accuracy the proposed method achieves in predicting the next signal level and the resulting reduction in total handover latency.

  12. Two-step superresolution approach for surveillance face image through radial basis function-partial least squares regression and locality-induced sparse representation

    Science.gov (United States)

    Jiang, Junjun; Hu, Ruimin; Han, Zhen; Wang, Zhongyuan; Chen, Jun

    2013-10-01

    Face superresolution (SR), or face hallucination, refers to the technique of generating a high-resolution (HR) face image from a low-resolution (LR) one with the help of a set of training examples. It aims at transcending the limitations of electronic imaging systems. Applications of face SR include video surveillance, in which the individual of interest is often far from the cameras. A two-step method is proposed to infer a high-quality and HR face image from a low-quality and LR observation. First, we establish the nonlinear relationship between LR face images and HR ones, according to radial basis function and partial least squares (RBF-PLS) regression, to transform the LR face into the global face space. Then, a locality-induced sparse representation (LiSR) approach is presented to enhance the local facial details once all the global faces for each LR training face are constructed. A comparison with some state-of-the-art SR methods shows the superiority of the proposed two-step approach, RBF-PLS global face regression followed by LiSR-based local patch reconstruction. Experiments also demonstrate its effectiveness under both simulated and real conditions.
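
    A loose sketch of the RBF-PLS step under strong simplifying assumptions: low-resolution inputs are expanded through an RBF kernel against a set of training centers, and a PLS regression maps them to the high-resolution space. The data, kernel width and component count are all placeholders, not values from the paper.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics.pairwise import rbf_kernel

        rng = np.random.default_rng(2)
        X_lr = rng.normal(size=(200, 64))            # stand-in for vectorized LR faces
        W = rng.normal(size=(64, 256))
        X_hr = np.tanh(X_lr @ W)                     # stand-in nonlinear LR-to-HR mapping

        centers = X_lr[:50]                          # RBF centers from the training set
        Phi = rbf_kernel(X_lr, centers, gamma=0.01)  # nonlinear feature expansion
        pls = PLSRegression(n_components=20).fit(Phi, X_hr)

        x_new = rng.normal(size=(1, 64))
        hr_est = pls.predict(rbf_kernel(x_new, centers, gamma=0.01))
        print(hr_est.shape)                          # (1, 256): estimated HR vector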

  13. The Public-Private Sector Wage Gap in Zambia in the 1990s: A Quantile Regression Approach

    DEFF Research Database (Denmark)

    Nielsen, Helena Skyt; Rosholm, Michael

    2001-01-01

    We investigate the determinants of wages in Zambia and, based on the quantile regression approach, we analyze how their effects differ at different points in the wage distribution and over time. We use three cross-sections of Zambian household data from the early nineties, which was a period of economic transition, because items such as privatization and deregulation were on the political agenda. The focus is placed on the public-private sector wage gap, and the results show that this gap was relatively favorable for the low-skilled and less favorable for the high-skilled. This picture was further...
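
    For readers unfamiliar with the technique, a minimal quantile-regression sketch in the spirit of the study, using simulated data in which the public-sector premium varies with skill; the variable names are hypothetical and the data are not the Zambian household surveys.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 1000
        df = pd.DataFrame({"public": rng.integers(0, 2, n),
                           "school": rng.uniform(0, 16, n)})
        # simulate a wage with a public-sector premium that shrinks with schooling
        df["logwage"] = (1.0 + 0.08 * df["school"]
                         + 0.3 * df["public"] * (16 - df["school"]) / 16
                         + rng.normal(0, 0.4, n))

        for q in (0.1, 0.5, 0.9):
            fit = smf.quantreg("logwage ~ public + school", df).fit(q=q)
            print(q, round(fit.params["public"], 3))  # gap differs across quantiles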

  14. A Two-Stage Penalized Logistic Regression Approach to Case-Control Genome-Wide Association Studies

    Directory of Open Access Journals (Sweden)

    Jingyuan Zhao

    2012-01-01

    Full Text Available We propose a two-stage penalized logistic regression approach to case-control genome-wide association studies. This approach consists of a screening stage and a selection stage. In the screening stage, main-effect and interaction-effect features are screened by using L1-penalized logistic likelihoods. In the selection stage, the retained features are ranked by the logistic likelihood with the smoothly clipped absolute deviation (SCAD) penalty (Fan and Li, 2001) and Jeffrey's prior penalty (Firth, 1993), a sequence of nested candidate models is formed, and the models are assessed by a family of extended Bayesian information criteria (J. Chen and Z. Chen, 2008). The proposed approach is applied to the analysis of the prostate cancer data of the Cancer Genetic Markers of Susceptibility (CGEMS) project of the National Cancer Institute, USA. Simulation studies are carried out to compare the approach with the pair-wise multiple testing approach (Marchini et al., 2005) and the LASSO-patternsearch algorithm (Shi et al., 2007).
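
    A toy version of the screening stage only, assuming simulated genotype data; sklearn provides no SCAD penalty, so this sketch stops after L1-penalized screening and does not reproduce the selection stage or the extended BIC ranking.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n, p = 400, 500
        X = rng.integers(0, 3, size=(n, p)).astype(float)   # SNP genotypes coded 0/1/2
        # two main effects plus one interaction drive case status
        eta = 1.2 * X[:, 3] - 0.9 * X[:, 7] + 0.8 * X[:, 3] * X[:, 7] - 0.5
        y = rng.random(n) < 1 / (1 + np.exp(-eta))

        screen = LogisticRegression(penalty="l1", solver="liblinear", C=0.05).fit(X, y)
        kept = np.flatnonzero(screen.coef_[0])
        print("retained features:", kept)   # should include columns 3 and 7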

  15. Predictability of extreme weather events for NE U.S.: improvement of the numerical prediction using a Bayesian regression approach

    Science.gov (United States)

    Yang, J.; Astitha, M.; Anagnostou, E. N.; Hartman, B.; Kallos, G. B.

    2015-12-01

    Weather prediction accuracy has become very important for the Northeast U.S. given the devastating effects of extreme weather events in recent years. Weather forecasting systems are used to build strategies that prevent catastrophic losses for human lives and the environment. Concurrently, weather forecast tools and techniques have evolved with improved forecast skill as numerical prediction techniques are strengthened by increased super-computing resources. In this study, we examine the combination of two state-of-the-science atmospheric models (WRF and RAMS/ICLAMS) by utilizing a Bayesian regression approach to improve the prediction of extreme weather events for the NE U.S. The basic concept behind the Bayesian regression approach is to take advantage of the strengths of the two atmospheric modeling systems and, similar to the multi-model ensemble approach, limit their weaknesses, which are related to systematic and random errors in the numerical prediction of physical processes. The first part of this study is focused on retrospective simulations of seventeen storms that affected the region in the period 2004-2013. Optimal variances are estimated by minimizing the root mean square error and are applied to out-of-sample weather events. The applicability and usefulness of this approach are demonstrated by conducting an error analysis based on in-situ observations from meteorological stations of the National Weather Service (NWS) for wind speed and wind direction, and on NCEP Stage IV multi-sensor mosaicked radar data for precipitation. The preliminary results indicate a significant improvement in the statistical metrics of the modeled-observed pairs for meteorological variables using various combinations of the sixteen events as predictors of the seventeenth. This presentation will illustrate the implemented methodology and the obtained results for wind speed, wind direction and precipitation, as well as set the research steps that will be
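
    A stripped-down analogue of the combination step: given forecasts from two models on training events, solve for the weights that minimize RMSE against observations and apply them out of sample. The data and the plain least-squares weighting are illustrative; they are not the study's Bayesian variance estimation.

        import numpy as np

        rng = np.random.default_rng(5)
        obs = rng.normal(10, 3, 200)                 # observed wind speed (synthetic)
        wrf = obs + rng.normal(1.0, 2.0, 200)        # model 1: bias plus noise
        rams = obs + rng.normal(-0.5, 1.5, 200)      # model 2: smaller bias and noise

        A = np.column_stack([np.ones_like(obs), wrf, rams])
        w, *_ = np.linalg.lstsq(A[:150], obs[:150], rcond=None)   # fit on 150 events
        blend = A[150:] @ w                                       # out-of-sample blend
        rmse = lambda e: np.sqrt(np.mean(e ** 2))
        print(rmse(wrf[150:] - obs[150:]), rmse(rams[150:] - obs[150:]),
              rmse(blend - obs[150:]))               # blend should beat both models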

  16. Cost-of-illness studies based on massive data: a prevalence-based, top-down regression approach.

    Science.gov (United States)

    Stollenwerk, Björn; Welchowski, Thomas; Vogl, Matthias; Stock, Stephanie

    2016-04-01

    Despite the increasing availability of routine data, no analysis method has yet been presented for cost-of-illness (COI) studies based on massive data. We aim, first, to present such a method and, second, to assess the relevance of the associated gain in numerical efficiency. We propose a prevalence-based, top-down regression approach consisting of five steps: aggregating the data; fitting a generalized additive model (GAM); predicting costs via the fitted GAM; comparing predicted costs between prevalent and non-prevalent subjects; and quantifying the stochastic uncertainty via error propagation. To demonstrate the method, it was applied, in the context of chronic lung disease, to aggregated German sickness fund data (from 1999) covering over 7.3 million insured. To assess the gain in numerical efficiency, the computational time of the innovative approach was compared with that of corresponding GAMs applied to simulated individual-level data. Furthermore, the probability of model failure was modeled via logistic regression. Applying the innovative method was reasonably fast (19 min). In contrast, for patient-level data, computational time increased disproportionately with sample size. Furthermore, using patient-level data was accompanied by a substantial risk of model failure (about 80% for 6 million subjects). The gain in computational efficiency of the innovative COI method seems to be of practical relevance. Furthermore, it may yield more precise cost estimates.
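
    A schematic of the regression step on aggregated cells, with a spline term standing in for the GAM smoother and cell counts as weights; the cost structure, variable names and spline degrees of freedom are invented for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(6)
        cells = pd.DataFrame({"age": np.tile(np.arange(20, 90), 2),
                              "prevalent": np.repeat([0, 1], 70)})
        cells["n"] = rng.integers(1000, 50000, len(cells))      # insured per cell
        # synthetic mean cost per cell: rises with age, jumps with prevalence
        cells["mean_cost"] = (500 + 12 * cells["age"] + 900 * cells["prevalent"]
                              + rng.normal(0, 50, len(cells)))

        fit = smf.wls("mean_cost ~ bs(age, df=5) + prevalent", data=cells,
                      weights=cells["n"]).fit()
        print(fit.params["prevalent"])   # excess cost attributable to prevalence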

  17. Validity of the reduced-sample insulin modified frequently-sampled intravenous glucose tolerance test using the nonlinear regression approach.

    Science.gov (United States)

    Sumner, Anne E; Luercio, Marcella F; Frempong, Barbara A; Ricks, Madia; Sen, Sabyasachi; Kushner, Harvey; Tulloch-Reid, Marshall K

    2009-02-01

    The disposition index, the product of the insulin sensitivity index (S(I)) and the acute insulin response to glucose, is linked in African Americans to chromosome 11q. This link was determined with S(I) calculated with the nonlinear regression approach to the minimal model and data from the reduced-sample insulin-modified frequently-sampled intravenous glucose tolerance test (Reduced-Sample-IM-FSIGT). However, the application of the nonlinear regression approach to calculate S(I) using data from the Reduced-Sample-IM-FSIGT has been challenged as being not only inaccurate but also having a high failure rate in insulin-resistant subjects. Our goal was to determine the accuracy and failure rate of the Reduced-Sample-IM-FSIGT using the nonlinear regression approach to the minimal model. With S(I) from the Full-Sample-IM-FSIGT considered the standard and using the nonlinear regression approach to the minimal model, we compared the agreement between S(I) from the Full- and Reduced-Sample-IM-FSIGT protocols. One hundred African Americans (body mass index, 31.3 +/- 7.6 kg/m(2) [mean +/- SD]; range, 19.0-56.9 kg/m(2)) had FSIGTs. Glucose (0.3 g/kg) was given at baseline. Insulin was infused from 20 to 25 minutes (total insulin dose, 0.02 U/kg). For the Full-Sample-IM-FSIGT, S(I) was calculated based on the glucose and insulin samples taken at -1, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 14, 16, 19, 22, 23, 24, 25, 27, 30, 40, 50, 60, 70, 80, 90, 100, 120, 150, and 180 minutes. For the Reduced-Sample-IM-FSIGT, S(I) was calculated based on a reduced subset of these time points. Agreement was determined by Spearman correlation, concordance, and the Bland-Altman method. In addition, for both protocols, the population was divided into tertiles of S(I). Insulin resistance was defined by the lowest tertile of S(I) from the Full-Sample-IM-FSIGT. The distribution of subjects across tertiles was compared by rank order and kappa statistic. We found that the rate of failure of resolution of S(I) by

  18. Understanding electron magnetic circular dichroism in a transition potential approach

    Science.gov (United States)

    Barthel, J.; Mayer, J.; Rusz, J.; Ho, P.-L.; Zhong, X. Y.; Lentzen, M.; Dunin-Borkowski, R. E.; Urban, K. W.; Brown, H. G.; Findlay, S. D.; Allen, L. J.

    2018-04-01

    This paper introduces an approach based on transition potentials for inelastic scattering to understand the underlying physics of electron magnetic circular dichroism (EMCD). The transition potentials are sufficiently localized to permit atomic-scale EMCD. Two-beam and three-beam systematic row cases are discussed in detail in terms of transition potentials for conventional transmission electron microscopy, and the basic symmetries which arise in the three-beam case are confirmed experimentally. Atomic-scale EMCD in scanning transmission electron microscopy (STEM), using both a standard STEM probe and vortex beams, is discussed.

  19. Grid-based electronic structure calculations: The tensor decomposition approach

    Energy Technology Data Exchange (ETDEWEB)

    Rakhuba, M.V., E-mail: rakhuba.m@gmail.com [Skolkovo Institute of Science and Technology, Novaya St. 100, 143025 Skolkovo, Moscow Region (Russian Federation); Oseledets, I.V., E-mail: i.oseledets@skoltech.ru [Skolkovo Institute of Science and Technology, Novaya St. 100, 143025 Skolkovo, Moscow Region (Russian Federation); Institute of Numerical Mathematics, Russian Academy of Sciences, Gubkina St. 8, 119333 Moscow (Russian Federation)

    2016-05-01

    We present a fully grid-based approach for solving Hartree–Fock and all-electron Kohn–Sham equations based on low-rank approximation of three-dimensional electron orbitals. Due to the low-rank structure, the total complexity of the algorithm scales linearly with the one-dimensional grid size. Linear complexity allows for the usage of fine grids, e.g. 8192³, and thus a cheap extrapolation procedure. We test the proposed approach on closed-shell atoms up to argon, several molecules, and clusters of hydrogen atoms. All tests show systematic convergence with the required accuracy.

  20. An Introduction to the Hybrid Approach of Neural Networks and the Linear Regression Model: An Illustration in the Hedonic Pricing Model of Building Costs

    OpenAIRE

    浅野, 美代子; マーコ, ユー K.W.

    2007-01-01

    This paper introduces the hybrid approach of neural networks and the linear regression model proposed by Asano and Tsubaki (2003). Neural networks are often credited with superiority in data consistency, whereas the linear regression model provides a simple interpretation of the data, enabling researchers to verify their hypotheses. The hybrid approach aims at combining the strengths of these two well-established statistical methods. A step-by-step procedure for performing the hybrid approach is presented.

  1. Two-process approach to electron beam welding control

    International Nuclear Information System (INIS)

    Lastovirya, V.N.

    1987-01-01

    The analysis and synthesis of multi-dimensional welding control systems, which require the use of computers, should be conducted in the time domain. From the point of view of general control theory, two approaches - one-process and two-process - are possible for electron beam welding. In the case of the two-process approach, the subprocesses of heat source formation and direct metal melting are separated. The two-process approach leads to a two-profile control system and provides complete controllability of electron beam welding within the framework of systems with lumped as well as distributed parameters. The choice of approach for a given problem is determined, first of all, by the degree of stability of the heat source during welding.

  2. Toward Environmentally Robust Organic Electronics: Approaches and Applications.

    Science.gov (United States)

    Lee, Eun Kwang; Lee, Moo Yeol; Park, Cheol Hee; Lee, Hae Rang; Oh, Joon Hak

    2017-11-01

    Recent interest in flexible electronics has led to a paradigm shift in consumer electronics, and the emergent development of stretchable and wearable electronics is opening a new spectrum of ubiquitous applications for electronics. Organic electronic materials, such as π-conjugated small molecules and polymers, are highly suitable for use in low-cost wearable electronic devices, and their charge-carrier mobilities have now exceeded those of amorphous silicon. However, their commercialization is minimal, mainly because of weaknesses in terms of operational stability, long-term stability under ambient conditions, and chemical stability related to fabrication processes. Recently, however, many attempts have been made to overcome such instabilities of organic electronic materials. Here, an overview is provided of the strategies developed for environmentally robust organic electronics to overcome the detrimental effects of various critical factors such as oxygen, water, chemicals, heat, and light. Additionally, molecular design approaches to π-conjugated small molecules and polymers that are highly stable under ambient and harsh conditions are explored; such materials will circumvent the need for encapsulation and provide a greater degree of freedom using simple solution-based device-fabrication techniques. Applications that are made possible through these strategies are highlighted. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Pseudogap in the Eliashberg approach based on electron-phonon and electron-electron-phonon interaction

    Energy Technology Data Exchange (ETDEWEB)

    Szczesniak, R. [Institute of Physics, Czestochowa University of Technology (Poland); Institute of Physics, Jan Dlugosz University in Czestochowa (Poland); Durajski, A.P.; Duda, A.M. [Institute of Physics, Czestochowa University of Technology (Poland)

    2017-04-15

    The properties of the superconducting and the anomalous normal state were described by using the Eliashberg method. The pairing mechanism was reproduced with the help of the Hamiltonian, which models the electron-phonon and the electron-electron-phonon (EEPh) interaction. The set of the Eliashberg equations, which determines the order parameter function (φ), the wave function renormalization factor (Z), and the energy shift function (χ), was derived. It was proven that for sufficiently large values of the EEPh potential, the doping dependence of the order parameter (φ/Z) follows a course analogous to that observed experimentally in cuprates. The energy gap in the electron density of states is induced by Z and χ - the contribution from φ is negligible. The electron density of states possesses the characteristic asymmetric form and the pseudogap is observed above the critical temperature. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  4. Spatial prediction of landslides using a hybrid machine learning approach based on Random Subspace and Classification and Regression Trees

    Science.gov (United States)

    Pham, Binh Thai; Prakash, Indra; Tien Bui, Dieu

    2018-02-01

    A hybrid machine learning approach of Random Subspace (RSS) and Classification And Regression Trees (CART) is proposed to develop a model named RSSCART for spatial prediction of landslides. This model is a combination of the RSS method, which is known as an efficient ensemble technique, and CART, which is a state-of-the-art classifier. The Luc Yen district of Yen Bai province, a prominent landslide-prone area of Viet Nam, was selected for the model development. Performance of the RSSCART model was evaluated through the Receiver Operating Characteristic (ROC) curve, statistical analysis methods, and the Chi Square test. Results were compared with other benchmark landslide models, namely Support Vector Machines (SVM), single CART, Naïve Bayes Trees (NBT), and Logistic Regression (LR). In the development of the model, ten important landslide-affecting factors related to geomorphology, geology and geo-environment were considered, namely slope angle, elevation, slope aspect, curvature, lithology, distance to faults, distance to rivers, distance to roads, and rainfall. Performance of the RSSCART model (AUC = 0.841) was the best compared with the other popular landslide models, namely SVM (0.835), single CART (0.822), NBT (0.821), and LR (0.723). These results indicate that RSSCART is a promising method for spatial landslide prediction.
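
    A compact stand-in for the RSS-CART combination: bagging CART classifiers over random feature subspaces (feature bootstrap without sample bootstrap), evaluated by AUC on synthetic data; the factor names and model settings are placeholders, not the study's configuration.

        import numpy as np
        from sklearn.ensemble import BaggingClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        X = rng.normal(size=(1000, 10))              # ten stand-in conditioning factors
        y = (X[:, 0] + 0.8 * X[:, 3] - X[:, 5] + rng.normal(0, 1, 1000)) > 0

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        # random subspace: each tree sees half the features, samples are not resampled
        rsscart = BaggingClassifier(DecisionTreeClassifier(max_depth=5),
                                    n_estimators=100, max_features=0.5,
                                    bootstrap=False, bootstrap_features=True,
                                    random_state=0).fit(X_tr, y_tr)
        print(roc_auc_score(y_te, rsscart.predict_proba(X_te)[:, 1]))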

  5. Comparison of exact, Efron and Breslow parameter approach method on hazard ratio and stratified Cox regression model

    Science.gov (United States)

    Fatekurohman, Mohamat; Nurmala, Nita; Anggraeni, Dian

    2018-04-01

    The lungs are among the most important organs of the respiratory system. Problems related to disorders of the lungs are various, e.g. pneumonia, emphysema, tuberculosis and lung cancer. Of all these problems, lung cancer is the most harmful. With this in mind, this research applies survival analysis to the factors affecting the endurance (survival) of lung cancer patients, using a comparison of the exact, Efron and Breslow parameter approximation methods for the hazard ratio in a stratified Cox regression model. The data applied are based on the medical records of lung cancer patients in the Jember Paru-paru hospital in 2016, East Java, Indonesia. The factors affecting the endurance of the lung cancer patients can be classified into several criteria, i.e. sex, age, hemoglobin, leukocytes, erythrocytes, sedimentation rate of blood, therapy status, general condition, and body weight. The results show that the exact method of the stratified Cox regression model performs better than the others. Moreover, the endurance of the patients is affected by their age and general condition.
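
    A hedged sketch of the tie-handling comparison using statsmodels' PHReg, which implements the Breslow and Efron approximations for a stratified Cox model (the exact method is not shown); the covariates and survival times below are simulated, not the hospital records.

        import numpy as np
        from statsmodels.duration.hazard_regression import PHReg

        rng = np.random.default_rng(8)
        n = 300
        age = rng.uniform(40, 80, n)
        condition = rng.integers(0, 2, n)            # general condition indicator
        # survival times rounded up to whole months, so ties occur by construction
        scale = 20 / np.exp(0.03 * (age - 60) + 0.5 * condition)
        time = np.ceil(rng.exponential(scale))
        status = (rng.random(n) < 0.8).astype(int)   # 1 = event observed, 0 = censored
        strata = rng.integers(0, 2, n)               # e.g. therapy status as stratum

        X = np.column_stack([age, condition])
        for ties in ("breslow", "efron"):
            fit = PHReg(time, X, status=status, strata=strata, ties=ties).fit()
            print(ties, np.round(np.exp(fit.params), 3))   # hazard ratios per method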

  6. Regression-based approach for testing the association between multi-region haplotype configuration and complex trait

    Directory of Open Access Journals (Sweden)

    Zhao Hongbo

    2009-09-01

    Full Text Available Abstract Background It is quite common that the genetic architecture of complex traits involves many genes and their interactions. Therefore, dealing with multiple unlinked genomic regions simultaneously is desirable. Results In this paper we develop a regression-based approach to assess the interactions of haplotypes that belong to different unlinked regions, and we use score statistics to test the null hypothesis of no genetic association. Additionally, multiple marker combinations at each unlinked region are considered. The multiple tests are handled via the minP approach. The P value of the "best" multi-region multi-marker configuration is corrected via Monte-Carlo simulations. Through simulation studies, we assess the performance of the proposed approach and demonstrate its validity and power in testing for haplotype interaction association. Conclusion Our simulations showed that, for a binary trait without covariates, our proposed methods prove to be as powerful as, and even more powerful than, htr and hapcc, which are part of the FAMHAP program. Additionally, our model can be applied to a wider variety of traits and allows adjustment for other covariates. To test its validity, our methods are applied to analyze the association between four unlinked candidate genes and pig meat quality.

  7. Investigating the complex relationship between in situ Southern Ocean pCO2 and its ocean physics and biogeochemical drivers using a nonparametric regression approach

    CSIR Research Space (South Africa)

    Pretorius, W

    2014-01-01

    Full Text Available The developed nonparametric regression model captures the relationship more accurately, in terms of MSE, RMSE and MAE, than a standard parametric approach (multiple linear regression). These results provide a platform for using the developed nonparametric regression model, based on in situ measurements, to predict pCO2.

  8. Assessment of perfusion by dynamic contrast-enhanced imaging using a deconvolution approach based on regression and singular value decomposition.

    Science.gov (United States)

    Koh, T S; Wu, X Y; Cheong, L H; Lim, C C T

    2004-12-01

    The assessment of tissue perfusion by dynamic contrast-enhanced (DCE) imaging involves a deconvolution process. For the analysis of DCE imaging data, we implemented a regression approach to select appropriate regularization parameters for deconvolution using the standard and generalized singular value decomposition methods. Monte Carlo simulation experiments were carried out to study the performance of the approach and to compare it with other existing methods used for deconvolution analysis of DCE imaging data. The present approach is found to be robust and reliable at the levels of noise commonly encountered in DCE imaging, and for different models of the underlying tissue vasculature. The advantages of the present method, as compared with previous methods, include its computational efficiency, its ability to achieve adequate regularization and thereby reproduce less noisy solutions, and the fact that it does not require prior knowledge of the noise condition. The proposed method is applied to actual patient study cases with brain tumors and ischemic stroke, to illustrate its applicability as a clinical tool for diagnosis and assessment of treatment response.
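
    To make the deconvolution idea concrete, a bare-bones truncated-SVD example on a simulated tissue curve; the paper's regression-based choice of the regularization parameter is not reproduced, and the truncation threshold here is picked by hand.

        import numpy as np

        t = np.arange(0, 60, 2.0)                     # time in seconds, dt = 2 s
        aif = (t / 8.0) * np.exp(-t / 8.0)            # gamma-variate-like input curve
        r_true = np.exp(-t / 12.0)                    # assumed tissue residue function
        # lower-triangular convolution matrix: A[i, j] = aif[i - j] * dt
        A = np.array([[aif[i - j] * 2.0 if i >= j else 0.0
                       for j in range(len(t))] for i in range(len(t))])
        c = A @ r_true + np.random.default_rng(9).normal(0, 0.01, len(t))

        U, s, Vt = np.linalg.svd(A)
        k = int(np.sum(s > 0.1 * s[0]))               # keep dominant singular values only
        r_est = Vt[:k].T @ ((U[:, :k].T @ c) / s[:k])
        print(np.round(r_est[:5], 2), np.round(r_true[:5], 2))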

  9. Electronic structure of a striped nickelate studied by the exact exchange for correlated electrons (EECE) approach

    KAUST Repository

    Schwingenschlögl, Udo

    2009-12-01

    Motivated by a RIXS study of Wakimoto et al. (Phys. Rev. Lett., 102 (2009) 157001), we use density functional theory to analyze the magnetic order in the nickelate La5/3Sr1/3NiO4 and the details of its crystal and electronic structure. We compare the generalized gradient approximation to the hybrid functional approach of exact exchange for correlated electrons (EECE). In contrast to the former, the latter reproduces the insulating state of the compound and the midgap states. The EECE approach, in general, appears to be appropriate for describing stripe phases in systems with orbital degrees of freedom. Copyright © EPLA, 2009.

  10. Neo-Institutional Approach to the Study of Electronic Government

    Directory of Open Access Journals (Sweden)

    Yan I. Vaslavskiy

    2016-01-01

    Full Text Available The article is devoted to the neo-institutional approach as a methodological basis for the study of electronic government. The article substantiates the choice of the neo-institutional approach for studying the processes of implementation of information and communication technologies in the activity of state institutions, analyzes the differences between neo-institutionalism and the traditional institutional approach, and considers the features of the different strands of neo-institutionalism, namely sociological, historical and rational choice theory. Attention is paid to the reasons for the renewed interest in political institutions in political science. The article emphasizes the importance of considering electronic government as an institution, and the conditions for its implementation in the Russian political system as the institutional environment. The authors pay special attention to the sociological variety of neo-institutionalism, used, in addition to political science, in the sociology of organizations. The article substantiates the value of using sociological institutionalism to explore electronic government, based on a comparative analysis of e-government projects in Russia and abroad, and explores its heuristic capabilities. It examines the impact of the system of norms and values of the institutional environment on the processes of formation and development of electronic government in Russia. The research capacity of this theory is due to the fact that it allows us to trace the reasons for the copying and replication of inefficient practices and organizational and management schemes, and to identify the factors impeding the state's innovative use of electronic interaction technologies. It is emphasized that the theory of institutional isomorphism is useful in the sphere of implementation of electronic technologies, in which pluralism, horizontal managerial communication and inter-agency coordination play a key role.

  11. River flow prediction using hybrid models of support vector regression with the wavelet transform, singular spectrum analysis and chaotic approach

    Science.gov (United States)

    Baydaroğlu, Özlem; Koçak, Kasım; Duran, Kemal

    2018-06-01

    Prediction of the amount of water that will enter reservoirs in the following month is of vital importance, especially for semi-arid countries like Turkey. Climate projections emphasize that water scarcity will be one of the serious problems of the future. This study presents a methodology for predicting river flow for the subsequent month, based on the time series of observed monthly river flow, with hybrid models of support vector regression (SVR). Monthly river flow observed for the Kızılırmak River in Turkey over the period 1940-2012 has been used for training the method, which has then been applied for predictions over a period of 3 years. SVR is a specific implementation of support vector machines (SVMs), which transforms the observed input data time series into a high-dimensional feature space (input matrix) by way of a kernel function and performs a linear regression in this space. SVR requires a special input matrix. The input matrix was produced by wavelet transforms (WT), singular spectrum analysis (SSA), and a chaotic approach (CA) applied to the input time series. WT decomposes the original time series into a series of wavelet components, and SSA decomposes the time series into a trend, an oscillatory component and a noise component by singular value decomposition. CA uses a phase space formed by trajectories, which represent the dynamics producing the time series. These three methods for producing the input matrix for the SVR proved successful, with the SVR-WT combination resulting in the highest coefficient of determination and the lowest mean absolute error.
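
    As a small example of the chaotic-approach input construction, the sketch below builds a time-delay embedding of a simulated monthly flow series and trains an SVR one-step predictor; the embedding dimension, delay and SVR settings are arbitrary illustrative choices, not the study's.

        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(10)
        n = 600
        # synthetic monthly flow: seasonal cycle plus noise
        flow = 50 + 20 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 3, n)

        m, tau = 4, 1                                  # embedding dimension, delay
        X = np.column_stack([flow[i:n - (m - i) * tau] for i in range(m)])
        y = flow[m * tau:]                             # next-month target

        svr = SVR(kernel="rbf", C=100, epsilon=1.0).fit(X[:-36], y[:-36])
        pred = svr.predict(X[-36:])                    # hold out the last 3 years
        print(np.sqrt(np.mean((pred - y[-36:]) ** 2)))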

  12. A new non-invasive approach based on polyhexamethylene biguanide increases the regression rate of HPV infection

    Directory of Open Access Journals (Sweden)

    Gentile Antonio

    2012-09-01

    Full Text Available Abstract Background HPV infection is a worldwide problem strictly linked to the development of cervical cancer. Persistence of the infection is one of the main factors responsible for invasive progression, and women diagnosed with intraepithelial squamous lesions are referred for further assessment and surgical treatments, which are prone to complications. Despite this, there are several reports on spontaneous regression of the infection. This study was carried out to evaluate the effectiveness of a long-term polyhexamethylene biguanide (PHMB)-based local treatment in improving viral clearance, reducing the time of exposure to the infection and avoiding the complications associated with the invasive treatments currently available. Method 100 women diagnosed with HPV infection were randomly assigned to receive six months of treatment with a PHMB-based gynecological solution (Monogin®, Lo.Li. Pharma, Rome, Italy) or to remain untreated for the same period of time. Results A greater number of patients who received the treatment were cleared of the infection at the two time points of the study (three and six months) compared with the control group. A significant difference in the regression rate (90% Monogin group vs 70% control group) was observed at the end of the study, highlighting the time-dependent ability of PHMB to interact with the infection progression. Conclusions Topical treatment with PHMB is a safe and promising preliminary approach for patients with detected HPV infection, increasing the chance of clearance and avoiding the use of invasive treatments when not strictly necessary. Trial registration ClinicalTrials.gov Identifier NCT01571141

  13. Orbital approach to the electronic structure of solids

    CERN Document Server

    Canadell, Enric; Iung, Christophe

    2012-01-01

    This book provides an intuitive yet sound understanding of how the structure and properties of solids may be related. The natural link is provided by the band theory approach to the electronic structure of solids. The chemically insightful concept of orbital interaction and the essential machinery of band theory are used throughout the book to build links between the crystal and electronic structure of periodic systems. In such a way, it is shown how important tools for understanding properties of solids, like the density of states, the Fermi surface, etc., can be qualitatively sketched and used to ei

  14. Path integral approach to electron scattering in classical electromagnetic potential

    International Nuclear Information System (INIS)

    Xu Chuang; Feng Feng; Li Ying-Jun

    2016-01-01

    As is well known, electron scattering in a classical electromagnetic potential is one of the most widespread applications of quantum theory. Nevertheless, many discussions of electron scattering are based upon the single-particle Schrödinger or Dirac equation of quantum mechanics rather than the methods of quantum field theory. In this paper, by using the path integral approach of quantum field theory, we perturbatively evaluate the scattering amplitude up to second order for electron scattering by a classical electromagnetic potential. The results we derive are convenient to apply to all sorts of potential forms. Furthermore, by means of the obtained results, we give explicit calculations for a one-dimensional electric potential. (paper)

  15. Cyber Physical Systems Approach to Power Electronics Education

    Directory of Open Access Journals (Sweden)

    Marko Vekić

    2012-12-01

    Full Text Available This paper proposes a cyber-physical systems (CPS) approach to power electronics (PE) education, in which all aspects of PE technology, from circuit topology to the implementation of real-time control code on a microprocessor, are dealt with as an inseparable whole, and only the system complexity is increased during the course of instruction. This approach is now made practical thanks to affordable and unrestricted access to high-power PE laboratory infrastructure ("PE laboratory in a box") in the form of high-fidelity digital PE emulators with 1 μs calculation time step and latency.

  16. Quantitative vs. qualitative approaches to the electronic structure of solids

    International Nuclear Information System (INIS)

    Oliva, J.M.; Llunell, Miquel; Alemany, Pere; Canadell, Enric

    2003-01-01

    The usefulness of qualitative and quantitative theoretical approaches in solid state chemistry is discussed by considering three different types of problems: (a) the distribution of boron and carbon atoms in MB2C2 (M = Ca, La, etc.) phases, (b) the band structure and Fermi surface of low-dimensional transition metal oxides and bronzes, and (c) the correlation between the crystal and electronic structure of the ternary nitride Ca2AuN.

  17. An effective approach for choosing an electronic health record.

    Science.gov (United States)

    Rowley, Robert

    2009-01-01

    With government stimulus money becoming available to encourage healthcare facilities to adopt electronic health record (EHR) systems, the decision to move forward with implementing an EHR system has taken on an urgency not previously seen. The EHR landscape is evolving rapidly and the underlying technology platform is becoming increasingly interconnected. One must make sure that an EHR decision does not lock one into technological obsolescence. The best approach for evaluating an EHR is on the basis of usability, interoperability, and affordability.

  18. A multi-frequency approach to free electron lasers driven by short electron bunches

    International Nuclear Information System (INIS)

    Piovella, Nicola

    1997-01-01

    A multi-frequency model for free electron lasers (FELs), based on the Fourier decomposition of the radiation field coupled with the beam electrons, is discussed. We show that the multi-frequency approach allows for an accurate description of the evolution of the radiation spectrum, even when the FEL is driven by short electron bunches of arbitrary longitudinal profile. We derive from the multi-frequency model, by averaging over one radiation period, the usual FEL equations modelling the slippage between radiation and particles and describing the super-radiant regime in high-gain FELs. As an example of application of the multi-frequency model, we discuss the coherent spontaneous emission (CSE) from short electron bunches

  19. How do urban households in China respond to increasing block pricing in electricity? Evidence from a fuzzy regression discontinuity approach

    International Nuclear Information System (INIS)

    Zhang, Zibin; Cai, Wenxin; Feng, Xiangzhao

    2017-01-01

    China has been the largest electricity-consuming country since it passed the United States in 2011. Residential electricity consumption in China grew by 381.35% (12.85% per annum) between 2000 and 2013. In order to deal with the rapid growth in residential electricity consumption, an increasing block pricing policy was introduced for residential electricity consumers in China on July 1st, 2012. Using difference-in-differences models with a fuzzy regression discontinuity design, we estimate the causal effect of price on electricity consumption for urban households during the introduction of the increasing block pricing policy in Guangdong province of China. We find that consumers do not respond to a smaller (approximately 8%) increase in marginal price. However, consumers do respond to a larger increase in marginal price. An approximately 40% increase in marginal price induces an approximately 35% decrease in electricity use (284 kW h per month). Our results suggest that although increasing block pricing can affect the behavior of households with higher electricity use, there is only limited potential for overall energy conservation. - Highlights: • Estimate electricity consumption changes in response to the IBP in China. • Employ quasi-experimental approach and micro household-level data in China. • Households do not respond to a smaller increase in marginal price. • 40% increase in marginal price induces a 35% decrease in electricity use.

  20. A Short-Term and High-Resolution System Load Forecasting Approach Using Support Vector Regression with Hybrid Parameters Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Huaiguang [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-25

    This work proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system.
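
    A simplified two-step search in the same spirit: a coarse grid to locate a promising region of (C, gamma), then a denser local grid around the winner; the paper's PSO refinement is replaced here by the local grid for brevity, and the data are synthetic.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import GridSearchCV

        rng = np.random.default_rng(11)
        X = rng.uniform(-3, 3, size=(400, 2))
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.1, 400)

        # step 1: coarse grid traverse over the global parameter space
        coarse = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}
        step1 = GridSearchCV(SVR(), coarse, cv=5).fit(X, y)
        C0, g0 = step1.best_params_["C"], step1.best_params_["gamma"]

        # step 2: dense local search around the coarse winner
        fine = {"C": np.linspace(C0 / 2, C0 * 2, 9),
                "gamma": np.linspace(g0 / 2, g0 * 2, 9)}
        step2 = GridSearchCV(SVR(), fine, cv=5).fit(X, y)
        print(step2.best_params_, step2.best_score_)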

  1. Impact of a New Law to Reduce the Legal Blood Alcohol Concentration Limit - A Poisson Regression Analysis and Descriptive Approach.

    Science.gov (United States)

    Nistal-Nuño, Beatriz

    2017-03-31

    In Chile, a new law introduced in March 2012 lowered the blood alcohol concentration (BAC) limit for impaired drivers from 0.1% to 0.08% and the BAC limit for driving under the influence of alcohol from 0.05% to 0.03%, but its effectiveness remains uncertain. The goal of this investigation was to evaluate the effects of this enactment on road traffic injuries and fatalities in Chile. This was a retrospective cohort study. Data were analyzed using a descriptive approach and generalized linear models, specifically Poisson regression, to analyze deaths and injuries in a series of additive log-linear models accounting for the effects of law implementation, month influence, a linear time trend and population exposure. A review of national databases in Chile was conducted from 2003 to 2014 to evaluate the monthly rates of traffic fatalities and injuries associated with alcohol and in total. A decrease of 28.1 percent was observed in the monthly rate of traffic fatalities related to alcohol as compared to before the law implemented in 2012 in Chile. Chile experienced a significant reduction in alcohol-related traffic fatalities and injuries, making this a successful public health intervention.
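
    A stylized interrupted-time-series Poisson model of the kind described, with a post-law indicator, month dummies, a linear trend and a population offset; the simulated counts and coefficients are illustrative, not the Chilean data.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(12)
        months = pd.date_range("2003-01-01", "2014-12-01", freq="MS")
        df = pd.DataFrame({"t": np.arange(len(months)),
                           "month": months.month,
                           "pop": 16e6 * (1 + 0.001 * np.arange(len(months)))})
        df["law"] = (months >= "2012-03-01").astype(int)
        # simulate a ~28% drop in the fatality rate after the law (log RR = -0.33)
        mu = np.exp(-11.5 + 0.001 * df["t"] - 0.33 * df["law"] + np.log(df["pop"]))
        df["deaths"] = rng.poisson(mu)

        fit = smf.glm("deaths ~ law + t + C(month)", data=df,
                      family=sm.families.Poisson(),
                      offset=np.log(df["pop"])).fit()
        print(np.exp(fit.params["law"]))   # rate ratio after the law, ~0.72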

  2. Prediction of Currency Volume Issued in Taiwan Using a Hybrid Artificial Neural Network and Multiple Regression Approach

    Directory of Open Access Journals (Sweden)

    Yuehjen E. Shao

    2013-01-01

    Full Text Available Because the volume of currency issued by a country always affects its interest rate, price index, income levels, and many other important macroeconomic variables, the prediction of the volume of currency issued has attracted considerable attention in recent years. In contrast to the typical single-stage forecast model, this study proposes a hybrid forecasting approach to predict the volume of currency issued in Taiwan. The proposed hybrid models consist of artificial neural network (ANN) and multiple regression (MR) components. The MR component of the hybrid models is used to select a smaller set of the more important explanatory variables. The ANN component is then designed to generate forecasts based on those important explanatory variables. Subsequently, the model is used to analyze a real dataset of Taiwan's currency from 1996 to 2011 with twenty associated explanatory variables. The prediction results reveal that the proposed hybrid scheme exhibits superior forecasting performance for predicting the volume of currency issued in Taiwan.
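
    A minimal two-stage hybrid along these lines: multiple regression screens explanatory variables by p-value, then a small neural network forecasts from the retained ones; the 0.05 threshold, network size and data are arbitrary stand-ins, not the study's specification.

        import numpy as np
        import statsmodels.api as sm
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(13)
        n, p = 190, 20                                   # ~16 years of monthly data
        X = rng.normal(size=(n, p))
        y = 3 * X[:, 0] - 2 * X[:, 4] + np.sin(X[:, 9]) + rng.normal(0, 0.5, n)

        # MR stage: keep explanatory variables with significant OLS coefficients
        ols = sm.OLS(y, sm.add_constant(X)).fit()
        keep = np.flatnonzero(ols.pvalues[1:] < 0.05)    # drop the constant's p-value
        print("selected:", keep)

        # ANN stage: forecast from the retained variables only
        ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                           random_state=0).fit(X[:150][:, keep], y[:150])
        print(ann.score(X[150:][:, keep], y[150:]))      # out-of-sample R^2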

  3. Seemingly Unrelated Regression Approach for GSTARIMA Model to Forecast Rain Fall Data in Malang Southern Region Districts

    Directory of Open Access Journals (Sweden)

    Siti Choirun Nisak

    2016-06-01

    Full Text Available Time series forecasting models can be used to predict phenomena that occur in nature. Generalized Space Time Autoregressive (GSTAR) is a time series model used to forecast data consisting of elements of time and space. This model is limited to stationary and non-seasonal data. Generalized Space Time Autoregressive Integrated Moving Average (GSTARIMA) is a development of the GSTAR model that accommodates non-stationary and seasonal data. Ordinary Least Squares (OLS) is the method commonly used to estimate the parameters of the GSTARIMA model. Estimating the parameters of the GSTARIMA model using OLS does not produce an efficient estimator if the errors are correlated between spaces. OLS assumes a constant error variance-covariance matrix, ε ~ N(0, σ²I), but in fact the observation sites are correlated, so that the variance-covariance matrix of the errors is not constant. Therefore, the Seemingly Unrelated Regression (SUR) approach, which assumes ε ~ N(0, Σ), is used to accommodate this weakness of OLS in estimating the parameters of the GSTARIMA model. The estimation method used under SUR is Generalized Least Squares (GLS). Application of the GSTARIMA-SUR model to rainfall data in the Malang region yielded a GSTARIMA ((1)(1,12,36),(0),(1))-SUR model with an average coefficient of determination of 57.726%.

  4. Materials and processing approaches for foundry-compatible transient electronics

    Science.gov (United States)

    Chang, Jan-Kai; Fang, Hui; Bower, Christopher A.; Song, Enming; Yu, Xinge; Rogers, John A.

    2017-07-01

    Foundry-based routes to transient silicon electronic devices have the potential to serve as the manufacturing basis for “green” electronic devices, biodegradable implants, hardware secure data storage systems, and unrecoverable remote devices. This article introduces materials and processing approaches that enable state-of-the-art silicon complementary metal-oxide-semiconductor (CMOS) foundries to be leveraged for high-performance, water-soluble forms of electronics. The key elements are (i) collections of biodegradable electronic materials (e.g., silicon, tungsten, silicon nitride, silicon dioxide) and device architectures that are compatible with manufacturing procedures currently used in the integrated circuit industry, (ii) release schemes and transfer printing methods for integration of multiple ultrathin components formed in this way onto biodegradable polymer substrates, and (iii) planarization and metallization techniques to yield interconnected and fully functional systems. Various CMOS devices and circuit elements created in this fashion and detailed measurements of their electrical characteristics highlight the capabilities. Accelerated dissolution studies in aqueous environments reveal the chemical kinetics associated with the underlying transient behaviors. The results demonstrate the technical feasibility for using foundry-based routes to sophisticated forms of transient electronic devices, with functional capabilities and cost structures that could support diverse applications in the biomedical, military, industrial, and consumer industries.

  5. Analytic approach to auroral electron transport and energy degradation

    International Nuclear Information System (INIS)

    Stamnes, K.

    1980-01-01

    The interaction of a beam of auroral electrons with the atmosphere is described by the linear transport equation, encompassing discrete energy loss, multiple scattering, and secondary electrons. A solution to the transport equation provides the electron intensity as a function of altitude, pitch angle (with respect to the geomagnetic field) and energy. A multi-stream (discrete ordinate) approximation to the transport equation is developed. An analytic solution is obtained in this approximation. The computational scheme obtained by combining the present transport code with the energy degradation method of Swartz (1979) conserves energy identically. The theory provides a framework within which angular distributions can be easily calculated and interpreted. Thus, a detailed study of the angular distributions of 'non-absorbed' electrons (i.e., electrons that have lost just a small fraction of their incident energy) reveals a systematic variation with incident angle and energy, and with penetration depth. The present approach also gives simple yet accurate solutions in low order multi-stream approximations. The accuracy of the four-stream approximation is generally within a few per cent, whereas two-stream results for backscattered mean intensities and fluxes are accurate to within 10-15%. (author)

  6. Path-integral approach to resonant electron-molecule scattering

    International Nuclear Information System (INIS)

    Winterstetter, M.; Domcke, W.

    1993-01-01

    A path-integral formulation of resonant electron-molecule scattering is developed within the framework of the projection-operator formalism of scattering theory. The formation and decay of resonances is treated in real time as a quantum-mechanical electronic-tunneling process, modified by the coupling of the electronic motion with the nuclear degrees of freedom. It is shown that the electronic continuum can be summed over in the path-integral formulation, resulting formally in the path integral for an effective two-state system with coupling to vibrations. The harmonic-oscillator approximation is adopted for the vibrational motion in the present work. Approximation methods are introduced which render the numerical evaluation of the sum over paths feasible for up to ∼10³ elementary time slices. The theory is numerically realized for simple but nontrivial models representing the ²Πg d-wave shape resonance in e⁻ + N₂ collisions and the ²Σu⁺ p-wave shape resonance in e⁻ + H₂ collisions, respectively. The accuracy of the path-integral results is assessed by comparison with exact numerical reference data for these models. The essential virtue of the path-integral approach is the fact that the computational effort scales at most linearly with the number of vibrational degrees of freedom. The path-integral method is thus well suited to treat electron collisions with polyatomic molecules and molecular aggregates

  7. Electronic excitation of atoms and molecules by electron impact in a linear algebraic, separable potential approach

    International Nuclear Information System (INIS)

    Collins, L.A.; Schneider, B.I.

    1984-01-01

    The linear algebraic, separable potential approach is applied to the electronic excitation of atoms and molecules by electron impact. By representing the exchange and off-diagonal direct terms on a basis, the standard set of coupled inelastic equations is reduced to a set of elastic inhomogeneous equations. The procedure greatly simplifies the formulation by allowing a large portion of the problem to be handled by standard bound-state techniques and by greatly reducing the order of the scattering equations that must be solved. Application is made to the excitation of atomic hydrogen in the three-state close-coupling (1s, 2s, 2p) approximation. (author)

  8. Regression Phalanxes

    OpenAIRE

    Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.

    2017-01-01

    Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensional data.

  9. Reconstruction of Local Sea Levels at South West Pacific Islands—A Multiple Linear Regression Approach (1988-2014)

    Science.gov (United States)

    Kumar, V.; Melet, A.; Meyssignac, B.; Ganachaud, A.; Kessler, W. S.; Singh, A.; Aucan, J.

    2018-02-01

    Rising sea levels are a critical concern in small island nations. The problem is especially serious in the western south Pacific, where the total sea level rise over the last 60 years has been up to 3 times the global average. In this study, we aim at reconstructing sea levels at selected sites in the region (Suva, Lautoka—Fiji, and Nouméa—New Caledonia) as a multilinear regression (MLR) of atmospheric and oceanic variables. We focus on sea level variability at interannual-to-interdecadal time scales, and trend over the 1988-2014 period. Local sea levels are first expressed as a sum of steric and mass changes. Then a dynamical approach is used based on wind stress curl as a proxy for the thermosteric component, as wind stress curl anomalies can modulate the thermocline depth and resultant sea levels via Rossby wave propagation. Statistically significant predictors among wind stress curl, halosteric sea level, zonal/meridional wind stress components, and sea surface temperature are used to construct a MLR model simulating local sea levels. Although we are focusing on the local scale, the global mean sea level needs to be adjusted for. Our reconstructions provide insights on key drivers of sea level variability at the selected sites, showing that while local dynamics and the global signal modulate sea level to a given extent, most of the variance is driven by regional factors. On average, the MLR model is able to reproduce 82% of the variance in island sea level, and could be used to derive local sea level projections via downscaling of climate models.
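
    A skeleton of the MLR step under invented data: regress a local sea level anomaly series on candidate predictors and retain the statistically significant ones; the four predictor columns merely stand in for wind stress curl, the halosteric component, wind stress and SST.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(14)
        n = 27 * 12                                      # monthly samples, 1988-2014
        drivers = rng.normal(size=(n, 4))                # curl, halosteric, tau_x, sst
        beta = np.array([12.0, 5.0, 3.0, 2.0])           # mm per unit predictor (made up)
        sla = drivers @ beta + rng.normal(0, 8, n)       # local sea level anomaly

        mlr = sm.OLS(sla, sm.add_constant(drivers)).fit()
        print(mlr.rsquared)                              # share of variance explained
        keep = mlr.pvalues[1:] < 0.05                    # significant predictors only
        print(keep)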

  10. Percentile-Based ETCCDI Temperature Extremes Indices for CMIP5 Model Output: New Results through Semiparametric Quantile Regression Approach

    Science.gov (United States)

    Li, L.; Yang, C.

    2017-12-01

    Climate extremes often manifest as rare events in terms of surface air temperature and precipitation with an annual reoccurrence period. In order to represent the manifold characteristics of climate extremes for monitoring and analysis, the Expert Team on Climate Change Detection and Indices (ETCCDI) has worked out a set of 27 core indices based on daily temperature and precipitation data, describing extreme weather and climate events on an annual basis. The CLIMDEX project (http://www.climdex.org) has produced public-domain datasets of such indices for data from a variety of sources, including output from global climate models (GCMs) participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Among the 27 ETCCDI indices, there are six percentile-based temperature extremes indices that fall into two groups: exceedance rates (ER) (TN10p, TN90p, TX10p and TX90p) and durations (CSDI and WSDI). Percentiles must be estimated prior to the calculation of the indices, and could be more or less biased by the adopted algorithm. Such biases will in turn be propagated to the final results of the indices. CLIMDEX used an empirical quantile estimator combined with a bootstrap resampling procedure to reduce the inhomogeneity in the annual series of the ER indices. However, some problems remain in the CLIMDEX datasets, namely overestimated climate variability due to unaccounted autocorrelation in the daily temperature data, seasonally varying biases, and inconsistency between the algorithms applied to the ER indices and to the duration indices. We now present new results for the six indices through a semiparametric quantile regression approach for the CMIP5 model output. By using the base-period data as a whole and taking seasonality and autocorrelation into account, this approach successfully addresses the aforementioned issues and produces consistent results. The new datasets cover the historical and three projected (RCP2.6, RCP4.5 and RCP

  11. Modeling Electronic Circular Dichroism within the Polarizable Embedding Approach

    DEFF Research Database (Denmark)

    Nørby, Morten S; Olsen, Jógvan Magnus Haugaard; Steinmann, Casper

    2017-01-01

    We present a systematic investigation of the key components needed to model single chromophore electronic circular dichroism (ECD) within the polarizable embedding (PE) approach. By relying on accurate forms of the embedding potential, where especially the inclusion of local field effects is in focus, we show that qualitative agreement between rotatory strength parameters calculated by full quantum mechanical calculations and the more efficient embedding calculations can be obtained. An important aspect in the computation of reliable absorption parameters is the need for conformational sampling. We show that a significant number of snapshots are needed to avoid artifacts in the calculated electronic circular dichroism parameters due to insufficient configurational sampling, thus highlighting the efficiency of the PE model.

  12. Does the Magnitude of the Link between Unemployment and Crime Depend on the Crime Level? A Quantile Regression Approach

    Directory of Open Access Journals (Sweden)

    Horst Entorf

    2015-07-01

    Full Text Available Two alternative hypotheses – referred to as opportunity- and stigma-based behavior – suggest that the magnitude of the link between unemployment and crime also depends on preexisting local crime levels. In order to analyze conjectured nonlinearities between both variables, we use quantile regressions applied to German district panel data. While both conventional OLS and quantile regressions confirm the positive link between unemployment and crime for property crimes, results for assault differ with respect to the method of estimation. Whereas conventional mean regressions do not show any significant effect (which would confirm the usual result found for violent crimes in the literature), quantile regression reveals that the size and importance of the relationship are conditional on the crime rate. The partial effect is significantly positive for moderately low and median quantiles of local assault rates.

  13. Comparison of two regression-based approaches for determining nutrient and sediment fluxes and trends in the Chesapeake Bay watershed

    Science.gov (United States)

    Moyer, Douglas; Hirsch, Robert M.; Hyer, Kenneth

    2012-01-01

    Nutrient and sediment fluxes and changes in fluxes over time are key indicators that water resource managers can use to assess the progress being made in improving the structure and function of the Chesapeake Bay ecosystem. The U.S. Geological Survey collects annual nutrient (nitrogen and phosphorus) and sediment flux data and computes trends that describe the extent to which water-quality conditions are changing within the major Chesapeake Bay tributaries. Two regression-based approaches were compared for estimating annual nutrient and sediment fluxes and for characterizing how these annual fluxes are changing over time. The two regression models compared are the traditionally used ESTIMATOR and the newly developed Weighted Regression on Time, Discharge, and Season (WRTDS). The model comparison focused on answering three questions: (1) What are the differences between the functional form and construction of each model? (2) Which model produces estimates of flux with the greatest accuracy and least amount of bias? (3) How different would the historical estimates of annual flux be if WRTDS had been used instead of ESTIMATOR? One additional point of comparison between the two models is how each model determines trends in annual flux once the year-to-year variations in discharge have been determined. All comparisons were made using total nitrogen, nitrate, total phosphorus, orthophosphorus, and suspended-sediment concentration data collected at the nine U.S. Geological Survey River Input Monitoring stations located on the Susquehanna, Potomac, James, Rappahannock, Appomattox, Pamunkey, Mattaponi, Patuxent, and Choptank Rivers in the Chesapeake Bay watershed. Two model characteristics that uniquely distinguish ESTIMATOR and WRTDS are the fundamental model form and the determination of model coefficients. ESTIMATOR and WRTDS both predict water-quality constituent concentration by developing a linear relation between the natural logarithm of observed constituent
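
    A hedged sketch of the general log-linear form that ESTIMATOR-type models share (the natural log of concentration regressed on the log of discharge, a time trend, and seasonal harmonics) follows. WRTDS differs by weighting observations locally in time, discharge, and season; that weighting, and the usual retransformation bias correction, are omitted here, and the data are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
t = np.linspace(0, 25, n)                      # decimal years
lnQ = rng.normal(4.0, 0.8, n)                  # ln of daily discharge
lnC = (0.5 * lnQ - 0.01 * t + 0.2 * np.sin(2 * np.pi * t)
       + 0.1 * rng.standard_normal(n))         # ln of concentration

X = sm.add_constant(np.column_stack(
    [lnQ, t, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)]))
fit = sm.OLS(lnC, X).fit()
flux = np.exp(fit.predict(X)) * np.exp(lnQ)    # concentration x discharge (bias correction omitted)
```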

  14. The nuisance of nuisance regression: spectral misspecification in a common approach to resting-state fMRI preprocessing reintroduces noise and obscures functional connectivity.

    Science.gov (United States)

    Hallquist, Michael N; Hwang, Kai; Luna, Beatriz

    2013-11-15

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent reintroduction of nuisance-related variation into frequencies previously suppressed by the bandpass filter, as well as suboptimal correction for noise signals in the frequencies of interest. This is important because many RS-fcMRI studies, including some focusing on motion-related artifacts, have applied this approach. In two cohorts of individuals (n=117 and 22) who completed resting-state fMRI scans, we found that the bandpass-regress approach consistently overestimated functional connectivity across the brain, typically on the order of r=.10-.35, relative to a simultaneous bandpass filtering and nuisance regression approach. Inflated correlations under the bandpass-regress approach were associated with head motion and cardiac artifacts. Furthermore, distance-related differences in the association of head motion and connectivity estimates were much weaker for the simultaneous filtering approach. We recommend that future RS-fcMRI studies ensure that the frequencies of nuisance regressors and fMRI data match prior to nuisance regression, and we advocate a simultaneous bandpass filtering and nuisance regression strategy that better controls nuisance-related variability. Copyright © 2013 Elsevier Inc. All rights reserved.
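
    A minimal sketch of the recommended fix, assuming a 0.009-0.08 Hz pass-band and TR = 2 s (both illustrative assumptions): apply the same band-pass filter to the nuisance regressors as to the fMRI time series before regressing them out, so that no nuisance variance is reintroduced outside the pass-band.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, low=0.009, high=0.08, tr=2.0, order=2):
    # zero-phase Butterworth band-pass along the time axis
    b, a = butter(order, [low, high], btype="band", fs=1.0 / tr)
    return filtfilt(b, a, x, axis=0)

rng = np.random.default_rng(3)
n_vols = 300
bold = rng.standard_normal((n_vols, 1000))   # time x voxels
nuisance = rng.standard_normal((n_vols, 8))  # motion parameters, WM/CSF signals

bold_f = bandpass(bold)
nuis_f = bandpass(nuisance)                  # filter the regressors identically
beta, *_ = np.linalg.lstsq(nuis_f, bold_f, rcond=None)
cleaned = bold_f - nuis_f @ beta             # matched-frequency nuisance regression
```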

  15. Problems and Projects Based Approach For Analog Electronic Circuits' Course

    Directory of Open Access Journals (Sweden)

    Vahé Nerguizian

    2009-04-01

    Full Text Available New educational methods and approaches are recently introduced and implemented at several North American and European universities using Problems and Projects Based Approach (PPBA. The PPBA employs a teaching technique based mostly on competences/skills rather than only on knowledge. This method has been implemented and proven by several pedagogical instructors and authors at several educational institutions. This approach is used at different disciplines such as medicine, biology, engineering and many others. It has the advantage to improve the student's skills and the knowledge retention rate, and reflects the 21st century industrial/company needs and demands. Before implementing this approach to a course, a good resources preparation and planning is needed upfront by the responsible or instructor of the course to achieve the course and students related objectives. This paper presents the preparation, the generated documentation and the implementation of a pilot project utilizing PPBA education for a second year undergraduate electronic course over a complete semester, and for two different class groups (morning and evening groups. The outcome of this project (achieved goals, observed difficulties and lessons learned is presented based on different tools such as students 'in class' communication and feedback, different course evaluation forms and the professor/instructor feedback. Resources, challenges, difficulties and recommendations are also assessed and presented. The impact, the effect and the results (during and at the end of the academic fall session of the PPBA on students and instructor are discussed, validated, managed and communicated to help other instructor in taking appropriate approach decisions with respect to this new educational approach compared to the classical one.

  16. Exciton Scattering approach for conjugated macromolecules: from electronic spectra to electron-phonon coupling

    Science.gov (United States)

    Tretiak, Sergei

    2014-03-01

    The exciton scattering (ES) technique is a multiscale approach developed for efficient calculations of excited-state electronic structure and optical spectra in low-dimensional conjugated macromolecules. Within the ES method, the electronic excitations in the molecular structure are attributed to standing waves representing quantum quasi-particles (excitons), which reside on the graph. The exciton propagation on the linear segments is characterized by the exciton dispersion, whereas the exciton scattering on the branching centers is determined by the energy-dependent scattering matrices. Using these ES energetic parameters, the excitation energies are then found by solving a set of generalized "particle in a box" problems on the graph that represents the molecule. All parameters can be extracted from quantum-chemical computations of small molecular fragments and tabulated in the ES library for further applications. Subsequently, spectroscopic modeling for any macrostructure within the considered molecular family can be performed with negligible numerical effort. The exciton scattering properties of molecular vertices can be further described by tight-binding or, equivalently, lattice models. The on-site energies and hopping constants are obtained from the exciton dispersion and scattering matrices. Such a tight-binding model approach is particularly useful for describing exciton-phonon coupling, energetic disorder and incoherent energy transfer in large branched conjugated molecules. Overall, the ES applications accurately reproduce the optical spectra compared to the reference quantum chemistry results, and make it possible to predict spectra of complex macromolecules for which conventional electronic structure calculations are unfeasible.
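
    To make the lattice-model idea concrete, here is a minimal sketch (all numbers are illustrative assumptions, not ES-library parameters): a tight-binding Hamiltonian for a three-arm branched graph, with uniform on-site energies and hopping constants, solved by direct diagonalization.

```python
import numpy as np

n_per_arm, eps, t = 5, 0.0, -1.0           # sites per arm, on-site energy, hopping
n = 3 * n_per_arm + 1                      # three arms plus the branching site (site 0)
H = np.diag(np.full(n, eps))
for arm in range(3):
    start = 1 + arm * n_per_arm
    H[0, start] = H[start, 0] = t          # anchor each arm at the branch point
    for i in range(start, start + n_per_arm - 1):
        H[i, i + 1] = H[i + 1, i] = t      # chain hopping along the arm

energies = np.linalg.eigvalsh(H)           # excitation spectrum of the branched structure
print(energies[:5])
```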

  17. Translating Response During Therapy into Ultimate Treatment Outcome: A Personalized 4-Dimensional MRI Tumor Volumetric Regression Approach in Cervical Cancer

    International Nuclear Information System (INIS)

    Mayr, Nina A.; Wang, Jian Z.; Lo, Simon S.; Zhang Dongqing; Grecula, John C.; Lu Lanchun; Montebello, Joseph F.; Fowler, Jeffrey M.; Yuh, William T.C.

    2010-01-01

    Purpose: To assess individual volumetric tumor regression patterns in cervical cancer during therapy using serial four-dimensional MRI and to define the regression parameters' prognostic value validated with local control and survival correlation. Methods and Materials: One hundred and fifteen patients with Stage IB2-IVA cervical cancer treated with radiation therapy (RT) underwent serial MRI before (MRI 1) and during RT, at 2-2.5 weeks (MRI 2, at 20-25 Gy), and at 4-5 weeks (MRI 3, at 40-50 Gy). Eighty patients had a fourth MRI 1-2 months post-RT. Mean follow-up was 5.3 years. Tumor volume was measured by MRI-based three-dimensional volumetry and plotted as dose(time)/volume regression curves. Volume regression parameters were correlated with local control, disease-specific survival, and overall survival. Results: Residual tumor volume, slope, and area under the regression curve correlated significantly with local control and survival. Residual volumes ≥20% at 40-50 Gy were independently associated with inferior 5-year local control (53% vs. 97%, p <0.001) and disease-specific survival rates (50% vs. 72%, p = 0.009) compared with smaller volumes. Patients with post-RT residual volumes ≥10% had 0% local control and 17% disease-specific survival, compared with 91% and 72% for <10% volume (p <0.001). Conclusion: Using more accurate four-dimensional volumetric regression analysis, tumor response can now be directly translated into individual patients' outcomes for clinical application. Our results define two temporal thresholds critically influencing local control and survival. In patients with ≥20% residual volume at 40-50 Gy and ≥10% post-RT, the risks of local failure and death are so high that aggressive intervention may be warranted.

  18. Assessing the response of area burned to changing climate in western boreal North America using a Multivariate Adaptive Regression Splines (MARS) approach

    Science.gov (United States)

    Michael S. Balshi; A. David McGuire; Paul Duffy; Mike Flannigan; John Walsh; Jerry Melillo

    2009-01-01

    We developed temporally and spatially explicit relationships between air temperature and fuel moisture codes derived from the Canadian Fire Weather Index System to estimate annual area burned at 2.5° (latitude × longitude) resolution using a Multivariate Adaptive Regression Splines (MARS) approach across Alaska and Canada. Burned area was...

  19. New developments in fruit and vegetables consumption in the period 1999-2004 in Denmark - a quantile regression approach

    DEFF Research Database (Denmark)

    Hansen, Aslak Hedemann

    2008-01-01

    The development in the consumption of fruit and vegetables in the period 1999-2004 in Denmark was investigated using quantile regression and two previously overlooked problems were identified. First, the change in the ten percent quantile samples decreased. This could have been caused by changes ...

  20. Predicting attention-deficit/hyperactivity disorder severity from psychosocial stress and stress-response genes : A random forest regression approach

    NARCIS (Netherlands)

    Van Der Meer, D.; Hoekstra, P. J.; Van Donkelaar, M.; Bralten, J.; Oosterlaan, J.; Heslenfeld, D.; Faraone, S. V.; Franke, B.; Buitelaar, J. K.; Hartman, C. A.

    2017-01-01

    Identifying genetic variants contributing to attention-deficit/hyperactivity disorder (ADHD) is complicated by the involvement of numerous common genetic variants with small effects, interacting with each other as well as with environmental factors, such as stress exposure. Random forest regression

  1. A Generalized Least Squares Regression Approach for Computing Effect Sizes in Single-Case Research: Application Examples

    Science.gov (United States)

    Maggin, Daniel M.; Swaminathan, Hariharan; Rogers, Helen J.; O'Keeffe, Breda V.; Sugai, George; Horner, Robert H.

    2011-01-01

    A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of…
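
    A hedged sketch of the statistical core, assuming a simple phase-coded single-case series with AR(1) errors (the authors' exact effect-size scaling is not reproduced): statsmodels' GLSAR estimates the autocorrelation and the regression parameters jointly.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 30
phase = (np.arange(n) >= 15).astype(float)   # baseline (0) then intervention (1)
e = np.zeros(n)
for i in range(1, n):                        # AR(1) errors, rho = 0.5
    e[i] = 0.5 * e[i - 1] + rng.standard_normal()
y = 2.0 + 1.5 * phase + e

model = sm.GLSAR(y, sm.add_constant(phase), rho=1)
res = model.iterative_fit(maxiter=10)        # alternates OLS and AR(1) estimation
print(res.params, model.rho)                 # phase coefficient ~ treatment effect
```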

  2. Predicting attention-deficit/hyperactivity disorder severity from psychosocial stress and stress-response genes : a random forest regression approach

    NARCIS (Netherlands)

    van der Meer, D.; Hoekstra, P. J.; van Donkelaar, Marjolein M. J.; Bralten, Janita; Oosterlaan, J; Heslenfeld, Dirk J.; Faraone, S. V.; Franke, B.; Buitelaar, J. K.; Hartman, C. A.

    2017-01-01

    Identifying genetic variants contributing to attention-deficit/hyperactivity disorder (ADHD) is complicated by the involvement of numerous common genetic variants with small effects, interacting with each other as well as with environmental factors, such as stress exposure. Random forest regression

  3. A comparative study between nonlinear regression and artificial neural network approaches for modelling wild oat (Avena fatua) field emergence

    Science.gov (United States)

    Non-linear regression techniques are used widely to fit weed field emergence patterns to soil microclimatic indices using S-type functions. Artificial neural networks present interesting and alternative features for such modeling purposes. In this work, a univariate hydrothermal-time based Weibull m...

  4. Comparison of autoregressive (AR) strategy with that of regression approach for determining ozone layer depletion as a physical process

    International Nuclear Information System (INIS)

    Yousufzai, M.A.K; Aansari, M.R.K.; Quamar, J.; Iqbal, J.; Hussain, M.A.

    2010-01-01

    This communication presents the development of a comprehensive characterization of the ozone layer depletion (OLD) phenomenon as a physical process, in the form of mathematical models comprising the usual regression, multiple or polynomial regression, and a stochastic strategy. The relevance of these models has been illustrated using predicted values of different parameters under a changing environment. The information obtained from such analysis can be employed to alter the possible factors and variables to achieve optimum performance. This kind of analysis initiates a study towards formulating the phenomenon of OLD as a physical process with special reference to the stratospheric region of Pakistan. The data presented here establish that an autoregressive (AR) model of OLD as a physical process is a more appropriate choice than the usual regression. The data reported in the literature suggest quantitatively that OLD is occurring in our region. For this purpose we have modeled this phenomenon using the data recorded at the Geophysical Centre Quetta during the period 1960-1999. The predictions made by this analysis are useful for public, private and other relevant organizations. (author)
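
    A hedged sketch contrasting the two strategies named in the abstract, on a synthetic monthly series (the Quetta record is not reproduced): an ordinary trend regression versus an autoregressive model with trend.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(5)
n = 480                                   # 40 years of monthly values
t = np.arange(n)
ozone = 300 - 0.02 * t + 0.5 * rng.standard_normal(n).cumsum()  # drifting series

trend_fit = sm.OLS(ozone, sm.add_constant(t)).fit()   # usual regression on time
ar_fit = AutoReg(ozone, lags=12, trend="ct").fit()    # AR(12) with constant + trend
print(trend_fit.params)
print(ar_fit.aic)    # information criteria let the two strategies be compared
```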

  5. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression.

  6. A practical approach for electron monitor unit calculation

    International Nuclear Information System (INIS)

    Choi, David; Patyal, Baldev; Cho, Jongmin; Cheng, Ing Y; Nookala, Prashanth

    2009-01-01

    Electron monitor unit (MU) calculation requires measured beam data such as the relative output factor (ROF) of a cone, the insert correction factor (ICF) and the effective source-to-surface distance (ESD). Measuring the beam data to cover all possible clinical cases is not practical for a busy clinic because it takes tremendous time and labor. In this study, we propose a practical approach to reduce the number of data measurements without affecting accuracy. It is based on two findings on the dosimetric properties of electron beams. One is that the output ratio of two inserts is independent of the cone used, and the other is that ESD is a function of field size but independent of cone and jaw opening. For the measurements to prove the findings, a parallel plate ion chamber (Markus, PTW 23343) with an electrometer (Cardinal Health 35040) was used. We measured the outputs to determine ROF, ICF and ESD at different energies (5-21 MeV). Measurements were made in a Plastic Water(TM) phantom or in water. Three linear accelerators were used: Siemens MD2 (S/N 2689), Siemens Primus (S/N 3305) and Varian Clinic 21-EX (S/N 1495). With these findings, the amount of data to be measured can be reduced to less than 20% of the data points. (note)
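
    A hedged sketch of a generic electron MU calculation built from the quantities named above (ROF, ICF, and an effective-SSD inverse-square correction for extended distances); this is the textbook effective-SSD formula, not necessarily the authors' exact recipe, and all numbers are placeholders, not clinical data.

```python
def electron_mu(dose_cgy, ref_output=1.0, rof=0.98, icf=0.99,
                esd=85.0, dmax=2.1, gap=5.0):
    """MU for an extended SSD = nominal SSD + gap (all lengths in cm).

    ref_output: cGy/MU at calibration conditions; rof/icf: cone and insert factors.
    """
    inv_sq = ((esd + dmax) / (esd + gap + dmax)) ** 2   # effective-SSD correction
    return dose_cgy / (ref_output * rof * icf * inv_sq)

print(round(electron_mu(200.0), 1))   # MU needed for a 200 cGy prescription
```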

  7. Inverse Problem Approach for the Alignment of Electron Tomographic Series

    International Nuclear Information System (INIS)

    Tran, V.D.; Moreaud, M.; Thiebaut, E.; Denis, L.; Becker, J.M.

    2014-01-01

    In the refining industry, morphological measurements of particles have become an essential part of the characterization of catalyst supports. Through these parameters, one can infer the specific physico-chemical properties of the studied materials. One of the main acquisition techniques is electron tomography (or nano-tomography). 3D volumes are reconstructed from sets of projections taken from different angles with a Transmission Electron Microscope (TEM). This technique provides real three-dimensional information at the nano-metric scale. A major issue in this method is the misalignment of the projections that contribute to the reconstruction. Current alignment techniques usually employ fiducial markers such as gold particles to align the images correctly. When the use of markers is not possible, the correlation between adjacent projections is used to align them. However, this method sometimes fails. In this paper, we propose a new method based on the inverse problem approach, where a certain criterion is minimized using a variant of the Nelder and Mead simplex algorithm. The proposed approach is composed of two steps. The first step consists of an initial alignment process, which relies on the minimization of a cost function based on robust statistics measuring the similarity of a projection to its previous projections in the series. It reduces strong shifts resulting from the acquisition between successive projections. In the second step, the pre-registered projections are used to initialize an iterative alignment-refinement process which alternates between (i) volume reconstructions and (ii) registrations of measured projections onto simulated projections computed from the volume reconstructed in (i). At the end of this process, we have a correct reconstruction of the volume, the projections being correctly aligned. Our method is tested on simulated data and shown to estimate accurately the translation, rotation and scale of arbitrary transforms.
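
    A hedged sketch of the inner registration step only: recover the shift that aligns a measured projection to a simulated one by minimizing a robust mismatch with scipy's Nelder-Mead simplex (a stand-in for the authors' modified simplex; rotation and scale are omitted for brevity).

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from scipy.optimize import minimize

rng = np.random.default_rng(6)
simulated = rng.random((64, 64))
measured = nd_shift(simulated, (2.3, -1.7)) + 0.01 * rng.standard_normal((64, 64))

def cost(p):
    moved = nd_shift(measured, (-p[0], -p[1]))       # undo the candidate shift
    r = (moved - simulated).ravel()
    return np.sum(np.sqrt(1 + r**2) - 1)             # robust pseudo-Huber-like loss

res = minimize(cost, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x)   # recovered shift, close to (2.3, -1.7)
```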

  8. A Poisson regression approach to model monthly hail occurrence in Northern Switzerland using large-scale environmental variables

    Science.gov (United States)

    Madonna, Erica; Ginsbourger, David; Martius, Olivia

    2018-05-01

    In Switzerland, hail regularly causes substantial damage to agriculture, cars and infrastructure; however, little is known about its long-term variability. To study this variability, the monthly number of days with hail in northern Switzerland is modeled in a regression framework using large-scale predictors derived from the ERA-Interim reanalysis. The model is developed and verified using radar-based hail observations for the extended summer season (April-September) in the period 2002-2014. The seasonality of hail is explicitly modeled with a categorical predictor (month), and monthly anomalies of several large-scale predictors are used to capture the year-to-year variability. Several regression models are applied and their performance tested with respect to standard scores and cross-validation. The chosen model includes four predictors: the monthly anomaly of the two-meter temperature, the monthly anomaly of the logarithm of the convective available potential energy (CAPE), the monthly anomaly of the wind shear, and the month. This model captures the intra-annual variability well and slightly underestimates the inter-annual variability. The regression model is applied to the reanalysis data back to 1980. The resulting hail-day time series shows an increase in the number of hail days per month, which is (in the model) related to an increase in temperature and CAPE. The trend corresponds to approximately 0.5 days per month per decade. The results of the regression model have been compared to two independent data sets. All data sets agree on the sign of the trend, but the trend is weaker in the other data sets.
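
    A hedged sketch of the described framework: monthly hail-day counts modeled with a Poisson GLM on a categorical month term plus large-scale anomaly predictors. The predictors here are synthetic stand-ins for the T2m, log-CAPE, and shear anomalies named above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 13 * 6                                            # 2002-2014, April-September
df = pd.DataFrame({
    "month": np.tile(np.arange(4, 10), 13),
    "t2m_anom": rng.standard_normal(n),
    "logcape_anom": rng.standard_normal(n),
    "shear_anom": rng.standard_normal(n),
})
mu = np.exp(0.5 + 0.3 * df.t2m_anom + 0.4 * df.logcape_anom - 0.2 * df.shear_anom)
df["hail_days"] = rng.poisson(mu)                     # synthetic monthly counts

fit = smf.glm("hail_days ~ C(month) + t2m_anom + logcape_anom + shear_anom",
              data=df, family=sm.families.Poisson()).fit()
print(fit.summary())
```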

  9. Autistic Regression

    Science.gov (United States)

    Matson, Johnny L.; Kozlowski, Alison M.

    2010-01-01

    Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…

  10. Generalized regression neural network (GRNN)-based approach for colored dissolved organic matter (CDOM) retrieval: case study of Connecticut River at Middle Haddam Station, USA.

    Science.gov (United States)

    Heddam, Salim

    2014-11-01

    The prediction of colored dissolved organic matter (CDOM) using artificial neural network approaches has received little attention in the past few decades. In this study, CDOM was modeled using generalized regression neural network (GRNN) and multiple linear regression (MLR) models as a function of water temperature (TE), pH, specific conductance (SC), and turbidity (TU). Evaluation of the prediction accuracy of the models is based on the root mean square error (RMSE), mean absolute error (MAE), coefficient of correlation (CC), and Willmott's index of agreement (d). The results indicated that GRNN can be applied successfully for the prediction of CDOM.
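
    A GRNN is essentially Nadaraya-Watson kernel regression; below is a minimal numpy version for predicting a target from four standardized predictors (the variable grouping follows the abstract; the bandwidth sigma is a free parameter to tune, and the data are synthetic).

```python
import numpy as np

def grnn_predict(X_train, y_train, X_new, sigma=0.5):
    # squared Euclidean distances between new and training patterns
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma**2))           # Gaussian pattern-layer weights
    return (w @ y_train) / w.sum(axis=1)         # weighted average of targets

rng = np.random.default_rng(8)
X = rng.random((200, 4))                         # TE, pH, SC, TU (standardized)
y = X @ np.array([0.5, -0.2, 0.8, 0.3]) + 0.05 * rng.standard_normal(200)
print(grnn_predict(X[:150], y[:150], X[150:], sigma=0.3)[:5])
```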

  11. AucPR: An AUC-based approach using penalized regression for disease prediction with high-dimensional omics data

    OpenAIRE

    Yu, Wenbao; Park, Taesung

    2014-01-01

    Motivation: It is common to seek an optimal combination of markers for disease classification and prediction when multiple markers are available. Many approaches based on the area under the receiver operating characteristic curve (AUC) have been proposed. Existing works based on AUC in a high-dimensional context depend mainly on a non-parametric, smooth approximation of AUC, with no work using a parametric AUC-based approach for high-dimensional data. Results: We propose an AUC-based approach u...

  12. A Dictionary Approach to Electron Backscatter Diffraction Indexing.

    Science.gov (United States)

    Chen, Yu H; Park, Se Un; Wei, Dennis; Newstadt, Greg; Jackson, Michael A; Simmons, Jeff P; De Graef, Marc; Hero, Alfred O

    2015-06-01

    We propose a framework for the indexing of grain and subgrain structures in electron backscatter diffraction patterns of polycrystalline materials. We discretize the domain of a dynamical forward model onto a dense grid of orientations, producing a dictionary of patterns. For each measured pattern, we identify the most similar patterns in the dictionary and use them to identify boundaries, detect anomalies, and index crystal orientations. The statistical distribution of these closest matches is used in an unsupervised binary decision tree (DT) classifier to identify grain boundaries and anomalous regions. The DT classifies a pattern as an anomaly if it has an abnormally low similarity to any pattern in the dictionary. It classifies a pixel as being near a grain boundary if the highly ranked patterns in the dictionary differ significantly over the pixel's neighborhood. Indexing is accomplished by computing the mean orientation of the closest matches to each pattern. The mean orientation is estimated using a maximum likelihood approach that models the orientation distribution as a mixture of Von Mises-Fisher distributions over the quaternionic three-sphere. The proposed dictionary matching approach permits segmentation, anomaly detection, and indexing to be performed in a unified manner, with the additional benefit of uncertainty quantification.
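
    A hedged sketch of the dictionary-matching core only: normalized inner products between a measured pattern and every dictionary pattern, then the top-k matches whose mean orientation would be estimated in the full method (random arrays stand in for simulated and measured patterns).

```python
import numpy as np

rng = np.random.default_rng(9)
dictionary = rng.random((2000, 40 * 40))          # simulated patterns, flattened
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

measured = rng.random(40 * 40)
measured /= np.linalg.norm(measured)

similarity = dictionary @ measured                # cosine similarities to all entries
top_k = np.argsort(similarity)[-40:][::-1]        # indices of the 40 best matches
# a low max(similarity) flags an anomaly; disagreement among top_k across a
# pixel's neighborhood flags proximity to a grain boundary
print(similarity[top_k[:5]])
```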

  13. Shaping the Electronic Library--The UW-Madison Approach.

    Science.gov (United States)

    Dean, Charles W., Ed.; Frazier, Ken; Pope, Nolan F.; Gorman, Peter C.; Dentinger, Sue; Boston, Jeanne; Phillips, Hugh; Daggett, Steven C.; Lundquist, Mitch; McClung, Mark; Riley, Curran; Allan, Craig; Waugh, David

    1998-01-01

    This special theme section describes the University of Wisconsin-Madison's experience building its Electronic Library. Highlights include integrating resources and services; the administrative framework; the public electronic library, including electronic publishing capability and access to World Wide Web-based and other electronic resources;…

  14. Molecular self-assembly approaches for supramolecular electronic and organic electronic devices

    Science.gov (United States)

    Yip, Hin-Lap

    Molecular self-assembly represents an efficient bottom-up strategy to generate structurally well-defined aggregates of semiconducting pi-conjugated materials. The capability of tuning the chemical structures, intermolecular interactions and nanostructures through molecular engineering and novel materials processing makes it possible to tailor a large number of unprecedented properties such as charge transport, energy transfer and light harvesting. This approach does not only benefit traditional electronic devices based on bulk materials, but has also generated a new research area, so-called "supramolecular electronics", in which electronic devices are built up from individual supramolecular nanostructures with sizes in the sub-hundred-nanometer range. My work combined molecular self-assembly with several novel materials processing techniques to control the nucleation and growth of organic semiconducting nanostructures from different types of pi-conjugated materials. By tailoring the interactions between the molecules using hydrogen bonds and pi-pi stacking, semiconducting nanoplatelets and nanowires with tunable sizes can be fabricated in solution. These supramolecular nanostructures were further patterned and aligned on solid substrates through printing and chemical templating methods. The capability to control the different hierarchies of organization on surfaces provides an important platform for studying their structure-induced electronic properties. In addition to using molecular self-assembly to create different organic nanostructures, functional self-assembled monolayers (SAMs) formed by spontaneous chemisorption on surfaces were used to tune the interfacial properties in organic solar cells. Devices showed dramatically improved performance when appropriate SAMs were applied to optimize the contact properties for efficient charge collection.

  15. An analytical approach to characterize morbidity profile dissimilarity between distinct cohorts using electronic medical records.

    Science.gov (United States)

    Schildcrout, Jonathan S; Basford, Melissa A; Pulley, Jill M; Masys, Daniel R; Roden, Dan M; Wang, Deede; Chute, Christopher G; Kullo, Iftikhar J; Carrell, David; Peissig, Peggy; Kho, Abel; Denny, Joshua C

    2010-12-01

    We describe a two-stage analytical approach for characterizing morbidity profile dissimilarity among patient cohorts using electronic medical records. We capture morbidities using the International Statistical Classification of Diseases and Related Health Problems (ICD-9) codes. In the first stage of the approach separate logistic regression analyses for ICD-9 sections (e.g., "hypertensive disease" or "appendicitis") are conducted, and the odds ratios that describe adjusted differences in prevalence between two cohorts are displayed graphically. In the second stage, the results from ICD-9 section analyses are combined into a general morbidity dissimilarity index (MDI). For illustration, we examine nine cohorts of patients representing six phenotypes (or controls) derived from five institutions, each a participant in the electronic MEdical REcords and GEnomics (eMERGE) network. The phenotypes studied include type II diabetes and type II diabetes controls, peripheral arterial disease and peripheral arterial disease controls, normal cardiac conduction as measured by electrocardiography, and senile cataracts. Copyright © 2010 Elsevier Inc. All rights reserved.

  16. Penalized linear regression for discrete ill-posed problems: A hybrid least-squares and mean-squared error approach

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag; Ballal, Tarig; Kammoun, Abla; Al-Naffouri, Tareq Y.

    2016-01-01

    This paper proposes a new approach to find the regularization parameter for linear least-squares discrete ill-posed problems. In the proposed approach, an artificial perturbation matrix with a bounded norm is forced into the discrete ill-posed model

  17. Penalized linear regression for discrete ill-posed problems: A hybrid least-squares and mean-squared error approach

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag

    2016-12-19

    This paper proposes a new approach to find the regularization parameter for linear least-squares discrete ill-posed problems. In the proposed approach, an artificial perturbation matrix with a bounded norm is forced into the discrete ill-posed model matrix. This perturbation is introduced to enhance the singular-value (SV) structure of the matrix and hence to provide a better solution. The proposed approach is derived to select the regularization parameter in a way that minimizes the mean-squared error (MSE) of the estimator. Numerical results demonstrate that the proposed approach outperforms a set of benchmark methods in most cases when applied to different scenarios of discrete ill-posed problems. Jointly, the proposed approach enjoys the lowest run-time and offers the highest level of robustness amongst all the tested methods.
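
    A hedged sketch of the problem setting only: Tikhonov-regularized least squares for an ill-conditioned model, with an oracle scan over the regularization parameter for reference. The paper's contribution is choosing this parameter without knowing the truth, via a bounded perturbation of the model matrix; that selection rule is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(np.logspace(0, -8, n)) @ V.T        # rapidly decaying singular values
x_true = rng.standard_normal(n)
b = A @ x_true + 1e-4 * rng.standard_normal(n)      # noisy observations

def tikhonov(lam):
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# oracle scan: the lambda minimizing the true estimation error (known only
# because this is synthetic data)
best_err, best_lam = min(
    (np.linalg.norm(tikhonov(lam) - x_true), lam) for lam in np.logspace(-12, 0, 25)
)
print("oracle MSE-minimizing lambda:", best_lam)
```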

  18. The Spatial Association Between Federally Qualified Health Centers and County-Level Reported Sexually Transmitted Infections: A Spatial Regression Approach.

    Science.gov (United States)

    Owusu-Edusei, Kwame; Gift, Thomas L; Leichliter, Jami S; Romaguera, Raul A

    2018-02-01

    The number of categorical sexually transmitted disease (STD) clinics is declining in the United States. Federally qualified health centers (FQHCs) have the potential to supplement the needed sexually transmitted infection (STI) services. In this study, we describe the spatial distribution of FQHC sites and determine whether reported county-level nonviral STI morbidity was associated with having FQHC(s), using spatial regression techniques. We extracted map data from the Health Resources and Services Administration data warehouse on FQHCs (ie, geocoded health care service delivery [HCSD] sites) and extracted county-level data on the reported rates of chlamydia, gonorrhea and primary and secondary (P&S) syphilis (2008-2012) from surveillance data. A 3-equation seemingly unrelated regression estimation procedure (with a spatial regression specification that controlled for county-level multiyear (2008-2012) demographic and socioeconomic factors) was used to determine the association between reported county-level STI morbidity and HCSD sites. Counties with HCSD sites had higher STI, poverty, unemployment, and violent crime rates than counties with no HCSD sites (P < 0.05). The number of HCSD sites was associated (P < 0.01) with increases in the temporally smoothed rates of chlamydia, gonorrhea, and P&S syphilis, but there was no significant association between the number of HCSD sites per 100,000 population and reported STI rates. There is a positive association between STI morbidity and the number of HCSD sites; however, this association does not hold when adjusting for population size. Further work may determine the extent to which HCSD sites can meet unmet needs for safety-net STI services.

  19. Linear regression

    CERN Document Server

    Olive, David J

    2017-01-01

    This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...

  20. Plateletpheresis efficiency and mathematical correction of software-derived platelet yield prediction: A linear regression and ROC modeling approach.

    Science.gov (United States)

    Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David

    2017-10-01

    Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed the cell processor's accuracy in predicting platelet (PLT) yields with the goal of better predicting DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield and actual PLT yield were statistically evaluated. Software prediction was optimized by linear regression analysis, and its optimal cut-off for obtaining a DP was assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; on 271 (89.7%) occasions donors were men and on 31 (10.3%) women. Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486). A simple correction derived from linear regression analysis accurately corrected the software's underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
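
    A hedged sketch of the two statistical steps named in the abstract: a linear correction of the software-predicted yield, then a ROC-derived cutoff (Youden's J) for flagging likely double products. All data and the DP threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import roc_curve

rng = np.random.default_rng(11)
predicted = rng.normal(6.0, 1.0, 302)                  # software yield (x10^11 PLT)
actual = 1.1 * predicted + 0.3 + 0.4 * rng.standard_normal(302)

# step 1: linear correction of the raw software prediction
corr = LinearRegression().fit(predicted.reshape(-1, 1), actual)
corrected = corr.predict(predicted.reshape(-1, 1))

# step 2: ROC cutoff on the corrected prediction for double-product eligibility
is_double = (actual >= 6.0).astype(int)                # illustrative DP threshold
fpr, tpr, thresholds = roc_curve(is_double, corrected)
cutoff = thresholds[np.argmax(tpr - fpr)]              # Youden's J optimum
print(cutoff)
```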

  1. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study

    Directory of Open Access Journals (Sweden)

    Tania Dehesh

    2015-01-01

    Full Text Available Background. The univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least squares (MGLS) method as a multivariate meta-analysis approach. Methods. We evaluated the efficiency of four new approaches, including zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC), on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard model coefficients in a simulation study. Results. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared to the EC, CC, and ZC procedures. The precision ranking of the four approaches according to all the above settings was MMC ≥ EC ≥ CC ≥ ZC. Conclusion. This study highlights the advantages of MGLS meta-analysis over the UM approach. The results suggest the use of the MMC procedure to overcome the lack of information for having a complete covariance matrix of the coefficients.

  2. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study.

    Science.gov (United States)

    Dehesh, Tania; Zare, Najaf; Ayatollahi, Seyyed Mohammad Taghi

    2015-01-01

    The univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least squares (MGLS) method as a multivariate meta-analysis approach. We evaluated the efficiency of the four new approaches, including zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC), on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard model coefficients in a simulation study. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared to the EC, CC, and ZC procedures. The precision ranking of the four approaches according to all the above settings was MMC ≥ EC ≥ CC ≥ ZC. This study highlights the advantages of MGLS meta-analysis over the UM approach. The results suggest the use of the MMC procedure to overcome the lack of information for having a complete covariance matrix of the coefficients.

  3. Growth curves of preschool children in the northeast of Iran: a population based study using quantile regression approach.

    Science.gov (United States)

    Payande, Abolfazl; Tabesh, Hamed; Shakeri, Mohammad Taghi; Saki, Azadeh; Safarian, Mohammad

    2013-01-14

    Growth charts are widely used to assess children's growth status and can provide a trajectory of growth during the early, important months of life. The objectives of this study are to construct growth charts and normal values of weight-for-age for children aged 0 to 5 years using a powerful and applicable methodology, and to compare the results with the World Health Organization (WHO) references and the semi-parametric LMS method of Cole and Green. A total of 70737 apparently healthy boys and girls aged 0 to 5 years were recruited in July 2004, over 20 days, from those attending community clinics for routine health checks as part of a national survey. Anthropometric measurements were made by trained health staff using WHO methodology. A nonparametric quantile regression method, with conditional quantile curves obtained by local constant kernel estimation, was used for the estimation of the curves and normal values. The weight-for-age growth curves for boys and girls aged 0 to 5 years were derived from a population of children living in the northeast of Iran. The results were similar to those obtained by the semi-parametric LMS method on the same data. In all age groups from 0 to 5 years, the median weights of children living in the northeast of Iran were lower than the corresponding values in the WHO reference data. The weight curves of boys were higher than those of girls in all age groups. The differences between the growth patterns of children living in the northeast of Iran and international ones necessitate the use of local and regional growth charts. International normal values may not properly identify the populations at risk for growth problems among Iranian children. Quantile regression (QR), a flexible method that does not require restrictive assumptions, is proposed for the estimation of reference curves and normal values.

  4. Financial performance monitoring of the technical efficiency of critical access hospitals: a data envelopment analysis and logistic regression modeling approach.

    Science.gov (United States)

    Wilson, Asa B; Kerr, Bernard J; Bastian, Nathaniel D; Fulton, Lawrence V

    2012-01-01

    From 1980 to 1999, rural designated hospitals closed at a disproportionately high rate. In response to this emergent threat to healthcare access in rural settings, the Balanced Budget Act of 1997 made provisions for the creation of a new rural hospital type, the critical access hospital (CAH). The conversion to CAH and the associated cost-based reimbursement scheme significantly slowed the closure rate of rural hospitals. This work investigates which methods can ensure the long-term viability of small hospitals. This article uses a two-step design to focus on a hypothesized relationship between the technical efficiency of CAHs and a recently developed set of financial monitors for these entities. The goal is to identify the financial performance measures associated with efficiency. The first step uses data envelopment analysis (DEA) to differentiate efficient from inefficient facilities within a data set of 183 CAHs. The DEA result provides an a priori categorization of hospitals in the data set as efficient or inefficient. In the second step, DEA efficiency is the categorical dependent variable (efficient = 0, inefficient = 1) in a binary logistic regression (LR) model. A set of six financial monitors selected from the array of 20 measures served as the LR independent variables. We use the binary LR to test the null hypothesis that recently developed CAH financial indicators have no predictive value for categorizing a CAH as efficient or inefficient (i.e., that there is no relationship between DEA efficiency and fiscal performance).

  5. Disease Mapping and Regression with Count Data in the Presence of Overdispersion and Spatial Autocorrelation: A Bayesian Model Averaging Approach

    Science.gov (United States)

    Mohebbi, Mohammadreza; Wolfe, Rory; Forbes, Andrew

    2014-01-01

    This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but care is required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from three probability distributions for count data (Poisson, generalised Poisson and negative binomial) combined with three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs-sampling-based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of the generalised Poisson and negative binomial distributions with spatial autocorrelation work well and provide a robust basis for inference. PMID:24413702

  6. Relative accuracy of spatial predictive models for lynx Lynx canadensis derived using logistic regression-AIC, multiple criteria evaluation and Bayesian approaches

    Directory of Open Access Journals (Sweden)

    Shelley M. ALEXANDER

    2009-02-01

    Full Text Available We compared probability surfaces derived using one set of environmental variables in three Geographic Information Systems (GIS)-based approaches: logistic regression with Akaike's Information Criterion (AIC), Multiple Criteria Evaluation (MCE), and Bayesian analysis (specifically Dempster-Shafer theory). We used lynx Lynx canadensis as our focal species, and developed our environment relationship model using track data collected in Banff National Park, Alberta, Canada, during winters from 1997 to 2000. The accuracy of the three spatial models was compared using a contingency table method. We determined the percentage of cases in which both presence and absence points were correctly classified (overall accuracy), the failure to predict a species where it occurred (omission error) and the prediction of presence where there was absence (commission error). Our overall accuracy showed the logistic regression approach was the most accurate (74.51%). The multiple criteria evaluation was intermediate (39.22%), while the Dempster-Shafer (D-S) theory model was the poorest (29.90%). However, omission and commission errors tell a different story: logistic regression had the lowest commission error, while D-S theory produced the lowest omission error. Our results provide evidence that habitat modellers should evaluate all three error measures when ascribing confidence in their model. We suggest that for our study area at least, the logistic regression model is optimal. However, where sample size is small or the species is very rare, it may also be useful to explore and/or use a more ecologically cautious modelling approach (e.g., Dempster-Shafer) that would over-predict, protect more sites, and thereby minimize the risk of missing critical habitat in conservation plans. [Current Zoology 55(1): 28-40, 2009]
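
    A minimal sketch of the accuracy bookkeeping used to compare the three models (overall accuracy, omission error for missed presences, commission error for false presences) from a presence/absence contingency table:

```python
import numpy as np

def accuracy_measures(observed, predicted):
    obs, pred = np.asarray(observed, bool), np.asarray(predicted, bool)
    tp = np.sum(obs & pred)            # presences predicted present
    tn = np.sum(~obs & ~pred)          # absences predicted absent
    fn = np.sum(obs & ~pred)           # presences predicted absent (omission)
    fp = np.sum(~obs & pred)           # absences predicted present (commission)
    return {
        "overall": (tp + tn) / obs.size,
        "omission": fn / max(tp + fn, 1),
        "commission": fp / max(tp + fp, 1),
    }

print(accuracy_measures([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))
```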

  7. Optimization of classification and regression analysis of four monoclonal antibodies from Raman spectra using collaborative machine learning approach.

    Science.gov (United States)

    Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric

    2018-07-01

    The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. Analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to a conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP). This allowed solutions from about 300 data scientists to be leveraged in collaborative work. Using machine learning, the prediction of the four mAb samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% using the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three spectra were misclassified among the 429 spectra of the test set. This large improvement obtained with machine learning techniques was uniform for all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined errors (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Understanding Poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
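
    A hedged sketch of the article's core advice on synthetic counts (the ENSPIRE data are not reproduced): fit a Poisson GLM, check the Pearson dispersion, and switch to a negative binomial model when the variance clearly exceeds the mean.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(12)
n = 400
x = rng.standard_normal(n)
mu = np.exp(0.5 + 0.7 * x)
counts = rng.negative_binomial(n=2, p=2 / (2 + mu))    # overdispersed counts, mean mu

X = sm.add_constant(x)
pois = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
dispersion = pois.pearson_chi2 / pois.df_resid          # >> 1 signals overdispersion
nb = sm.GLM(counts, X, family=sm.families.NegativeBinomial()).fit()
print(dispersion, pois.aic, nb.aic)                     # NB should fit better here
```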

  9. Representing electrons a biographical approach to theoretical entities

    CERN Document Server

    Arabatzis, Theodore

    2006-01-01

    Both a history and a metahistory, Representing Electrons focuses on the development of various theoretical representations of electrons from the late 1890s to 1925 and the methodological problems associated with writing about unobservable scientific entities. Using the electron, or rather its representation, as a historical actor, Theodore Arabatzis illustrates the emergence and gradual consolidation of its representation in physics, its career throughout old quantum theory, and its appropriation and reinterpretation by chemists. As Arabatzis develops this novel biographical

  10. A Forecasting Approach Combining Self-Organizing Map with Support Vector Regression for Reservoir Inflow during Typhoon Periods

    Directory of Open Access Journals (Sweden)

    Gwo-Fong Lin

    2016-01-01

    Full Text Available This study describes the development of a reservoir inflow forecasting model for typhoon events to improve short lead-time flood forecasting performance. To strengthen the forecasting ability of the original support vector machine (SVM) model, the self-organizing map (SOM) is adopted to group inputs into different clusters in advance in the proposed SOM-SVM model. Two different input methods are proposed for the SVM-based forecasting method, namely, SOM-SVM1 and SOM-SVM2. The methods are applied to an actual reservoir watershed to determine the 1 to 3 h ahead inflow forecasts. For 1, 2, and 3 h ahead forecasts, the improvements in mean coefficient of efficiency (MCE) due to the clusters obtained from SOM-SVM1 are 21.5%, 18.5%, and 23.0%, respectively. Furthermore, the improvements in MCE for SOM-SVM2 are 20.9%, 21.2%, and 35.4%, respectively. Relative to SOM-SVM1, the SOM-SVM2 model yields additional improvements of 0.33%, 2.25%, and 10.08% for the 1, 2, and 3 h ahead forecasts, respectively. These results show that the proposed model can provide improved forecasts of hourly inflow, especially the SOM-SVM2 variant. In conclusion, the proposed model, which considers a limited set of highly related inputs instead of all inputs, can generate better forecasts within the different clusters produced by the SOM process. The SOM-SVM2 model is recommended as an alternative to the original SVR (Support Vector Regression) model because of its accuracy and robustness.
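
    A hedged sketch of the cluster-then-regress idea on synthetic data; KMeans stands in for the SOM (plainly a substitution to keep to standard scikit-learn) and SVR for the SVM regressor: cluster the inputs, fit one regressor per cluster, and route new inputs to the matching cluster's model.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

rng = np.random.default_rng(14)
X = rng.random((500, 6))            # e.g. lagged rainfall and inflow features
y = 2 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.standard_normal(500)

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
models = {c: SVR(C=10.0).fit(X[clusters.labels_ == c], y[clusters.labels_ == c])
          for c in range(4)}        # one regressor per cluster

x_new = rng.random((1, 6))
c_new = clusters.predict(x_new)[0]
print(models[c_new].predict(x_new))  # forecast from the matching cluster's model
```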

  11. Investigation of the marked and long-standing spatial inhomogeneity of the Hungarian suicide rate: a spatial regression approach.

    Science.gov (United States)

    Balint, Lajos; Dome, Peter; Daroczi, Gergely; Gonda, Xenia; Rihmer, Zoltan

    2014-02-01

    In the last century Hungary had astonishingly high suicide rates, characterized by marked regional within-country inequalities, a spatial pattern which has been quite stable over time. Our aim was to explain this phenomenon at the level of micro-regions (n=175) in the period between 2005 and 2011. Our dependent variable was the age- and gender-standardized mortality ratio (SMR) for suicide, while the explanatory variables were factors that are supposed to influence suicide risk, such as measures of religious and political integration, travel-time accessibility of psychiatric services, alcohol consumption, unemployment and disability pensionery. When the ordinary least squares regression model was applied, the residuals were found to be spatially autocorrelated, which indicates a violation of the assumption of independent error terms and, accordingly, the necessity of applying a spatial autoregressive (SAR) model to handle this problem. According to our calculations the SARlag model addressed the problem of spatial autocorrelation better than the SARerr model, and its substantive meaning is also more convenient. SMR was significantly associated with the "political integration" variable in a negative manner and with the "lack of religious integration" and "disability pensionery" variables in a positive manner. Associations were not significant for the remaining explanatory variables. Several important psychiatric variables were not available at the level of micro-regions, and we conducted our analysis on aggregate data. Our results may draw attention to the relevance and abiding validity of the classic Durkheimian suicide risk factors, such as lack of social integration, apropos of the spatial pattern of Hungarian suicides. © 2013 Published by Elsevier B.V.
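
    A hedged sketch of a maximum-likelihood spatial lag (SAR-lag) model, assuming the PySAL stack (geopandas, libpysal, spreg); the file and column names are placeholders for the micro-region data described above, not the authors' actual dataset.

```python
import geopandas as gpd
from libpysal.weights import Queen
import spreg

gdf = gpd.read_file("microregions.shp")              # hypothetical shapefile
w = Queen.from_dataframe(gdf)                        # contiguity-based weights
w.transform = "r"                                    # row-standardize

y = gdf[["smr_suicide"]].to_numpy(dtype=float)       # standardized mortality ratio
X = gdf[["religious_integration", "political_integration",
         "disability_pension"]].to_numpy(dtype=float)

model = spreg.ML_Lag(y, X, w=w, name_y="SMR",
                     name_x=["religion", "politics", "pension"])
print(model.summary)                                 # rho is the spatial lag coefficient
```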

  12. Data-driven approach for creating synthetic electronic medical records.

    Science.gov (United States)

    Buczak, Anna L; Babin, Steven; Moniz, Linda

    2010-10-14

    New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases. The pilot synthetic background records were in the 4-11 yr age group.

  13. An alternative approach to the determination of scaling law expressions for the L–H transition in Tokamaks utilizing classification tools instead of regression

    International Nuclear Information System (INIS)

    Gaudio, P; Gelfusa, M; Lupelli, I; Murari, A; Vega, J

    2014-01-01

    A new approach to determine the power law expressions for the threshold between the H and L modes of confinement is presented. The method is based on two powerful machine learning tools for classification: neural networks and support vector machines. Using as inputs clear examples of the systems on either side of the transition, the machine learning tools learn the input–output mapping corresponding to the equations of the boundary separating the confinement regimes. Systematic tests with synthetic data show that the machine learning tools provide results competitive with traditional statistical regression and more robust against random noise and systematic errors. The developed tools have then been applied to the multi-machine International Tokamak Physics Activity International Global Threshold Database of validated ITER-like Tokamak discharges. The machine learning tools converge on the same scaling law parameters obtained with non-linear regression. On the other hand, the developed tools reduce the uncertainty in the extrapolations to ITER by 50%. The proposed approach can therefore effectively complement traditional regression, since it poses much less stringent requirements on the experimental data used to determine the scaling laws: it does not require examples taken exactly at the moment of the transition. (paper)
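
    The core trick is that a linear classifier in log-transformed variables is itself a power law. A hedged scikit-learn sketch (the variables, exponents and synthetic labels below are invented for illustration, not taken from the database):

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        n = 10 ** rng.uniform(-0.5, 0.5, 400)   # density-like quantity (arbitrary units)
        B = 10 ** rng.uniform(-0.3, 0.3, 400)   # field-like quantity
        P = 10 ** rng.uniform(-1.0, 1.0, 400)   # heating power
        P_thr = 0.8 * n**0.7 * B**0.8           # "true" boundary used to label the points
        labels = (P > P_thr).astype(int)        # 1 = H-mode side, 0 = L-mode side

        X = np.log10(np.column_stack([n, B, P]))
        clf = SVC(kernel="linear", C=100.0).fit(X, labels)

        # The separating hyperplane w . log10([n, B, P]) + b = 0 is a power law:
        w, b = clf.coef_[0], clf.intercept_[0]
        print("P_thr ~ %.2f * n^%.2f * B^%.2f"
              % (10 ** (-b / w[2]), -w[0] / w[2], -w[1] / w[2]))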

  14. A Two-Step Approach for Analysis of Nonignorable Missing Outcomes in Longitudinal Regression: an Application to Upstate KIDS Study.

    Science.gov (United States)

    Liu, Danping; Yeung, Edwina H; McLain, Alexander C; Xie, Yunlong; Buck Louis, Germaine M; Sundaram, Rajeshwari

    2017-09-01

    Imperfect follow-up in longitudinal studies commonly leads to missing outcome data that can potentially bias the inference when the missingness is nonignorable; that is, the propensity of missingness depends on missing values in the data. In the Upstate KIDS Study, we seek to determine whether the missingness of child development outcomes is nonignorable, and how a simple model assuming ignorable missingness would compare with more complicated models for a nonignorable mechanism. To correct for nonignorable missingness, the shared random effects model (SREM) jointly models the outcome and the missing mechanism. However, its computational complexity and the lack of software packages have limited its practical application. This paper proposes a novel two-step approach to handle nonignorable missing outcomes in generalized linear mixed models. We first analyse the missing mechanism with a generalized linear mixed model and predict values of the random effects; then, the outcome model is fitted adjusting for the predicted random effects to account for heterogeneity in the missingness propensity. Extensive simulation studies suggest that the proposed method is a reliable approximation to SREM, with much faster computation. The nonignorability of missing data in the Upstate KIDS Study is estimated to be mild to moderate, and the analyses using the two-step approach or SREM are similar to those of the model assuming ignorable missingness. The two-step approach is a computationally straightforward method that can be used in sensitivity analyses in longitudinal studies to examine violations of the ignorable-missingness assumption and their implications for health outcomes. © 2017 John Wiley & Sons Ltd.
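
    A hedged two-step sketch with statsmodels. For brevity the binary missingness indicator is fitted with a Gaussian (linear-probability) mixed model, whereas the paper uses a generalized linear mixed model; the variable names and simulated data are illustrative only:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n_subj, n_vis = 100, 4
        ids = np.repeat(np.arange(n_subj), n_vis)
        visit = np.tile(np.arange(n_vis), n_subj)
        u = rng.standard_normal(n_subj)          # shared subject-level random effect
        df = pd.DataFrame({
            "id": ids,
            "visit": visit,
            "y": 1.0 + 0.5 * visit + u[ids] + rng.standard_normal(ids.size),
            "miss": (rng.random(ids.size) < 1 / (1 + np.exp(-(u[ids] - 1)))).astype(float),
        })

        # Step 1: model the missingness indicator with a mixed model and pull out
        # the predicted subject-level random effects.
        m1 = smf.mixedlm("miss ~ visit", df, groups=df["id"]).fit()
        u_hat = {k: v.iloc[0] for k, v in m1.random_effects.items()}
        df["u_hat"] = df["id"].map(u_hat)

        # Step 2: fit the outcome model on the observed rows, adjusting for u_hat.
        obs = df[df["miss"] == 0]
        m2 = smf.mixedlm("y ~ visit + u_hat", obs, groups=obs["id"]).fit()
        print(m2.params)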

  15. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Full Text Available Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
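
    A bare-bones sketch of component-wise gradient boosting for the mean of a beta regression (logit link, precision held fixed) in NumPy; the reference implementation is the R gamboostLSS/betaboost toolchain, so this version is only meant to show the mechanics:

        import numpy as np
        from scipy.special import digamma, expit

        rng = np.random.default_rng(4)
        n, p, phi, nu = 300, 5, 20.0, 0.1      # phi: precision, nu: learning rate
        X = rng.standard_normal((n, p))
        mu_true = expit(1.0 * X[:, 0] - 0.8 * X[:, 2])
        y = rng.beta(mu_true * phi, (1 - mu_true) * phi)

        eta = np.zeros(n)                      # boosted predictor on the logit scale
        coef = np.zeros(p)
        for _ in range(300):
            mu = expit(eta)
            # negative gradient of the beta negative log-likelihood w.r.t. eta
            g = -phi * (digamma(mu * phi) - digamma((1 - mu) * phi)
                        - np.log(y) + np.log1p(-y)) * mu * (1 - mu)
            # component-wise base learners: one least-squares slope per covariate,
            # keep only the best-fitting component at each iteration
            best = None
            for j in range(p):
                b = X[:, j] @ g / (X[:, j] @ X[:, j])
                sse = np.sum((g - b * X[:, j]) ** 2)
                if best is None or sse < best[0]:
                    best = (sse, j, b)
            _, j, b = best
            eta += nu * b * X[:, j]
            coef[j] += nu * b
        print(np.round(coef, 2))               # only informative covariates get selected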

  16. Compact femtosecond electron diffractometer with 100 keV electron bunches approaching the single-electron pulse duration limit

    International Nuclear Information System (INIS)

    Waldecker, Lutz; Bertoni, Roman; Ernstorfer, Ralph

    2015-01-01

    We present the design and implementation of a highly compact femtosecond electron diffractometer working at electron energies up to 100 keV. We use a multi-body particle tracing code to simulate electron bunch propagation through the setup and to calculate pulse durations at the sample position. Our simulations show that electron bunches containing a few thousand electrons per bunch are only weakly broadened by space-charge effects, and their pulse duration is thus close to that of a single-electron wave packet. With our compact setup, we can create electron bunches containing up to 5000 electrons with a pulse duration below 100 fs on the sample. We use the diffractometer to track the energy transfer from photoexcited electrons to the lattice in a thin film of titanium. This process takes place on the timescale of a few hundred femtoseconds, and a fully equilibrated state is reached within 1 ps.

  17. Isokinetic knee strength qualities as predictors of jumping performance in high-level volleyball athletes: multiple regression approach.

    Science.gov (United States)

    Sattler, Tine; Sekulic, Damir; Spasic, Miodrag; Osmankac, Nedzad; Vicente João, Paulo; Dervisevic, Edvin; Hadzic, Vedran

    2016-01-01

    Previous investigations noted the potential importance of isokinetic strength in rapid muscular performances, such as jumping. This study aimed to identify the influence of isokinetic knee strength on specific jumping performance in volleyball. The secondary aim of the study was to evaluate the reliability and validity of two volleyball-specific jumping tests. The sample comprised 67 female (21.96±3.79 years; 68.26±8.52 kg; 174.43±6.85 cm) and 99 male (23.62±5.27 years; 84.83±10.37 kg; 189.01±7.21 cm) high-level volleyball players who competed in the 1st and 2nd National Division. Subjects were randomly divided into validation (N.=55 and 33 for males and females, respectively) and cross-validation subsamples (N.=54 and 34 for males and females, respectively). The set of predictors included isokinetic tests to evaluate the eccentric and concentric strength capacities of the knee extensors and flexors for the dominant and non-dominant leg. The main outcome measure for the isokinetic testing was peak torque (PT), which was later normalized for body mass and expressed as PT/kg. Block-jump and spike-jump performances were measured over three trials and observed as criteria. Forward stepwise multiple regressions were calculated for the validation subsamples and then cross-validated. Cross-validation included correlations and t-test differences between observed and predicted scores, and Bland-Altman plots. The jumping tests were found to be reliable (spike jump: ICC of 0.79 and 0.86; block jump: ICC of 0.86 and 0.90, for males and females, respectively), and their validity was confirmed by significant t-test differences between 1st vs. 2nd division players. Isokinetic variables were found to be significant predictors of jumping performance in females, but not among males. In females, the isokinetic knee measures were shown to be stronger and more valid predictors of the block-jump (42% and 64% of the explained variance for the validation and cross-validation subsamples, respectively).
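
    A sketch of the validation/cross-validation logic with scikit-learn and SciPy; the simulated torques and jump heights below stand in for the real measurements:

        import numpy as np
        from scipy import stats
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(5)
        n, p = 99, 8
        X = rng.standard_normal((n, p))           # normalized isokinetic peak torques (PT/kg)
        y = 0.6 * X[:, 0] + 0.4 * X[:, 3] + 0.3 * rng.standard_normal(n)   # jump height

        X_val, X_cv, y_val, y_cv = train_test_split(X, y, test_size=0.5, random_state=0)

        # forward stepwise selection and fitting on the validation subsample
        sfs = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2,
                                        direction="forward").fit(X_val, y_val)
        model = LinearRegression().fit(sfs.transform(X_val), y_val)

        # cross-validation: correlate and t-test observed vs. predicted scores
        pred = model.predict(sfs.transform(X_cv))
        r, _ = stats.pearsonr(y_cv, pred)
        t, p_val = stats.ttest_rel(y_cv, pred)
        print(f"r={r:.2f}, R^2={r**2:.2f}, paired t p={p_val:.2f}")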

  18. Output-Only Modal Parameter Recursive Estimation of Time-Varying Structures via a Kernel Ridge Regression FS-TARMA Approach

    Directory of Open Access Journals (Sweden)

    Zhi-Sai Ma

    2017-01-01

    Full Text Available Modal parameter estimation plays an important role in vibration-based damage detection and is worth more attention and investigation, as changes in modal parameters are usually used as damage indicators. This paper focuses on the problem of output-only modal parameter recursive estimation of time-varying structures based upon parameterized representations of the time-dependent autoregressive moving average (TARMA) model. A kernel ridge regression functional series TARMA (FS-TARMA) recursive identification scheme is proposed and subsequently employed for the modal parameter estimation of a numerical three-degree-of-freedom time-varying structural system and a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudolinear regression FS-TARMA approach via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics in a recursive manner.
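
    The functional-series idea can be shown in a few lines: the time-varying AR coefficients are expanded on a basis of known functions of time, and the expansion weights are estimated by regression. This Python sketch drops the moving-average part and replaces the paper's recursive kernel scheme with scikit-learn's batch Ridge for brevity:

        import numpy as np
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(6)
        N = 2000
        t = np.arange(N) / N
        a1 = 1.2 + 0.3 * np.sin(2 * np.pi * t)   # slowly varying AR coefficients
        a2 = -0.6 * np.ones(N)
        y, e = np.zeros(N), 0.1 * rng.standard_normal(N)
        for k in range(2, N):
            y[k] = a1[k] * y[k - 1] + a2[k] * y[k - 2] + e[k]

        # functional-series expansion: project the TV coefficients on basis G_j(t)
        G = np.column_stack([np.ones(N), np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        Phi = np.hstack([G[2:] * y[1:-1, None], G[2:] * y[:-2, None]])
        fit = Ridge(alpha=1e-3).fit(Phi, y[2:])

        c = fit.coef_.reshape(2, -1)             # basis weights for a1(t) and a2(t)
        a1_hat = G @ c[0]
        print(np.max(np.abs(a1_hat - a1)))       # tracking error of the a1 trajectory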

  19. Evaluating Electronic Reference Services: Issues, Approaches and Criteria.

    Science.gov (United States)

    Novotny, Eric

    2001-01-01

    Discussion of electronic library reference services focuses on an overview of the chief methodologies available for conducting assessments of electronic services. Highlights include quantitative measures and benchmarks, including equity and access; quality measures; behavioral aspects of quality, including librarian-patron interaction; and future…

  20. Electron correlations in narrow energy bands: modified polar model approach

    Directory of Open Access Journals (Sweden)

    L. Didukh

    2008-09-01

    Full Text Available The electron correlations in narrow energy bands are examined within the framework of a modified form of the polar model. This model permits an analysis of the effects of strong Coulomb correlation, inter-atomic exchange and correlated hopping of electrons, and explains some peculiarities of the properties of narrow-band materials, namely the metal-insulator transition with an increase of temperature, the nonlinear concentration dependence of the Curie temperature and peculiarities of the transport properties of the electronic subsystem. Using a variant of the generalized Hartree-Fock approximation, the single-electron Green's function and quasi-particle energy spectrum of the model are calculated. The metal-insulator transition with the change of temperature is investigated in a system with correlated hopping. Processes of ferromagnetic ordering stabilization in systems with various forms of electronic DOS are studied. The static conductivity and effective spin-dependent masses of current carriers are calculated as functions of electron concentration for various DOS forms. The correlated hopping is shown to cause the electron-hole asymmetry of the transport and ferromagnetic properties of narrow-band materials.

  1. Data-driven approach for creating synthetic electronic medical records

    Directory of Open Access Journals (Sweden)

    Moniz Linda

    2010-10-01

    Full Text Available Abstract Background New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. Methods This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. Results We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. Conclusions A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases.

  2. A novel approach for honey pollen profile assessment using an electronic tongue and chemometric tools

    International Nuclear Information System (INIS)

    Dias, Luís G.; Veloso, Ana C.A.; Sousa, Mara E.B.C.; Estevinho, Letícia; Machado, Adélio A.S.C.

    2015-01-01

    Nowadays the main honey producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and high-skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucaliptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensors' sub-sets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10–20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. Also, the proposed technique enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency may envisage its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to the traditional melissopalynologic analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation. - Highlights: • Honey's floral origin labeling is a legal requirement. • Melissopalynology analysis usually used to evaluate pollens profile is laborious. • A novel E-tongue based approach is applied to assess pollens' relative percentages.
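
    A schematic Python version of the sensor-subset search: simulated annealing over inclusion masks, scored by the repeated K-fold cross-validated R² of a multiple linear regression (scikit-learn assumed; the sensor data are simulated):

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import RepeatedKFold, cross_val_score

        rng = np.random.default_rng(7)
        X = rng.standard_normal((120, 20))                    # e-tongue sensor signals
        y = X[:, 2] - 0.7 * X[:, 11] + 0.1 * rng.standard_normal(120)  # pollen percentage

        cv = RepeatedKFold(n_splits=5, n_repeats=4, random_state=0)

        def cv_r2(mask):
            if not mask.any():
                return -np.inf
            return cross_val_score(LinearRegression(), X[:, mask], y,
                                   cv=cv, scoring="r2").mean()

        mask = rng.random(20) < 0.5                           # random initial sensor subset
        cur = cv_r2(mask)
        best, best_score, T = mask.copy(), cur, 1.0
        for step in range(300):
            cand = mask.copy()
            cand[rng.integers(20)] ^= True                    # flip one sensor in/out
            s = cv_r2(cand)
            if s > cur or rng.random() < np.exp((s - cur) / T):
                mask, cur = cand, s
                if s > best_score:
                    best, best_score = cand.copy(), s
            T *= 0.99                                         # cool the annealing schedule
        print(best_score, np.flatnonzero(best))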

  3. A novel approach for honey pollen profile assessment using an electronic tongue and chemometric tools

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Luís G., E-mail: ldias@ipb.pt [Escola Superior Agrária, Instituto Politécnico de Bragança, Campus Santa Apolónia, 5301-855 Bragança (Portugal); CQ-VR, Centro de Química – Vila Real, University of Trás-os-Montes e Alto Douro, Apartado 1013, 5001-801 Vila Real (Portugal); Veloso, Ana C.A. [Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); CEB-Centre of Biological Engineering, University of Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Sousa, Mara E.B.C.; Estevinho, Letícia [CIMO-Escola Superior Agrária, Instituto Politécnico de Bragança, Campus Santa Apolónia, 5301-855 Bragança (Portugal); Machado, Adélio A.S.C. [LAQUIPAI – Laboratório de Química Inorgânica Pura e de Aplicação Interdisciplinar, Departamento de Química, Faculdade de Ciências da, Universidade do Porto, Rua Campo Alegre n°. 687, 4169-007 Porto (Portugal); and others

    2015-11-05

    Nowadays the main honey producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and high-skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucaliptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensors' sub-sets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10–20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. Also, the proposed technique enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency may envisage its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to the traditional melissopalynologic analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation. - Highlights: • Honey's floral origin labeling is a legal requirement. • Melissopalynology analysis usually used to evaluate pollens profile is laborious. • A novel E-tongue based approach is applied to assess pollens' relative percentages.

  4. Linear algebraic approach to electron-molecule collisions

    International Nuclear Information System (INIS)

    Schneider, B.I.; Collins, L.A.

    1983-01-01

    The various levels of sophistication of the linear algebraic method are discussed and its application to electron-molecule collisions for H2, N2, LiH, LiF and HCl is described. 13 references, 2 tables

  5. Electronic Publishing Approaches to Curriculum: Videotex, Teletext and Databases.

    Science.gov (United States)

    Aumente, Jerome

    1986-01-01

    Describes the Journalism Resources Institute (JRI) of Rutgers University in terms of its administrative organization, computer resources, computer facilities use, involvement in electronic publishing, use of the Dow Jones News/Retrieval Database, curricular options, and professional continuing education. (AYC)

  6. An iteratively reweighted least-squares approach to adaptive robust adjustment of parameters in linear regression models with autoregressive and t-distributed deviations

    Science.gov (United States)

    Kargoll, Boris; Omidalizarandi, Mohammad; Loth, Ina; Paffenholz, Jens-André; Alkhatib, Hamza

    2018-03-01

    In this paper, we investigate a linear regression time series model of possibly outlier-afflicted observations and autocorrelated random deviations. This colored noise is represented by a covariance-stationary autoregressive (AR) process, in which the independent error components follow a scaled (Student's) t-distribution. This error model allows for the stochastic modeling of multiple outliers and for an adaptive robust maximum likelihood (ML) estimation of the unknown regression and AR coefficients, the scale parameter, and the degree of freedom of the t-distribution. This approach is meant to be an extension of known estimators, which tend to focus only on the regression model, or on the AR error model, or on normally distributed errors. For the purpose of ML estimation, we derive an expectation conditional maximization either (ECME) algorithm, which leads to an easy-to-implement version of iteratively reweighted least squares. The estimation performance of the algorithm is evaluated via Monte Carlo simulations for a Fourier as well as a spline model in connection with AR colored noise models of different orders and with three different sampling distributions generating the white noise components. We apply the algorithm to a vibration dataset recorded by a high-accuracy, single-axis accelerometer, focusing on the evaluation of the estimated AR colored noise model.
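
    The reweighting idea reduces to a short loop: the E-step turns the current residuals into t-distribution weights, the M-step solves a weighted least-squares problem. A NumPy sketch for the i.i.d. special case (degrees of freedom held fixed and the AR decorrelation step omitted — both are handled adaptively in the paper):

        import numpy as np

        rng = np.random.default_rng(8)
        n = 500
        t = np.linspace(0, 1, n)
        A = np.column_stack([np.ones(n), np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        beta_true = np.array([0.2, 1.0, -0.5])
        y = A @ beta_true + 0.1 * rng.standard_t(df=3, size=n)   # heavy-tailed noise

        beta = np.linalg.lstsq(A, y, rcond=None)[0]
        nu, sigma2 = 3.0, 0.01                   # d.o.f. and scale; nu held fixed here
        for _ in range(50):
            r = y - A @ beta
            w = (nu + 1) / (nu + r**2 / sigma2)  # E-step: t-distribution weights
            Aw = A * w[:, None]
            beta = np.linalg.solve(A.T @ Aw, Aw.T @ y)   # M-step: weighted least squares
            r = y - A @ beta
            sigma2 = np.sum(w * r**2) / n        # update the scale parameter
        print(beta)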

  7. Parametric optimization of multiple quality characteristics in laser cutting of Inconel-718 by using hybrid approach of multiple regression analysis and genetic algorithm

    Science.gov (United States)

    Shrivastava, Prashant Kumar; Pandey, Arun Kumar

    2018-06-01

    Inconel-718 is in high demand in different industries due to its superior mechanical properties. Traditional cutting methods face difficulties in cutting this alloy due to its low thermal potential, lower elasticity and high chemical compatibility at elevated temperature. Traditional machining also struggles with the machining and/or finishing of unusual shapes and/or sizes in these materials. Laser beam cutting may be applied for miniaturization and ultra-precision cutting and/or finishing by appropriate control of the different process parameters. This paper presents the multi-objective optimization of the kerf deviation, kerf width and kerf taper in the laser cutting of Inconel-718 sheet. Second-order regression models have been developed for the different quality characteristics using the data obtained through experimentation. The regression models have been used as objective functions for multi-objective optimization based on the hybrid approach of multiple regression analysis and genetic algorithm. The comparison of optimization results to experimental results shows improvements of 88%, 10.63% and 42.15% in kerf deviation, kerf width and kerf taper, respectively. Finally, the effects of the different process parameters on the quality characteristics are also discussed.
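
    A condensed sketch of the hybrid idea in Python: fit a second-order response-surface model to the experiments, then minimize the fitted response with an evolutionary optimizer. SciPy's differential evolution stands in for the genetic algorithm, and a single objective replaces the paper's multi-objective formulation:

        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures

        rng = np.random.default_rng(9)
        # columns: laser power, cutting speed, gas pressure (coded units, simulated)
        X = rng.uniform(-1, 1, (30, 3))
        kerf_taper = (0.5 + 0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.4 * X[:, 0]**2
                      + 0.05 * rng.standard_normal(30))

        # second-order (quadratic) response-surface model fitted to the experiments
        model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        model.fit(X, kerf_taper)

        # evolutionary minimization of the fitted response over the parameter box
        res = differential_evolution(lambda x: model.predict(x.reshape(1, -1))[0],
                                     bounds=[(-1, 1)] * 3, seed=0)
        print(res.x, res.fun)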

  8. Collaborative regression.

    Science.gov (United States)

    Gross, Samuel M; Tibshirani, Robert

    2015-04-01

    We consider the scenario where one observes an outcome variable and sets of features from multiple assays, all measured on the same set of samples. One approach that has been proposed for dealing with these types of data is "sparse multiple canonical correlation analysis" (sparse mCCA). All of the current sparse mCCA techniques are biconvex and thus have no guarantees about reaching a global optimum. We propose a method for performing sparse supervised canonical correlation analysis (sparse sCCA), a specific case of sparse mCCA when one of the datasets is a vector. Our proposal for sparse sCCA is convex and thus does not face the same difficulties as the other methods. We derive efficient algorithms for this problem that can be implemented with off-the-shelf solvers, and illustrate their use on simulated and real data. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. A meta-regression analysis of 41 Australian problem gambling prevalence estimates and their relationship to total spending on electronic gaming machines.

    Science.gov (United States)

    Markham, Francis; Young, Martin; Doran, Bruce; Sugden, Mark

    2017-05-23

    Many jurisdictions regularly conduct surveys to estimate the prevalence of problem gambling in their adult populations. However, the comparison of such estimates is problematic due to methodological variations between studies. Total consumption theory suggests that an association between mean electronic gaming machine (EGM) and casino gambling losses and problem gambling prevalence estimates may exist. If this is the case, then changes in EGM losses may be used as a proxy indicator for changes in problem gambling prevalence. To test for this association, this study examines the relationship between aggregated losses on electronic gaming machines (EGMs) and problem gambling prevalence estimates for Australian states and territories between 1994 and 2016. A Bayesian meta-regression analysis of 41 cross-sectional problem gambling prevalence estimates was undertaken using EGM gambling losses, year of survey and methodological variations as predictor variables. General population studies of adults in Australian states and territories published before 1 July 2016 were considered in scope. 41 studies were identified, with a total of 267,367 participants. Problem gambling prevalence, moderate-risk problem gambling prevalence, problem gambling screen, administration mode and frequency threshold were extracted from surveys. Administrative data on EGM and casino gambling losses were extracted from government reports and expressed as the proportion of household disposable income lost. Money lost on EGMs is correlated with problem gambling prevalence. An increase of 1% of household disposable income lost on EGMs and in casinos was associated with problem gambling prevalence estimates that were 1.33 times higher [95% credible interval 1.04, 1.71]. There was no clear association between EGM losses and moderate-risk problem gambling prevalence estimates. Moderate-risk problem gambling prevalence estimates were not explained by the models (I² ≥ 0.97; R² ≤ 0.01).
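
    A hedged PyMC sketch of such a meta-regression on simulated surveys (logit link, study-level random effects; the priors, sample sizes and data-generating numbers are invented for illustration, not taken from the study):

        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(10)
        k = 41                                       # prevalence surveys
        egm_loss = rng.uniform(0.5, 3.5, k)          # % of disposable income lost
        year = rng.uniform(1994, 2016, k)
        true_log_prev = -4.5 + np.log(1.33) * egm_loss
        n_resp = np.full(k, 6500)
        cases = rng.binomial(n_resp, np.exp(true_log_prev))

        with pm.Model():
            alpha = pm.Normal("alpha", 0, 5)
            beta_egm = pm.Normal("beta_egm", 0, 1)
            beta_year = pm.Normal("beta_year", 0, 1)
            study = pm.Normal("study", 0, 0.3, shape=k)   # between-study heterogeneity
            logit_p = (alpha + beta_egm * egm_loss
                       + beta_year * (year - 2005) + study)
            pm.Binomial("obs", n=n_resp, p=pm.math.invlogit(logit_p), observed=cases)
            idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

        print(np.exp(idata.posterior["beta_egm"].mean()))  # ~ prevalence ratio per 1%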

  10. A meta-regression analysis of 41 Australian problem gambling prevalence estimates and their relationship to total spending on electronic gaming machines

    Directory of Open Access Journals (Sweden)

    Francis Markham

    2017-05-01

    Full Text Available Abstract Background Many jurisdictions regularly conduct surveys to estimate the prevalence of problem gambling in their adult populations. However, the comparison of such estimates is problematic due to methodological variations between studies. Total consumption theory suggests that an association between mean electronic gaming machine (EGM) and casino gambling losses and problem gambling prevalence estimates may exist. If this is the case, then changes in EGM losses may be used as a proxy indicator for changes in problem gambling prevalence. To test for this association, this study examines the relationship between aggregated losses on electronic gaming machines (EGMs) and problem gambling prevalence estimates for Australian states and territories between 1994 and 2016. Methods A Bayesian meta-regression analysis of 41 cross-sectional problem gambling prevalence estimates was undertaken using EGM gambling losses, year of survey and methodological variations as predictor variables. General population studies of adults in Australian states and territories published before 1 July 2016 were considered in scope. 41 studies were identified, with a total of 267,367 participants. Problem gambling prevalence, moderate-risk problem gambling prevalence, problem gambling screen, administration mode and frequency threshold were extracted from surveys. Administrative data on EGM and casino gambling losses were extracted from government reports and expressed as the proportion of household disposable income lost. Results Money lost on EGMs is correlated with problem gambling prevalence. An increase of 1% of household disposable income lost on EGMs and in casinos was associated with problem gambling prevalence estimates that were 1.33 times higher [95% credible interval 1.04, 1.71]. There was no clear association between EGM losses and moderate-risk problem gambling prevalence estimates. Moderate-risk problem gambling prevalence estimates were not explained by the models (I² ≥ 0.97; R² ≤ 0.01).

  11. Nonextensive statistical mechanics approach to electron trapping in degenerate plasmas

    Science.gov (United States)

    Mebrouk, Khireddine; Gougam, Leila Ait; Tribeche, Mouloud

    2016-06-01

    The electron trapping in a weakly nondegenerate plasma is reformulated and re-examined by incorporating the nonextensive entropy prescription. Using the q-deformed Fermi-Dirac distribution function including the quantum as well as the nonextensive statistical effects, we derive a new generalized electron density with a new contribution proportional to the electron temperature T, which may dominate the usual thermal correction (∼T²) at very low temperatures. To make the physics behind the effect of this new contribution more transparent, we analyze the modifications arising in the propagation of ion-acoustic solitary waves. Interestingly, we find that due to the nonextensive correction, our plasma model allows the possibility of existence of quantum ion-acoustic solitons with velocity higher than the Fermi ion-sound velocity. Moreover, as the nonextensive parameter q increases, the critical temperature Tc beyond which coexistence of compressive and rarefactive solitons sets in, is shifted towards higher values.
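
    For orientation, the q-deformed Fermi-Dirac occupation is usually written with the Tsallis q-exponential in place of the Boltzmann factor (sign conventions for 1−q vary between papers, so the paper's exact normalization may differ):

        f_q(E) = \frac{1}{e_q\!\left(\frac{E-\mu}{k_B T}\right) + 1},
        \qquad
        e_q(x) = \left[1 + (1-q)\,x\right]^{\frac{1}{1-q}} \xrightarrow{\; q \to 1 \;} e^{x}.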

  12. Two-particle approach to the electronic structure of solids

    International Nuclear Information System (INIS)

    Gonis, A.

    2007-01-01

    Based on an extension of Hubbard's treatment of the electronic structure of correlated electrons in matter we propose a methodology that incorporates the scattering off the Coulomb interaction through the determination of a two-particle propagator. The Green function equations of motion are then used to obtain single-particle Green functions and related properties such as densities of states. The solutions of the equations of motion in two- and single-particle spaces are accomplished through applications of the coherent potential approximation. The formalism is illustrated by means of calculations for a single-band model system representing a linear arrangement of sites with nearest neighbor hopping and a one-site repulsion when two electrons of opposite spin occupy the same site in the lattice in the manner described by the so-called Hubbard Hamiltonian

  13. A laser printing based approach for printed electronics

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, T.; Hu, M.; Guo, Q.; Zhang, W.; Yang, J., E-mail: jyang@eng.uwo.ca [Department of Mechanical and Materials Engineering, Western University, London N6A 3K7 (Canada); Liu, Y.; Lau, W. [Chengdu Green Energy and Green Manufacturing Technology R&D Center, 355 Tengfei Road, 620107 Chengdu (China); Wang, X. [Department of Mechanical and Materials Engineering, Western University, London N6A 3K7 (Canada); Lanzhou Institute of Chemical Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2016-03-07

    Here we report a study of printing electronics using an office-use laser printer. The proposed method eliminates the critical disadvantages of solvent-based printing techniques by taking advantage of electroless deposition and laser printing. The synthesized toner acts as a catalyst for the electroless copper deposition as well as an adhesion-promoting buffer layer between the substrate and the deposited copper. The easy metallization of printed patterns and strong metal-substrate adhesion make it an especially effective method for the mass production of flexible printed circuits. The proposed process is a high-throughput, low-cost, efficient, and environmentally benign method for flexible electronics manufacturing.

  14. A laser printing based approach for printed electronics

    International Nuclear Information System (INIS)

    Zhang, T.; Hu, M.; Guo, Q.; Zhang, W.; Yang, J.; Liu, Y.; Lau, W.; Wang, X.

    2016-01-01

    Here we report a study of printing electronics using an office-use laser printer. The proposed method eliminates the critical disadvantages of solvent-based printing techniques by taking advantage of electroless deposition and laser printing. The synthesized toner acts as a catalyst for the electroless copper deposition as well as an adhesion-promoting buffer layer between the substrate and the deposited copper. The easy metallization of printed patterns and strong metal-substrate adhesion make it an especially effective method for the mass production of flexible printed circuits. The proposed process is a high-throughput, low-cost, efficient, and environmentally benign method for flexible electronics manufacturing.

  15. Mechanisms of neuroblastoma regression

    Science.gov (United States)

    Brodeur, Garrett M.; Bagatell, Rochelle

    2014-01-01

    Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179

  16. Inverse Problem Approach for the Alignment of Electron Tomographic Series.

    OpenAIRE

    Tran, Viet Dung; Moreaud, Maxime; Thiébaut, Éric; Denis, L.; Becker, Jean-Marie

    2014-01-01

    In the refining industry, morphological measurements of particles have become an essential part of the characterization of catalyst supports. Through these parameters, one can infer the specific physicochemical properties of the studied materials. One of the main acquisition techniques is electron tomography (or nanotomography). 3D volumes are reconstructed from sets of projections from different angles made by a Transmission Electron Microscope.

  17. Preservation of Electronic Scholarly Publishing: An Analysis of Three Approaches

    Science.gov (United States)

    Honey, Sadie L.

    2005-01-01

    Scholars publish in journals to preserve their work and to make sure that it is available for current and future researchers. More and more of this publishing is done in electronic format. Libraries, the institutions that have traditionally overseen the preservation of print publications, are now struggling with the preservation of digital…

  18. The highly reintegrative approach of electronic monitoring in the Netherlands

    NARCIS (Netherlands)

    Boone, M.M.; Kooij, van der M.; Rap, S.E.

    2017-01-01

    This contribution describes the way electronic monitoring (EM) is organized and implemented in the Netherlands. It will become clear that the situation in the Netherlands is characterized by, in particular, two features. The application of EM is highly interwoven with the Probation Service and its

  19. Electron momentum density and Compton profile by a semi-empirical approach

    Science.gov (United States)

    Aguiar, Julio C.; Mitnik, Darío; Di Rocco, Héctor O.

    2015-08-01

    Here we propose a semi-empirical approach to describe with good accuracy the electron momentum densities and Compton profiles for a wide range of pure crystalline metals. In the present approach, we use an experimental Compton profile to fit an analytical expression for the momentum densities of the valence electrons. This expression is similar to a Fermi-Dirac distribution function with two parameters, one of which coincides with the ground state kinetic energy of the free-electron gas and the other resembles the electron-electron interaction energy. In the proposed scheme conduction electrons are neither completely free nor completely bound to the atomic nucleus. This procedure allows us to include correlation effects. We tested the approach for all metals with Z=3-50 and showed the results for three representative elements: Li, Be and Al from high-resolution experiments.

  20. Linear-algebraic approach to electronic excitation of atoms and molecules by electron impact

    International Nuclear Information System (INIS)

    Collins, L.A.; Schneider, B.I.

    1983-01-01

    A linear-algebraic method, based on an integral equations formulation, is applied to the excitation of atoms and molecules by electron impact. Various schemes are devised for treating the one-electron terms that sometimes cause instabilities when directly incorporated into the solution matrix. These include introducing Lagrange undetermined multipliers and correlation terms. Good agreement between the method and other computational techniques is obtained for electron scattering for hydrogenic and Li-like atomic ions and for H 2 + in two- to five-state close-coupling calculations

  1. Heating electrons with ion irradiation: A first-principles approach

    International Nuclear Information System (INIS)

    Pruneda, J.M.; Sanchez-Portal, D.; Arnau, A.; Juaristi, J.I.; Artacho, E.

    2009-01-01

    Using time-dependent density functional theory we calculate from first-principles the rate of energy transfer from a moving charged particle to the electrons in an insulating material. The behavior of the electronic stopping power in LiF (a wide band gap insulator) versus projectile velocity displays an effective threshold velocity of 8.2 Bohr/asec for the proton, consistent with recent experimental observations. The calculated proton/antiproton stopping power ratio is 2.4 at velocities slightly above the threshold (16.5 Bohr/asec) as compared to the experimental value of 2.1. The approximations introduced in this new non-perturbative methodology are discussed, and results on the velocity dependence of the stopping power, the locality of the energy transfer, and other characteristics of the host material are presented.

  2. French Electronic Theses and Dissertations in Europe : A Scientometric Approach

    OpenAIRE

    Prost , Hélène; Buirette , Amélie; Berbache , Rachid; Halipré , Aurélie

    2016-01-01

    International audience; Problem/goal: The poster presents an empirical overview of French electronic theses and dissertations, in particular with regard to the place of France in Europe, their geographical and disciplinary distribution, their representativity and their openness. Research method/procedure: The study includes a scientometric analysis of the DART-Europe e-theses portal and of the French Theses.fr portal. It also draws on other data from the French academic union catalogue...

  3. Nucleons, mesons and quarks: the electron scattering approach

    International Nuclear Information System (INIS)

    Frois, B.

    1985-05-01

    A few examples are given of the research carried out with electron scattering in order to elucidate the relevant degrees of freedom for nuclear physics. Quasielastic scattering from 3He, which gives some insight into the properties of the nucleon in the nuclear medium, is considered first. Then examples of meson exchange currents are presented. Finally, the present status of our understanding of shorter-range effects is discussed

  4. Evaluating risk factors for endemic human Salmonella Enteritidis infections with different phage types in Ontario, Canada using multinomial logistic regression and a case-case study approach

    Directory of Open Access Journals (Sweden)

    Varga Csaba

    2012-10-01

    Full Text Available Abstract Background Identifying risk factors for Salmonella Enteritidis (SE) infections in Ontario will assist public health authorities to design effective control and prevention programs to reduce the burden of SE infections. Our research objective was to identify risk factors for acquiring SE infections with various phage types (PT) in Ontario, Canada. We hypothesized that certain PTs (e.g., PT8 and PT13a) have specific risk factors for infection. Methods Our study included endemic SE cases with various PTs whose isolates were submitted to the Public Health Laboratory-Toronto from January 20th to August 12th, 2011. Cases were interviewed using a standardized questionnaire that included questions pertaining to demographics, travel history, clinical symptoms, contact with animals, and food exposures. A multinomial logistic regression method using the Generalized Linear Latent and Mixed Model procedure and a case-case study design were used to identify risk factors for acquiring SE infections with various PTs in Ontario, Canada. In the multinomial logistic regression model, the outcome variable had three categories representing human infections caused by SE PT8, PT13a, and all other SE PTs (i.e., non-PT8/non-PT13a) as a referent category to which the other two categories were compared. Results In the multivariable model, SE PT8 was positively associated with contact with dogs (OR=2.17, 95% CI 1.01-4.68) and negatively associated with pepper consumption (OR=0.35, 95% CI 0.13-0.94), after adjusting for age categories and gender, and using exposure periods and health regions as random effects to account for clustering. Conclusions Our study findings offer interesting hypotheses about the role of phage type-specific risk factors. Multinomial logistic regression analysis and the case-case study approach are novel methodologies to evaluate associations among SE infections with different PTs and various risk factors.
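
    A minimal multinomial logistic regression of phage type on exposures, using statsmodels on simulated data (the random effects for exposure period and health region used in the study are omitted in this sketch):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(11)
        n = 600
        df = pd.DataFrame({
            "dog_contact": rng.integers(0, 2, n),
            "pepper": rng.integers(0, 2, n),
            "age_cat": rng.integers(0, 4, n),
        })
        # outcome: 0 = other PTs (referent), 1 = PT8, 2 = PT13a
        logits = np.column_stack([
            np.zeros(n),
            -1.0 + 0.77 * df.dog_contact - 1.05 * df.pepper,
            -1.2 + 0.2 * df.age_cat,
        ])
        prob = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        df["pt"] = [rng.choice(3, p=pi) for pi in prob]

        X = sm.add_constant(df[["dog_contact", "pepper", "age_cat"]])
        fit = sm.MNLogit(df["pt"], X).fit(disp=0)
        print(np.exp(fit.params))    # odds ratios vs. the referent category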

  5. Spatial variability of excess mortality during prolonged dust events in a high-density city: a time-stratified spatial regression approach.

    Science.gov (United States)

    Wong, Man Sing; Ho, Hung Chak; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Chan, Ta-Chien

    2017-07-24

    Dust events have long been recognized to be associated with a higher mortality risk. However, no study has investigated how prolonged dust events affect the spatial variability of mortality across districts in a downwind city. In this study, we applied a spatial regression approach to estimate the district-level mortality during two extreme dust events in Hong Kong. We compared spatial and non-spatial models to evaluate the ability of each regression to estimate mortality. We also compared prolonged dust events with non-dust events to determine the influences of community factors on mortality across the city. The density of the built environment (estimated by the sky view factor) had a positive association with excess mortality in each district, while socioeconomic deprivation, driven by lower income and lower education, was associated with a higher mortality impact in each territory planning unit during a prolonged dust event. Based on the model comparison, spatial error modelling with first-order queen contiguity consistently outperformed the other models. The high-risk areas with the higher increases in mortality were located in an urban high-density environment with higher socioeconomic deprivation. Our model design shows the ability to predict the spatial variability of mortality risk during an extreme weather event, which cannot be estimated by traditional time-series analysis or ecological studies. Our spatial protocol can be used for public health surveillance, sustainable planning and disaster preparation when relevant data are available.

  6. A first approach to runaway electron control in FTU

    International Nuclear Information System (INIS)

    Boncagni, L.; Carnevale, D.; Cianfarani, C.; Esposito, B.; Granucci, G.; Maddaluno, G.; Marocco, D.; Martin-Solis, J.R.; Pucella, G.; Sozzi, C.; Varano, G.; Vitale, V.; Zaccarian, L.

    2013-01-01

    The Plasma Control System (PCS) of the Frascati Tokamak Upgrade (FTU) is not equipped with any runaway electron (RE) beam control or suppression tool. In this paper we propose an upgraded PCS including an architecture for the control of disruption-generated REs that, making use of filtering techniques to estimate the onsets of the current quench (CQ) and of the RE beam current plateau, provides a controlled plasma current shut-down and a simultaneous RE position control. The control strategy is based on a nonlinear technique, called Input Allocation, that allows to re-configure the current in the poloidal field (PF) coils and improve the PCS responsiveness needed for RE position control. Preliminary results on the implementation of the Input Allocation and an experimental proposal to test the control scheme architecture are discussed

  7. A first approach to runaway electron control in FTU

    Energy Technology Data Exchange (ETDEWEB)

    Boncagni, L. [Associazione Euratom/ENEA sulla Fusione, Centro Ricerche Frascati, CP 65, 00044 Frascati, Roma (Italy); Carnevale, D., E-mail: carnevaledaniele@gmail.com [Dipartimento Ing. Civile ed Ing. Informatica Università di Roma, Tor Vergata, Via del Politecnico 1, 00133 Roma (Italy); Cianfarani, C.; Esposito, B. [Associazione Euratom/ENEA sulla Fusione, Centro Ricerche Frascati, CP 65, 00044 Frascati, Roma (Italy); Granucci, G. [Associazione Euratom-CNR sulla Fusione, IFP-CNR, Via R. Cozzi 53, 20125 Milano (Italy); Maddaluno, G.; Marocco, D. [Associazione Euratom/ENEA sulla Fusione, Centro Ricerche Frascati, CP 65, 00044 Frascati, Roma (Italy); Martin-Solis, J.R. [Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganes-Madrid (Spain); Pucella, G. [Associazione Euratom/ENEA sulla Fusione, Centro Ricerche Frascati, CP 65, 00044 Frascati, Roma (Italy); Sozzi, C. [Associazione Euratom-CNR sulla Fusione, IFP-CNR, Via R. Cozzi 53, 20125 Milano (Italy); Varano, G. [Dipartimento Ing. Civile ed Ing. Informatica Università di Roma, Tor Vergata, Via del Politecnico 1, 00133 Roma (Italy); Vitale, V. [Associazione Euratom/ENEA sulla Fusione, Centro Ricerche Frascati, CP 65, 00044 Frascati, Roma (Italy); Zaccarian, L. [CNRS, LAAS, 7 av. du colonel Roche, F-31400 Toulouse (France); Univ. de Toulouse, LAAS, F-31400 Toulouse (France)

    2013-10-15

    The Plasma Control System (PCS) of the Frascati Tokamak Upgrade (FTU) is not equipped with any runaway electron (RE) beam control or suppression tool. In this paper we propose an upgraded PCS including an architecture for the control of disruption-generated REs that, making use of filtering techniques to estimate the onsets of the current quench (CQ) and of the RE beam current plateau, provides a controlled plasma current shut-down and a simultaneous RE position control. The control strategy is based on a nonlinear technique, called Input Allocation, that allows to re-configure the current in the poloidal field (PF) coils and improve the PCS responsiveness needed for RE position control. Preliminary results on the implementation of the Input Allocation and an experimental proposal to test the control scheme architecture are discussed.

  8. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used to characterize face features in texture format, followed by a chi-square distance measure for image classification. Two parallel systems, based on smartphone and web applications, are developed for the face learning and verification modules. The proposed system has two tiers of security, using a person ID followed by face verification. A class-specific threshold is associated with the verification step to control the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
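
    The LBP-plus-chi-square pipeline fits in a few lines of Python with scikit-image; random arrays stand in for face crops, and the acceptance threshold is an invented placeholder for the class-specific threshold mentioned above:

        import numpy as np
        from skimage.feature import local_binary_pattern

        def lbp_histogram(gray, P=8, R=1.0):
            """Uniform LBP histogram used as the face texture descriptor."""
            lbp = local_binary_pattern(gray, P, R, method="uniform")
            hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
            return hist

        def chi_square(h1, h2, eps=1e-10):
            """Chi-square distance between two LBP histograms."""
            return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

        # enrolment vs. verification: accept the voter if the distance is small
        rng = np.random.default_rng(12)
        enrolled = lbp_histogram((rng.random((64, 64)) * 255).astype(np.uint8))
        probe = lbp_histogram((rng.random((64, 64)) * 255).astype(np.uint8))
        print("accept" if chi_square(enrolled, probe) < 0.05 else "reject")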

  9. New Computational Approach to Electron Transport in Irregular Graphene Nanostructures

    Science.gov (United States)

    Mason, Douglas; Heller, Eric; Prendergast, David; Neaton, Jeffrey

    2009-03-01

    For novel graphene devices of nanoscale-to-macroscopic scale, many aspects of their transport properties are not easily understood due to difficulties in fabricating devices with regular edges. Here we develop a framework to efficiently calculate and potentially screen electronic transport properties of arbitrary nanoscale graphene device structures. A generalization of the established recursive Green's function method is presented, providing access to arbitrary device and lead geometries with substantial computer-time savings. Using single-orbital nearest-neighbor tight-binding models and the Green's function-Landauer scattering formalism, we will explore the transmission function of irregular two-dimensional graphene-based nanostructures with arbitrary lead orientation. Prepared by LBNL under contract DE-AC02-05CH11231 and supported by the U.S. Dept. of Energy Computer Science Graduate Fellowship under grant DE-FG02-97ER25308.
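
    The recursive Green's function scheme is easiest to see on a one-dimensional single-orbital chain (a toy stand-in for graphene, with scalar blocks instead of matrix ones). A NumPy sketch computing the Landauer transmission:

        import numpy as np

        eps, t, N = 0.0, -1.0, 50    # on-site energy, hopping, number of device sites

        def surface_g(E):
            """Closed-form surface Green's function of a semi-infinite 1D lead."""
            z = E - eps
            sq = np.sqrt(z * z - 4 * t * t + 0j)
            if (z / sq).real < 0:    # pick the branch that decays into the lead
                sq = -sq
            return (z - sq) / (2 * t * t)

        def transmission(E):
            E = E + 1e-9j
            sigL = sigR = t * surface_g(E) * t   # lead self-energies
            gamL = gamR = -2.0 * sigL.imag       # broadening functions

            # forward sweep: attach one site at a time (left-connected GFs)
            g = [1.0 / (E - eps - sigL)]
            for n in range(1, N):
                sig = sigR if n == N - 1 else 0.0
                g.append(1.0 / (E - eps - t * g[-1] * t - sig))

            G1N = g[0]                           # propagate G_1N across the device
            for n in range(1, N):
                G1N *= t * g[n]
            return gamL * gamR * abs(G1N) ** 2   # Landauer: T = GamL GamR |G_1N|^2

        for E in (-1.5, 0.0, 1.5, 2.5):
            print(E, round(transmission(E), 3))  # ~1 inside the band |E| < 2|t|, 0 outside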

  10. Novel approaches to study low-energy electron-induced damage to DNA oligonucleotides

    International Nuclear Information System (INIS)

    Rackwitz, Jenny; Bald, Ilko; Ranković, Miloš Lj; Milosavljević, Aleksandar R

    2015-01-01

    The novel approach of DNA origami structures as templates for precise quantification of various well- defined oligonucleotides provides the opportunity to determine the sensitivity of complex DNA sequences towards low-energy electrons. (paper)

  11. Electronic Structure Approach to Tunable Electronic Properties of Hybrid Organic-Inorganic Perovskites

    Science.gov (United States)

    Liu, Garnett; Huhn, William; Mitzi, David B.; Kanai, Yosuke; Blum, Volker

    We present a study of the electronic structure of layered hybrid organic-inorganic perovskite (HOIP) materials using all-electron density-functional theory. Varying the nature of the organic and inorganic layers should enable systematic fine-tuning of the carrier properties of each component. Using the HSE06 hybrid density functional including spin-orbit coupling (SOC), we validate the principle of tuning subsystem-specific parts of the electron band structures and densities of states in CH3NH3PbX3 (X=Cl, Br, I) compared to a modified organic component in layered (C6H5C2H4NH3)2PbX4 (X=Cl, Br, I) and C20H22S4N2PbX4 (X=Cl, Br, I). We show that tunable shifts of electronic levels indeed arise by varying Cl, Br, I as the inorganic components, and CH3NH3+, C6H5C2H4NH3+, C20H22S4N2^2+ as the organic components. SOC is found to play an important role in splitting the conduction bands of the HOIP compounds investigated here. The frontier orbitals of the halide shift, increasing the gap, when Cl is substituted for Br and I.

  12. Electronic structure of a striped nickelate studied by the exact exchange for correlated electrons (EECE) approach

    KAUST Repository

    Schwingenschlögl, Udo; Schuster, Cosima B.; Frésard, Raymond

    2009-01-01

    Motivated by a RIXS study of Wakimoto et al. (Phys. Rev. Lett., 102 (2009) 157001), we use density functional theory to analyze the magnetic order in the nickelate La5/3Sr1/3NiO4 and the details of its crystal and electronic structure. We compare

  13. Alternate approaches to future electron-positron linear colliders

    Energy Technology Data Exchange (ETDEWEB)

    Loew, G.A. [Stanford Univ., CA (United States). Stanford Linear Accelerator Center]

    1998-07-01

    The purpose of this article is two-fold: to review the current international status of various design approaches to the next generation of e+e- linear colliders, and on the occasion of his 80th birthday, to celebrate Richard B. Neal's many contributions to the field of linear accelerators. As it turns out, combining these two tasks is a rather natural enterprise because of Neal's long professional involvement and insight into many of the problems and options which the international e+e- linear collider community is currently studying to achieve a practical design for a future machine.

  14. Alternate approaches to future electron-positron linear colliders

    International Nuclear Information System (INIS)

    Loew, G.A.

    1998-01-01

    The purpose of this article is two-fold: to review the current international status of various design approaches to the next generation of e+e- linear colliders, and on the occasion of his 80th birthday, to celebrate Richard B. Neal's many contributions to the field of linear accelerators. As it turns out, combining these two tasks is a rather natural enterprise because of Neal's long professional involvement and insight into many of the problems and options which the international e+e- linear collider community is currently studying to achieve a practical design for a future machine

  15. Human dental age estimation using third molar developmental stages: does a Bayesian approach outperform regression models to discriminate between juveniles and adults?

    Science.gov (United States)

    Thevissen, P W; Fieuws, S; Willems, G

    2010-01-01

    Dental age estimation methods based on radiologically detected third molar developmental stages are implemented in forensic age assessments to discriminate between juveniles and adults, for instance in the judicial handling of young unaccompanied asylum seekers. Accurate and unbiased age estimates, combined with appropriately quantified uncertainties, are required for sound forensic reporting. In this study, a subset of 910 individuals uniformly distributed in age between 16 and 22 years was selected from an existing dataset collected by Gunst et al. containing 2,513 panoramic radiographs with known third molar developmental stages of Belgian Caucasian men and women. This subset was randomly split into a training set, used to develop a classical regression analysis and a Bayesian model for the multivariate distribution of the third molar developmental stages conditional on age, and a test set, used to assess the performance of both models. The aim of this study was to verify whether the Bayesian approach differentiates the age of maturity more precisely and removes the bias that systematically overestimates the age of young individuals. The Bayesian model discriminates subjects older than 18 years more appropriately and produces more meaningful prediction intervals, but does not strongly outperform the classical approaches.

  16. An integrated unscented kalman filter and relevance vector regression approach for lithium-ion battery remaining useful life and short-term capacity prediction

    International Nuclear Information System (INIS)

    Zheng, Xiujuan; Fang, Huajing

    2015-01-01

    The gradually decreasing capacity of lithium-ion batteries can serve as a health indicator for tracking their degradation. It is important to predict the capacity of a lithium-ion battery for future cycles to assess its health condition and remaining useful life (RUL). In this paper, a novel method is developed using an unscented Kalman filter (UKF) with relevance vector regression (RVR) and applied to RUL and short-term capacity prediction of batteries. An RVR model is employed as a nonlinear time-series prediction model to predict the UKF future residuals, which otherwise remain zero during the prediction period. Taking the prediction step into account, the value predicted by the RVR method and the latest real residual value together constitute the future evolution of the residuals under a time-varying weighting scheme. Next, the future residuals are utilized by the UKF to recursively estimate the battery parameters for predicting RUL and short-term capacity. Finally, the performance of the proposed method is validated and compared to other predictors with experimental data. According to the experimental and analysis results, the proposed approach has high reliability and prediction accuracy, and can be applied to battery monitoring and prognostics, as well as generalized to other prognostic applications.
    Highlights:
    • An integrated method is proposed for RUL prediction as well as short-term capacity prediction.
    • A relevance vector regression model is employed as a nonlinear time-series prediction model.
    • An unscented Kalman filter is used to recursively update the states for battery model parameters during the prediction.
    • A time-varying weighting scheme is utilized to improve the accuracy of the RUL prediction.
    • The proposed method demonstrates high reliability and prediction accuracy.
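
    As a much-reduced illustration of the filtering step described above, the sketch below tracks a synthetic capacity fade with a two-state UKF. It assumes the third-party filterpy package; the linear fade model, all noise settings, and the data are invented for illustration, and the paper's RVR residual forecaster is only indicated by a comment.

```python
# Minimal UKF capacity-tracking sketch (not the paper's implementation).
# Assumes the third-party `filterpy` package; model and data are synthetic.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter as UKF, MerweScaledSigmaPoints

def fx(x, dt):
    # State transition: capacity decreases by the current fade rate.
    capacity, fade_rate = x
    return np.array([capacity - fade_rate * dt, fade_rate])

def hx(x):
    # The measurement is the capacity itself.
    return x[:1]

rng = np.random.default_rng(0)
true_capacity = 1.0 - 0.002 * np.arange(300)          # Ah, synthetic fade
measured = true_capacity + rng.normal(0, 0.005, 300)

points = MerweScaledSigmaPoints(n=2, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UKF(dim_x=2, dim_z=1, dt=1.0, hx=hx, fx=fx, points=points)
ukf.x = np.array([1.0, 0.001])                        # initial guess
ukf.P *= 0.01
ukf.R *= 0.005 ** 2
ukf.Q *= 1e-8

for z in measured:
    ukf.predict()
    ukf.update(np.array([z]))
    # The paper would forecast future residuals with an RVR model here, so
    # the recursion can continue beyond the last real measurement.

print("capacity %.4f Ah, fade rate %.5f Ah/cycle" % (ukf.x[0], ukf.x[1]))
```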

  17. Nicotine and Cotinine Exposure from Electronic Cigarettes: A Population Approach

    Science.gov (United States)

    de Mendizábal, Nieves Vélez; Jones, David R.; Jahn, Andy; Bies, Robert R.; Brown, Joshua W.

    2015-01-01

    Background and Objectives: Electronic cigarettes (e-cigarettes) are a recent technology that has gained rapid acceptance. Still, little is known about them in terms of safety and effectiveness. A basic question is how effectively they deliver nicotine; however, the literature is surprisingly unclear on this point. Here, a population pharmacokinetic (PK) model was developed for nicotine and its major metabolite cotinine, with the aim of providing a reliable framework for the simulation of nicotine and cotinine concentrations over time, based solely on inhalation airflow recordings and individual covariates (i.e., weight and breath carbon monoxide (CO) levels). Methods: This study included 10 adults self-identified as heavy smokers (at least one pack per day). Plasma nicotine and cotinine concentrations were measured at regular 10-minute intervals for 90 minutes while subjects inhaled nicotine vapor from a modified e-cigarette. Airflow measurements were recorded every 200 milliseconds throughout the session. A population PK model for nicotine and cotinine was developed based on previously published PK parameters and the airflow recordings. All analyses were performed with the nonlinear mixed-effect modelling software NONMEM 7.2. Results: The results show that e-cigarettes deliver nicotine effectively, although the plasma concentration profiles are lower than those achieved with regular cigarettes. Our PK model effectively predicts plasma nicotine and cotinine concentrations from the inhalation volume and initial breath CO. Conclusion: E-cigarettes are effective at delivering nicotine. This new PK model of e-cigarette usage might be used for pharmacodynamic analysis where PK profiles are not available. PMID:25503588
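
    The structure of such a parent-metabolite PK model can be sketched as a pair of coupled first-order ODEs. The snippet below is a generic illustration, not the published NONMEM model; all rate constants are placeholder values.

```python
# Generic one-compartment parent-metabolite PK sketch (illustrative only;
# not the published NONMEM model). All parameter values are placeholders.
import numpy as np
from scipy.integrate import odeint

def pk_rhs(y, t, ka, ke_nic, fm, ke_cot):
    depot, nicotine, cotinine = y
    d_depot = -ka * depot                                # absorption from lung
    d_nic = ka * depot - ke_nic * nicotine               # nicotine elimination
    d_cot = fm * ke_nic * nicotine - ke_cot * cotinine   # cotinine formation/loss
    return [d_depot, d_nic, d_cot]

t = np.linspace(0, 90, 200)            # minutes, matching the sampling window
y0 = [1.0, 0.0, 0.0]                   # unit inhaled dose in the depot
conc = odeint(pk_rhs, y0, t, args=(0.5, 0.05, 0.7, 0.001))
print("peak nicotine (arbitrary units): %.3f" % conc[:, 1].max())
```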

  18. Teaching Electronic Literacy A Concepts-Based Approach for School Library Media Specialists

    CERN Document Server

    Craver, Kathleen W

    1997-01-01

    School library media specialists will find this concepts-based approach to teaching electronic literacy an indispensable basic tool for instructing students and teachers. It provides step-by-step instruction on how to find and evaluate needed information from electronic databases and the Internet, how to formulate successful electronic search strategies and retrieve relevant results, and how to interpret and critically analyze search results. The chapters contain a suggested lesson plan and sample assignments for the school library media specialist to use in teaching electronic literacy skills.

  19. Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science

    International Nuclear Information System (INIS)

    Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei

    2007-01-01

    Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age.
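
    The three estimators differ only in their slope formulas, so they can be compared directly. A minimal numpy sketch on synthetic data with errors in both variables (all numbers invented for illustration):

```python
# OLS, orthogonal, and geometric mean regression slopes compared on
# synthetic data where both x and y carry measurement error.
import numpy as np

rng = np.random.default_rng(1)
x_true = np.linspace(0, 10, 200)
x = x_true + rng.normal(0, 0.5, x_true.size)
y = 2.0 * x_true + 1.0 + rng.normal(0, 0.5, x_true.size)

sxx = np.var(x, ddof=1)
syy = np.var(y, ddof=1)
sxy = np.cov(x, y, ddof=1)[0, 1]

b_ols = sxy / sxx                                        # errors in y only
b_orth = ((syy - sxx) + np.hypot(syy - sxx, 2 * sxy)) / (2 * sxy)  # equal errors
b_gmr = np.sign(sxy) * np.sqrt(syy / sxx)                # geometric mean

print(f"OLS {b_ols:.3f}, orthogonal {b_orth:.3f}, geometric mean {b_gmr:.3f}")
```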

  20. New Statistical Multiparticle Approach to the Acceleration of Electrons by the Ion Field in Plasmas

    Directory of Open Access Journals (Sweden)

    Eugene Oks

    2010-01-01

    The phenomenon of the acceleration of the (perturbing) electrons by the ion field (AEIF) significantly reduces Stark widths and shifts in plasmas of relatively high densities and/or relatively low temperatures. Our previous analytical calculations of the AEIF were based on a dynamical treatment: the starting point was the ion-microfield-caused changes of the trajectories and velocities of individual perturbing electrons. In the current paper, we employ a statistical approach: the starting point is the electron velocity distribution function modified by the ion microfield. The latter had been calculated by Romanovsky and Ebeling in the multiparticle description of the ion microfield. The result again shows a reduction of the electron Stark broadening. Thus two totally different analytical approaches (dynamical and statistical) agree with each other and therefore disprove the corresponding recent fully numerical simulations by Stambulchik et al. that claimed an increase of the electron Stark broadening.

  1. Differentiating regressed melanoma from regressed lichenoid keratosis.

    Science.gov (United States)

    Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A

    2017-04-01

    Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near-complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas, as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near-complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. New Approach For Detection Of Irradiated Spices Using Electron Spin Resonance (ESR)

    International Nuclear Information System (INIS)

    FARAG, S.A.; SHAMS EL DIEEN, N.M.M.

    2010-01-01

    Black pepper and anise samples were irradiated with different doses of gamma rays (5, 10 and 20 kGy) and then stored at room temperature (20 °C, 70-75% RH) for one year. Free radicals were measured by electron spin resonance (ESR) at different intervals (3, 6, 9 and 12 months). A series of signals, tentatively described as cellulose-like and complex radicals, was observed at g-values of 2.01027 for black pepper and 2.01019 for anise. The ESR signal intensity of the irradiated spices increased in direct proportion to the applied dose. Polynomial regression analysis between ESR signal intensity and applied dose yielded significant correlation coefficients (R²). All combined thermal and irradiation treatments, together with long storage, caused a significant reduction of the ESR intensity of irradiated black pepper and anise. Upon re-irradiation with low doses of 1, 2 and 3 kGy, the previously irradiated samples (10 and 20 kGy) showed markedly enhanced ESR intensity: black pepper irradiated at 10 kGy showed increases of 49.19%, 69.23% and 89.68%, while the 20 kGy samples increased by 39.96%, 69.05% and 96.90%. This technique can thus be used to overcome the main disadvantage of ESR signals, their fading, especially at the end of the storage period.
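
    The dose-response fit described above amounts to a polynomial regression of signal intensity on absorbed dose. A minimal sketch; the data points are invented placeholders, not the measured values:

```python
# Polynomial regression of ESR signal intensity on absorbed dose.
# The data points below are illustrative placeholders.
import numpy as np

dose = np.array([0.0, 5.0, 10.0, 20.0])        # kGy
intensity = np.array([0.1, 2.3, 4.1, 7.6])     # arbitrary ESR units

coeffs = np.polyfit(dose, intensity, deg=2)    # quadratic dose-response fit
fit = np.polyval(coeffs, dose)
ss_res = np.sum((intensity - fit) ** 2)
ss_tot = np.sum((intensity - intensity.mean()) ** 2)
print("coefficients:", np.round(coeffs, 4))
print("R^2 = %.4f" % (1 - ss_res / ss_tot))
```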

  3. Evaluation of the prediction precision capability of partial least squares regression approach for analysis of high alloy steel by laser induced breakdown spectroscopy

    Science.gov (United States)

    Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.

    2015-06-01

    Laser induced breakdown spectroscopy (LIBS) was applied for elemental characterization of high alloy steel using partial least squares regression (PLSR), with the objective of evaluating the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures on the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). The pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved by increasing the number of laser pulses accumulated per spectrum as well as by truncating the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present at hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
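
    The component-selection step can be sketched with scikit-learn's PLSRegression, choosing the number of latent components by cross-validated prediction error. The spectra and concentrations below are random stand-ins, not LIBS data:

```python
# Selecting the number of PLSR components by cross-validation.
# X stands in for LIBS spectra (n_samples x n_channels); data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 500))                        # stand-in spectra
y = X[:, :5].sum(axis=1) + rng.normal(0, 0.1, 40)     # stand-in concentrations

scores = {
    n: cross_val_score(PLSRegression(n_components=n), X, y,
                       scoring="neg_root_mean_squared_error", cv=5).mean()
    for n in range(1, 11)
}
best = max(scores, key=scores.get)                    # least CV error
print("best number of components:", best)
```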

  4. Development and application of a 2-electron reduced density matrix approach to electron transport via molecular junctions

    Science.gov (United States)

    Hoy, Erik P.; Mazziotti, David A.; Seideman, Tamar

    2017-11-01

    Can an electronic device be constructed using only a single molecule? Since this question was first asked by Aviram and Ratner in the 1970s [Chem. Phys. Lett. 29, 277 (1974)], the field of molecular electronics has exploded with significant experimental advancements in the understanding of the charge transport properties of single molecule devices. Efforts to explain the results of these experiments and identify promising new candidate molecules for molecular devices have led to the development of numerous new theoretical methods including the current standard theoretical approach for studying single molecule charge transport, i.e., the non-equilibrium Green's function formalism (NEGF). By pairing this formalism with density functional theory (DFT), a wide variety of transport problems in molecular junctions have been successfully treated. For some systems though, the conductance and current-voltage curves predicted by common DFT functionals can be several orders of magnitude above experimental results. In addition, since density functional theory relies on approximations to the exact exchange-correlation functional, the predicted transport properties can show significant variation depending on the functional chosen. As a first step to addressing this issue, the authors have replaced density functional theory in the NEGF formalism with a 2-electron reduced density matrix (2-RDM) method, creating a new approach known as the NEGF-RDM method. 2-RDM methods provide a more accurate description of electron correlation compared to density functional theory, and they have lower computational scaling compared to wavefunction based methods of similar accuracy. Additionally, 2-RDM methods are capable of capturing static electron correlation which is untreatable by existing NEGF-DFT methods. When studying dithiol alkane chains and dithiol benzene in model junctions, the authors found that the NEGF-RDM predicts conductances and currents that are 1-2 orders of magnitude below

  5. Regression: A Bibliography.

    Science.gov (United States)

    Pedrini, D. T.; Pedrini, Bonnie C.

    Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

  6. Ridge Regression Signal Processing

    Science.gov (United States)

    Kuhl, Mark R.

    1990-01-01

    The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
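
    The ridge estimator at the heart of this approach has a simple closed form, beta = (XᵀX + kI)⁻¹Xᵀy, where the penalty k stabilizes the solution when the design matrix is poorly conditioned (the analogue of poor geometry). A minimal sketch on a synthetic near-collinear design:

```python
# Closed-form ridge regression on a near-collinear (poorly conditioned) design.
import numpy as np

def ridge(X, y, k):
    # beta = (X'X + kI)^-1 X'y; solve() avoids forming an explicit inverse.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=100)      # near-collinear column
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, 100)

for k in (0.0, 0.1, 1.0):
    # Increasing k shrinks and stabilizes the coefficient estimates.
    print(k, np.round(ridge(X, y, k), 3))
```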

  7. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  8. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying; Carroll, Raymond J.

    2009-01-01

    The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a

  9. Electronic and magnetic properties of UPdSn: the itinerant 5f electrons approach

    CERN Document Server

    Sandratskii, L M

    1997-01-01

    Density functional theory, modified to include spin-orbit coupling and an effective orbital field to simulate Hund's second rule, is applied to investigate the magnetic structure and electronic properties of the compound UPdSn. Our theoretical results are in overall good agreement with experiment. Thus both theory and experiment find the magnetic structure of UPdSn to be noncollinear, the calculated magnetic U-moments being in very good agreement with the measurements. Also, the calculated density of states is found to simulate closely the photoemission spectrum, and the very low experimental value of 5 mJ mol⁻¹ K⁻² for the specific heat gamma is reproduced reasonably well by the calculated value of 7.5 mJ mol⁻¹ K⁻². Furthermore, the interconnection of the magnetic structure with the crystal structure is investigated. Here theory and experiment agree concerning the planar noncollinear antiferromagnetic configuration in the orthorhombic crystal structure and for the ...

  10. Electronic structure of FeTiSb using relativistic and scalar-relativistic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Sahariya, Jagrati [Department of Physics, Manipal University Jaipur, Jaipur-303007, Rajasthan (India); Mund, H. S., E-mail: hmoond@gmail.com [Department of Physics, M. L. Sukhadia University, Udaipur-313001, Rajasthan (India)

    2016-05-06

    Electronic and magnetic properties of FeTiSb are reported. The calculations are performed using the spin-polarized relativistic Korringa-Kohn-Rostoker (SPR-KKR) scheme based on the Green's function method. Within SPR-KKR, both fully relativistic and scalar-relativistic approaches have been used to investigate the electronic structure of FeTiSb. Energy bands, total and partial densities of states, and atom-specific magnetic moments along with the total moment of the FeTiSb alloy are presented.

  11. Many-electron approaches in physics, chemistry and mathematics a multidisciplinary view

    CERN Document Server

    Site, Luigi

    2014-01-01

    This book provides a broad description of the development and (computational) application of many-electron approaches from a multidisciplinary perspective. In the context of studying many-electron systems Computer Science, Chemistry, Mathematics and Physics are all intimately interconnected. However, beyond a handful of communities working at the interface between these disciplines, there is still a marked separation of subjects. This book seeks to offer a common platform for possible exchanges between the various fields and to introduce the reader to perspectives for potential further developments across the disciplines. The rapid advances of modern technology will inevitably require substantial improvements in the approaches currently used, which will in turn make exchanges between disciplines indispensable. In essence this book is one of the very first attempts at an interdisciplinary approach to the many-electron problem.

  12. Visualized attribute analysis approach for characterization and quantification of rice taste flavor using electronic tongue

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Lin; Hu, Xianqiao [Rice Product Quality Supervision and Inspection Center, Ministry of Agriculture, China National Rice Research Institute, Hangzhou 310006 (China); Tian, Shiyi; Deng, Shaoping [College of Food Science and Biotechnology, Zhejiang Gongshang University, Hangzhou 310035 (China); Zhu, Zhiwei, E-mail: 615834652@qq.com [Rice Product Quality Supervision and Inspection Center, Ministry of Agriculture, China National Rice Research Institute, Hangzhou 310006 (China)

    2016-05-05

    This paper deals with a novel visualized attributive analysis approach for characterization and quantification of rice taste flavor attributes (softness, stickiness, sweetness and aroma) employing a multifrequency large-amplitude pulse voltammetric electronic tongue. Data preprocessing methods including Principal Component Analysis (PCA) and Fast Fourier Transform (FFT) were provided. An attribute characterization graph was presented to visualize the interactive response, in which each attribute elicits responses at specific electrodes and frequencies. The model was trained using signal data from the electronic tongue and attribute scores from artificial evaluation. The correlation coefficients for all attributes were over 0.9, indicating good predictive ability of the attributive analysis model preprocessed by FFT. This approach extracted more effective information about the linear relationship between the electronic tongue and taste flavor attributes. Results indicated that this approach can accurately quantify taste flavor attributes and can be an efficient tool for data processing in a voltammetric electronic tongue system.
    Graphical abstract: Schematic process for the visualized attributive analysis approach using a multifrequency large-amplitude pulse voltammetric electronic tongue for determination of rice taste flavor attributes: (a) sample; (b) sensors in electronic tongue; (c) excitation voltage program and response current signal from MLAPS; (d) similarity data matrix by data preprocessing and similarity extraction; (e) feature data matrix of attribute; (f) attribute characterization graph; (g) attribute scores predicted by the model.
    Highlights:
    • A multifrequency large-amplitude pulse voltammetric electronic tongue was used.
    • A visualized attributive analysis approach was created as an efficient tool for data processing.
    • Rice taste flavor attributes were determined and predicted.
    • The attribute characterization graph was presented for visualization of the

  13. The principles of electronic and electromechanic power conversion a systems approach

    CERN Document Server

    Ferreira, Braham

    2013-01-01

    Teaching the principles of power electronics and electromechanical power conversion through a unique top down systems approach, The Principles of Electromechanical Power Conversion takes the role and system context of power conversion functions as the starting point. Following this approach, the text defines the building blocks of the system and describes the theory of how they exchange power with each other. The authors introduce a modern, simple approach to machines, which makes the principles of field oriented control and space vector theory approachable to undergraduate students as well as

  14. Multiple scattering approach to the vibrational excitation of molecules by slow electrons

    International Nuclear Information System (INIS)

    Drukarev, G.

    1976-01-01

    An alternative approach to the problem of vibrational excitation of homonuclear diatomic molecules by slow electrons, possibly accompanied by rotational transitions, is presented, based on the picture of multiple scattering of an electron inside the molecule. Scattering by two fixed centers in the zero-range-potential model is considered. The results indicate that multiple scattering determines the order of magnitude of the vibrational excitation cross sections in the energy region under consideration, even when the zero-range-potential model is used. The connection between the multiple scattering approach and the quasi-stationary molecular ion picture is also established. 9 refs

  15. Quantum Geometry: Relativistic energy approach to cooperative electron-nuclear γ-transition spectrum

    Directory of Open Access Journals (Sweden)

    Ольга Юрьевна Хецелиус

    2014-11-01

    An advanced relativistic energy approach is presented and applied to calculating parameters of electron-nuclear γ-transition spectra (sets of vibrational and rotational satellites) of a nucleus in the atom. The intensities of the spectral satellites are defined in the relativistic version of the energy approach (S-matrix formalism) with gauge-invariant quantum-electrodynamical perturbation theory and the Dirac-Kohn-Sham density-functional zeroth approximation.

  16. Regression analysis with categorized regression calibrated exposure: some interesting findings

    Directory of Open Access Journals (Sweden)

    Hjartåker Anette

    2006-07-01

    Background: Regression calibration as a method for handling measurement error is becoming increasingly well known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g. quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods: We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on the quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results: In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution, thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is, however, vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion: Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a

  17. Advantages and Limitations of Anticipating Laboratory Test Results from Regression- and Tree-Based Rules Derived from Electronic Health-Record Data

    OpenAIRE

    Mohammad, Fahim; Theisen-Toupal, Jesse C.; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to thei...

  18. A modified linear algebraic approach to electron scattering using cubic splines

    International Nuclear Information System (INIS)

    Kinney, R.A.

    1986-01-01

    A modified linear algebraic approach to the solution of the Schrödinger equation for low-energy electron scattering is presented. The method uses a piecewise cubic-spline approximation of the wavefunction. Results in the static-potential and static-exchange approximations for e⁻+H s-wave scattering are compared with unmodified linear algebraic and variational linear algebraic methods. (author)

  19. Hot electrons and the approach to metallic behavior in Kx(KCl)1-x

    NARCIS (Netherlands)

    Silvestrelli, P.L.; Alavi, A.; Parrinello, M.; Frenkel, D.

    1996-01-01

    The approach to the metallic phase of molten Kx(KCl)1-x mixtures is studied using ab initio molecular dynamics based on finite-temperature density functional theory. The finite electronic temperature is found to result in new and unexpected effects. In particular, we observe a thermally induced

  20. Eikonal approach to the atomic break-up process by polarized electrons

    International Nuclear Information System (INIS)

    Onaga, Tomohide

    1992-01-01

    The cross-section asymmetry for ionization of hydrogen atoms by electron impact is analysed in the eikonal approach. A new formulation is given for the evaluation of the exchange amplitude up to higher partial Coulomb waves. It is concluded that the cross-section asymmetry provides an important criterion and an interesting test of the validity of approximation methods that include the exchange effect. (author)

  1. A Graphical, Self-Organizing Approach to Classifying Electronic Meeting Output.

    Science.gov (United States)

    Orwig, Richard E.; Chen, Hsinchun; Nunamaker, Jay F., Jr.

    1997-01-01

    Describes research using an artificial intelligence approach in the application of a Kohonen Self-Organizing Map (SOM) to the problem of classification of electronic brainstorming output and an evaluation of the results. The graphical representation of textual data produced by the Kohonen SOM suggests many opportunities for improving information…

  2. A New Approach in Teaching Power Electronics Control of Electrical Drives using Real-Time

    DEFF Research Database (Denmark)

    Teodorescu, Remus; Bech, Michael Møller; Blaabjerg, Frede

    2000-01-01

    A new approach in teaching power electronics and electrical drives is achieved at the Flexible Drives System Laboratory (FDSL) from Aalborg University by using the new Total Development Environment (TDE) concept that allows a full visual block-oriented programming of dynamic real-time systems...

  3. General approach to understanding the electronic structure of graphene on metals

    International Nuclear Information System (INIS)

    Voloshina, E N; Dedkov, Yu S

    2014-01-01

    This manuscript presents a general approach to understanding the connection between the bonding mechanism and the electronic structure of graphene on metals. To demonstrate its validity, two limiting cases of 'weakly' and 'strongly' bonded graphene on Al(111) and Ni(111) are considered, where the Dirac cone is preserved or fully destroyed, respectively. Furthermore, the electronic structure, i.e. doping level, hybridization effects, as well as gap formation at the Dirac point of the intermediate system, graphene/Cu(111), is fully understood in the framework of the proposed approach. This work summarises the long-standing debate regarding the connection between bonding strength and valence-band modification in graphene/metal systems, and paves the way for effective control of the electronic states of graphene in the vicinity of the Fermi level. (paper)

  4. Zeroth order regular approximation approach to electric dipole moment interactions of the electron

    Science.gov (United States)

    Gaul, Konstantin; Berger, Robert

    2017-07-01

    A quasi-relativistic two-component approach for an efficient calculation of P,T-odd interactions caused by a permanent electric dipole moment of the electron (eEDM) is presented. The approach uses a (two-component) complex generalized Hartree-Fock and a complex generalized Kohn-Sham scheme within the zeroth order regular approximation. In applications to select heavy-elemental polar diatomic molecular radicals, which are promising candidates for an eEDM experiment, the method is compared to relativistic four-component electron-correlation calculations and confirms values for the effective electric field acting on the unpaired electron for RaF, BaF, YbF, and HgF. The calculations show that purely relativistic effects, involving only the lower component of the Dirac bi-spinor, are well described by treating only the upper component explicitly.

  5. Electron microscopy approach for the visualization of the epithelial and endothelial glycocalyx.

    Science.gov (United States)

    Chevalier, L; Selim, J; Genty, D; Baste, J M; Piton, N; Boukhalfa, I; Hamzaoui, M; Pareige, P; Richard, V

    2017-06-01

    This study presents a methodological approach for the visualization of the glycocalyx by electron microscopy. The glycocalyx is a three-dimensional network mainly composed of glycolipids, glycoproteins and proteoglycans associated with the plasma membrane. Over the past decade, the epithelial and endothelial glycocalyx has been shown to play an important role in physiology and pathology, increasing research interest in it, especially with regard to vascular function. Visualization of the glycocalyx therefore requires reliable techniques, and its preservation remains challenging owing to its fragile and dynamic organization, which is highly sensitive to the various processing steps of electron microscopy sample preparation. In this study, chemical fixation was performed by perfusion as a good alternative to conventional fixation. Additional lanthanum nitrate in the fixative enhances staining of the glycocalyx in bright-field transmission electron microscopy and improves its visualization by detecting the elastically scattered electrons, thus providing a chemical contrast. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  6. Retro-regression--another important multivariate regression improvement.

    Science.gov (United States)

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.
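
    The "nightmare of the first kind" is easy to reproduce: with correlated descriptors, adding one descriptor shifts the coefficient of another that is already in the regression. A small synthetic demonstration (the descriptors below are invented, not connectivity indices):

```python
# Coefficient instability under stepwise regression with correlated descriptors.
import numpy as np

rng = np.random.default_rng(4)
d1 = rng.normal(size=50)
d2 = d1 + 0.1 * rng.normal(size=50)           # strongly correlated with d1
bp = 3.0 * d1 + rng.normal(0, 0.2, 50)        # stand-in "boiling points"

def fit(*descriptors):
    # Ordinary least squares with an intercept column.
    X = np.column_stack([np.ones(50), *descriptors])
    beta, *_ = np.linalg.lstsq(X, bp, rcond=None)
    return beta

print("d1 alone  :", np.round(fit(d1), 2))
print("d1 with d2:", np.round(fit(d1, d2), 2))   # d1's coefficient shifts
```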

  7. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
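
    With an identity weight matrix, the reduced rank regression estimate can be computed by projecting the OLS fitted values onto their leading singular directions. A minimal numpy sketch on synthetic low-rank data:

```python
# Reduced rank regression: OLS fit followed by a rank-r projection of the
# fitted values onto their leading singular directions.
import numpy as np

def rrr(X, Y, rank):
    beta_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    fitted = X @ beta_ols
    _, _, vt = np.linalg.svd(fitted, full_matrices=False)
    proj = vt[:rank].T @ vt[:rank]          # rank-r projector on response space
    return beta_ols @ proj                  # reduced-rank coefficient matrix

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 6))
B_true = rng.normal(size=(6, 1)) @ rng.normal(size=(1, 4))   # rank-1 truth
Y = X @ B_true + 0.1 * rng.normal(size=(200, 4))

B_hat = rrr(X, Y, rank=1)
print("rank of estimate:", np.linalg.matrix_rank(B_hat, tol=1e-8))
```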

  8. Fabrication Approaches to Interconnect Based Devices for Stretchable Electronics: A Review

    Directory of Open Access Journals (Sweden)

    Steven Nagels

    2018-03-01

    Stretchable electronics promise to naturalize the way that we are surrounded by and interact with our devices. Sensors that can stretch and bend, furthermore, have become increasingly relevant as the technology behind them matures rapidly from lab-based workflows to industrially applicable production principles. Regardless of the specific materials used, creating stretchable conductors involves either the implementation of strain reliefs through insightful geometric patterning, the dispersion of stiff conductive filler in an elastomeric matrix, or the employment of intrinsically stretchable conductive materials. These basic principles, however, have spawned a myriad of materials systems wherein future application engineers need to find their way. This paper reports a literature study on the spectrum of different approaches towards stretchable electronics, discusses standardization of characteristic tests together with their reporting, and estimates the maturity of each approach for industry. Patterned copper foils that are embedded in elastomeric sheets, which are closest to conventional electronic circuit processing, make up one end of the spectrum. Furthest from industry are the more recent circuits based on intrinsically stretchable liquid metals. These show extremely promising results; however, as a technology, liquid metal is not mature enough to be adopted. Printing makes up the transition between both ends, and is also well established on an industrial level, but traditionally not linked to creating electronics. Even though a certain level of maturity was found amongst the approaches that are reviewed herein, industrial adoption for consumer electronics remains unpredictable without a designated breakthrough commercial application.

  9. Fabrication Approaches to Interconnect Based Devices for Stretchable Electronics: A Review.

    Science.gov (United States)

    Nagels, Steven; Deferme, Wim

    2018-03-03

    Stretchable electronics promise to naturalize the way that we are surrounded by and interact with our devices. Sensors that can stretch and bend, furthermore, have become increasingly relevant as the technology behind them matures rapidly from lab-based workflows to industrially applicable production principles. Regardless of the specific materials used, creating stretchable conductors involves either the implementation of strain reliefs through insightful geometric patterning, the dispersion of stiff conductive filler in an elastomeric matrix, or the employment of intrinsically stretchable conductive materials. These basic principles, however, have spawned a myriad of materials systems wherein future application engineers need to find their way. This paper reports a literature study on the spectrum of different approaches towards stretchable electronics, discusses standardization of characteristic tests together with their reporting, and estimates the maturity of each approach for industry. Patterned copper foils that are embedded in elastomeric sheets, which are closest to conventional electronic circuit processing, make up one end of the spectrum. Furthest from industry are the more recent circuits based on intrinsically stretchable liquid metals. These show extremely promising results; however, as a technology, liquid metal is not mature enough to be adopted. Printing makes up the transition between both ends, and is also well established on an industrial level, but traditionally not linked to creating electronics. Even though a certain level of maturity was found amongst the approaches that are reviewed herein, industrial adoption for consumer electronics remains unpredictable without a designated breakthrough commercial application.

  10. From Rasch scores to regression

    DEFF Research Database (Denmark)

    Christensen, Karl Bang

    2006-01-01

    Rasch models provide a framework for measurement and modelling of latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties. This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables, and latent regression models based on the distribution of the score.

  11. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Directory of Open Access Journals (Sweden)

    Fahim Mohammad

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUC). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.

  12. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Science.gov (United States)

    Mohammad, Fahim; Theisen-Toupal, Jesse C; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.
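
    The evaluation loop described above reduces to computing PPV, NPV, and ROC AUC for each candidate rule. A minimal scikit-learn sketch with synthetic labels and rule scores standing in for real predictions:

```python
# PPV, NPV, and ROC AUC for one candidate rule, with synthetic data.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(6)
truth = rng.integers(0, 2, 1000)                       # 1 = abnormal result
score = truth * 0.6 + rng.uniform(0, 1, 1000) * 0.7    # rule's confidence
pred = (score > 0.6).astype(int)

tn, fp, fn, tp = confusion_matrix(truth, pred).ravel()
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"PPV={ppv:.3f} NPV={npv:.3f} AUC={roc_auc_score(truth, score):.3f}")
# The paper's stringent screen keeps rules with PPV and/or NPV >= 0.95.
print("passes screen:", ppv >= 0.95 or npv >= 0.95)
```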

  13. Direct electron transfer: an approach for electrochemical biosensors with higher selectivity and sensitivity

    Directory of Open Access Journals (Sweden)

    Freire Renato S.

    2003-01-01

    The most promising approach for the development of electrochemical biosensors is to establish a direct electrical communication between the biomolecules and the electrode surface. This review focuses on advances, directions and strategies in the development of third generation electrochemical biosensors. Subjects covered include a brief description of the fundamentals of the electron transfer phenomenon and amperometric biosensor development (different types and new oriented enzyme immobilization techniques). Special attention is given to different redox enzymes and proteins capable of electrocatalyzing reactions via direct electron transfer. The analytical applications and future trends for third generation biosensors are also presented and discussed.

  14. The effect of different electrodes on the electronic transmission of benzene junctions: Analytical approach

    Energy Technology Data Exchange (ETDEWEB)

    Mohebbi, Razie; Seyed-Yazdi, Jamileh, E-mail: j.seyedyazdi@vru.ac.ir

    2016-06-01

    In this paper we have investigated the electronic transmission of electrode–benzene–electrode systems using the Landauer approach. The effect of different electrodes made of a metal (Au) and of semiconductors (Si, TiO2) is investigated. The three electrodes are compared, and the results show that with semiconductor electrodes the electronic transmission of benzene junctions exhibits a gap, which is due to the electrodes' band gap. As a consequence, a threshold voltage is necessary to obtain conducting channels.
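
    The Landauer picture can be sketched numerically with a tight-binding chain standing in for the molecule and wide-band leads, computing T(E) = Tr[Γ_L G Γ_R G†]. All parameters below are illustrative; a semiconducting electrode would enter through an energy-dependent self-energy rather than the constant broadening used here:

```python
# Landauer transmission through a tight-binding "molecule" between
# wide-band leads: T(E) = Tr[Gamma_L G Gamma_R G^dagger].
import numpy as np

n = 4                                   # sites standing in for the molecule
t = -1.0                                # hopping (eV)
H = t * (np.eye(n, k=1) + np.eye(n, k=-1))

gamma = 0.5                             # lead broadening (wide-band limit, eV)
sigma_L = np.zeros((n, n), complex); sigma_L[0, 0] = -0.5j * gamma
sigma_R = np.zeros((n, n), complex); sigma_R[-1, -1] = -0.5j * gamma
Gamma_L = 1j * (sigma_L - sigma_L.conj().T)
Gamma_R = 1j * (sigma_R - sigma_R.conj().T)

for E in (-0.5, 0.0, 0.5):
    G = np.linalg.inv(E * np.eye(n) - H - sigma_L - sigma_R)  # retarded GF
    T = np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T).real
    print(f"E = {E:+.1f} eV  T(E) = {T:.3f}")
```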

  15. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-01-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200), were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282
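
    The fusion step can be illustrated in a few lines: two conditional PDFs of electron density are multiplied into a posterior whose mean is the MSE-optimal estimate. The Gaussian forms and all numbers below are purely illustrative:

```python
# Fusing two conditional PDFs of electron density into a posterior and
# taking its mean (the MSE-optimal estimate). Values are illustrative.
import numpy as np

hu = np.linspace(-200, 1500, 2000)                 # density grid (HU)

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

p_intensity = gaussian(hu, 350.0, 150.0)           # p(density | T1 intensity)
p_geometry = gaussian(hu, 250.0, 100.0)            # p(density | atlas geometry)

posterior = p_intensity * p_geometry               # combine the two sources
posterior /= np.trapz(posterior, hu)               # normalize
print("posterior mean HU: %.1f" % np.trapz(hu * posterior, hu))
```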

  16. Integrated Nationwide Electronic Health Records system: Semi-distributed architecture approach.

    Science.gov (United States)

    Fragidis, Leonidas L; Chatzoglou, Prodromos D; Aggelidis, Vassilios P

    2016-11-14

    The integration of heterogeneous electronic health record systems by building an interoperable nationwide electronic health record system provides indisputable benefits in health care, such as superior health information quality, prevention of medical errors, and cost savings. This paper proposes a semi-distributed system architecture approach for an integrated national electronic health record system, incorporating the advantages of the two dominant approaches, the centralized architecture and the distributed architecture. The high-level design of the main elements of the proposed architecture is provided, along with diagrams of execution and operation and the data synchronization architecture for the proposed solution. The proposed approach effectively handles issues related to redundancy, consistency, security, privacy, availability, load balancing, maintainability, complexity and interoperability of citizens' health data. The proposed semi-distributed architecture offers a robust interoperability framework without requiring healthcare providers to change their local EHR systems. It is a pragmatic approach that takes into account the characteristics of the Greek national healthcare system and the national public administration data communication network infrastructure, to achieve EHR integration at acceptable implementation cost.

  17. Use of Multiple Linear Regression Models for Setting Water Quality Criteria for Copper: A Complementary Approach to the Biotic Ligand Model.

    Science.gov (United States)

    Brix, Kevin V; DeForest, David K; Tear, Lucinda; Grosell, Martin; Adams, William J

    2017-05-02

    Biotic Ligand Models (BLMs) for metals are widely applied in ecological risk assessments and in the development of regulatory water quality guidelines in Europe, and in 2007 the United States Environmental Protection Agency (USEPA) recommended BLM-based water quality criteria (WQC) for Cu in freshwater. However, to date, few states have adopted BLM-based Cu criteria into their water quality standards on a state-wide basis, which appears to be due to the perception that the BLM is too complicated or requires too many input variables. Using the mechanistic BLM framework to first identify key water chemistry parameters that influence Cu bioavailability, namely dissolved organic carbon (DOC), pH, and hardness, we developed Cu criteria using the same basic methodology used by the USEPA to derive hardness-based criteria but with the addition of DOC and pH. As an initial proof of concept, we developed stepwise multiple linear regression (MLR) models for species that have been tested over wide ranges of DOC, pH, and hardness conditions. These models predicted acute Cu toxicity values that were within a factor of ±2 in 77% to 97% of tests (5 species had adequate data) and chronic Cu toxicity values that were within a factor of ±2 in 92% of tests (1 species had adequate data). This level of accuracy is comparable to the BLM. Following USEPA guidelines for WQC development, the species data were then combined to develop a linear model with pooled slopes for each independent parameter (i.e., DOC, pH, and hardness) and species-specific intercepts using Analysis of Covariance. The pooled MLR and BLM models predicted species-specific toxicity with similar precision; adjusted R² and R² values ranged from 0.56 to 0.86 and from 0.66 to 0.85, respectively. Graphical exploration of relationships between predicted and observed toxicity, residuals and observed toxicity, and residuals and concentrations of key input parameters revealed many similarities and a few key distinctions between the
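
    The shape of such an MLR model is a linear fit of ln(toxicity) on ln(DOC), pH, and ln(hardness). A minimal sketch with invented data and coefficients (the paper additionally fits species-specific intercepts by analysis of covariance):

```python
# MLR of ln(EC50) on ln(DOC), pH, and ln(hardness); data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n = 60
doc = rng.uniform(0.5, 20, n)          # mg/L dissolved organic carbon
ph = rng.uniform(6.0, 8.5, n)
hardness = rng.uniform(10, 400, n)     # mg/L as CaCO3
ln_ec50 = (0.8 * np.log(doc) + 0.5 * ph + 0.2 * np.log(hardness)
           - 2.0 + rng.normal(0, 0.3, n))   # invented "true" relationship

X = np.column_stack([np.ones(n), np.log(doc), ph, np.log(hardness)])
beta, *_ = np.linalg.lstsq(X, ln_ec50, rcond=None)
print("intercept, ln(DOC), pH, ln(hardness):", np.round(beta, 3))
```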

  18. A bioelectrochemical approach to characterize extracellular electron transfer by Synechocystis sp. PCC6803.

    Directory of Open Access Journals (Sweden)

    Angelo Cereda

    Biophotovoltaic devices employ photosynthetic organisms at the anode of a microbial fuel cell to generate electrical power. Although a range of cyanobacteria and algae have been shown to generate photocurrent in devices of a multitude of architectures, mechanistic understanding of extracellular electron transfer by phototrophs remains minimal. Here we describe a mediatorless bioelectrochemical device to measure the electrogenic output of a planktonically grown cyanobacterium, Synechocystis sp. PCC6803. Light-dependent production of current is measured, and its magnitude is shown to scale with microbial cell concentration and light intensity. Bioelectrochemical characterization of a Synechocystis mutant lacking Photosystem II demonstrates conclusively that production of the majority of photocurrent requires a functional water-splitting apparatus and that electrons are likely ultimately derived from water. This shows the potential of the device to rapidly and quantitatively characterize photocurrent production by genetically modified strains, an approach that can be used in future studies to delineate the mechanisms of cyanobacterial extracellular electron transport.

  19. Reconstructing Regional Ionospheric Electron Density: A Combined Spherical Slepian Function and Empirical Orthogonal Function Approach

    Science.gov (United States)

    Farzaneh, Saeed; Forootan, Ehsan

    2018-03-01

    The computerized ionospheric tomography is a method for imaging the Earth's ionosphere using a sounding technique and computing the slant total electron content (STEC) values from data of the global positioning system (GPS). The most common approach for ionospheric tomography is the voxel-based model, in which (1) the ionosphere is divided into voxels, (2) the STEC is then measured along (many) satellite signal paths, and finally (3) an inversion procedure is applied to reconstruct the electron density distribution of the ionosphere. In this study, a computationally efficient approach is introduced, which improves the inversion procedure of step 3. Our proposed method combines empirical orthogonal functions and spherical Slepian base functions to describe the vertical and horizontal distribution of electron density, respectively. Thus, it can be applied in regional and global case studies. Numerical application is demonstrated using ground-based GPS data over South America. Our results are validated against ionospheric tomography obtained from constellation observing system for meteorology, ionosphere, and climate (COSMIC) observations and the global ionosphere map estimated by international centers, as well as by comparison with STEC derived from independent GPS stations. Using the proposed approach, we find that while using 30 GPS measurements in South America, one can achieve accuracy comparable with that of COSMIC data within the reported accuracy (1 × 10¹¹ el/cm³) of the product. Comparisons with real observations of two GPS stations indicate that the absolute difference is less than 2 TECU (where 1 total electron content unit, TECU, is 10¹⁶ electrons/m²).
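
    The vertical part of the method, extracting empirical orthogonal functions from a set of electron-density profiles, is a plain SVD of the profile anomalies. A sketch with synthetic Chapman-like profiles standing in for real GPS/COSMIC data:

```python
# EOF extraction from synthetic electron-density profiles via SVD.
import numpy as np

alt = np.linspace(100, 1000, 90)                       # altitude grid, km
rng = np.random.default_rng(8)
profiles = np.array([
    np.exp(1 - (alt - h0) / 60 - np.exp(-(alt - h0) / 60))  # Chapman-like layer
    for h0 in rng.uniform(250, 350, 40)                     # varying peak height
])

mean = profiles.mean(axis=0)
anomalies = profiles - mean
_, s, vt = np.linalg.svd(anomalies, full_matrices=False)   # rows of vt = EOFs
explained = s ** 2 / np.sum(s ** 2)
print("variance captured by first 3 EOFs: %.1f%%" % (100 * explained[:3].sum()))
```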

  20. Consistent quantum approach to new laser-electron-nuclear effects in diatomic molecules

    International Nuclear Information System (INIS)

    Glushkov, A V; Malinovskaya, S V; Loboda, A V; Shpinareva, I M; Prepelitsa, G P

    2006-01-01

    We present a consistent quantum approach to the calculation of electron-nuclear γ-spectra (sets of vibrational and rotational satellites) for nuclei in diatomic molecules. The approach generalizes the well-known Letokhov-Minogin model and is based on the Dunham model potential approximation for the potential curves of diatomic molecules. The method is applied to the calculation of probabilities of vibration-rotation-nuclear transitions in the case of the emission and absorption spectrum of the nucleus ¹²⁷I (Eγ(0) = 203 keV) bound in the molecule H¹²⁷I.

  1. The Nuisance of Nuisance Regression: Spectral Misspecification in a Common Approach to Resting-State fMRI Preprocessing Reintroduces Noise and Obscures Functional Connectivity

    OpenAIRE

    Hallquist, Michael N.; Hwang, Kai; Luna, Beatriz

    2013-01-01

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent...
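
    The remedy the abstract points toward is to apply the same band-pass filter to the fMRI time series and to the nuisance regressors before the regression, so that the regression cannot reintroduce out-of-band noise. A hedged Python sketch follows (synthetic data; the 0.009-0.08 Hz band is a typical resting-state choice, not taken from the paper):

        import numpy as np
        from scipy.signal import butter, filtfilt

        tr = 2.0                                  # repetition time (s)
        b, a = butter(2, [0.009, 0.08], btype='bandpass', fs=1.0 / tr)

        rng = np.random.default_rng(1)
        n_t = 300
        y = rng.standard_normal(n_t)              # voxel time series (synthetic)
        nuis = rng.standard_normal((n_t, 6))      # e.g., six motion parameters

        y_f = filtfilt(b, a, y)
        nuis_f = filtfilt(b, a, nuis, axis=0)     # filter the regressors identically

        # Regress the filtered nuisance signals out of the filtered data
        beta, *_ = np.linalg.lstsq(nuis_f, y_f, rcond=None)
        residual = y_f - nuis_f @ beta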

  2. A New Predictive Model Based on the ABC Optimized Multivariate Adaptive Regression Splines Approach for Predicting the Remaining Useful Life in Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Paulino José García Nieto

    2016-05-01

    Full Text Available Remaining useful life (RUL) estimation is considered one of the most central points in prognostics and health management (PHM). The present paper describes a nonlinear hybrid ABC–MARS-based model for the prediction of the remaining useful life of aircraft engines. Indeed, it is well known that an accurate RUL estimation allows failure prevention in a more controllable way, so that effective maintenance can be carried out in time to correct impending faults. The proposed hybrid model combines multivariate adaptive regression splines (MARS), which have been successfully adopted for regression problems, with the artificial bee colony (ABC) technique. This optimization technique involves parameter setting in the MARS training procedure, which significantly influences the regression accuracy. However, its use in reliability applications has not yet been widely explored. Bearing this in mind, remaining useful life values have been successfully predicted here using the hybrid ABC–MARS-based model from the remaining measured parameters (input variables) for aircraft engines. A correlation coefficient equal to 0.92 was obtained when this hybrid ABC–MARS-based model was applied to experimental data. The agreement of this model with experimental data confirmed its good performance. The main advantage of this predictive model is that it does not require information about the previous operational states of the aircraft engine.
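
    As a rough illustration of the hybrid idea (MARS hyperparameters tuned by a population-based search), the sketch below assumes the third-party py-earth package for the Earth estimator and uses a drastically simplified random search in place of the full ABC algorithm, whose employed/onlooker/scout bee phases are omitted:

        import numpy as np
        from pyearth import Earth                       # assumed MARS implementation
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        X = rng.random((200, 5))                        # synthetic engine parameters
        y = X[:, 0] ** 2 + X[:, 1] + 0.1 * rng.standard_normal(200)   # synthetic RUL

        best_score, best_params = -np.inf, None
        for _ in range(20):                             # candidate "food sources"
            params = {'max_degree': int(rng.integers(1, 4)),
                      'penalty': float(rng.uniform(1.0, 6.0))}
            score = cross_val_score(Earth(**params), X, y, cv=3, scoring='r2').mean()
            if score > best_score:
                best_score, best_params = score, params
        print(best_params, best_score)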

  3. Discrimination and characterization of strawberry juice based on electronic nose and tongue: comparison of different juice processing approaches by LDA, PLSR, RF, and SVM.

    Science.gov (United States)

    Qiu, Shanshan; Wang, Jun; Gao, Liping

    2014-07-09

    An electronic nose (E-nose) and an electronic tongue (E-tongue) have been used to characterize five types of strawberry juices based on processing approaches (i.e., microwave pasteurization, steam blanching, high temperature short time pasteurization, frozen-thawed, and freshly squeezed). Juice quality parameters (vitamin C, pH, total soluble solids, total acid, and sugar/acid ratio) were determined by traditional measuring methods. Multivariate statistical methods (linear discriminant analysis (LDA) and partial least squares regression (PLSR)) and machine learning methods (random forest (RF) and support vector machines (SVM)) were employed for qualitative classification and quantitative regression. The E-tongue system reached higher accuracy rates than the E-nose, and their simultaneous utilization showed an advantage in LDA classification and PLSR regression. According to cross-validation, RF showed outstanding and indisputable performance in both the qualitative and the quantitative analysis. This work indicates that the simultaneous utilization of an E-nose and an E-tongue can successfully discriminate processed fruit juices and predict quality parameters for the beverage industry.
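
    A minimal sketch of the two analysis branches (synthetic data, not the paper's measurements): LDA to classify the five processing approaches and PLSR to predict a quality parameter from fused E-nose/E-tongue features, both evaluated by cross-validation as in the study:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((150, 20))                   # fused sensor features (synthetic)
        juice_type = rng.integers(0, 5, size=150)   # five processing approaches
        vitamin_c = X[:, :3].sum(axis=1) + 0.1 * rng.standard_normal(150)

        acc = cross_val_score(LinearDiscriminantAnalysis(), X, juice_type, cv=5).mean()
        r2 = cross_val_score(PLSRegression(n_components=5), X, vitamin_c, cv=5,
                             scoring='r2').mean()
        print(f"LDA accuracy: {acc:.2f}, PLSR R^2: {r2:.2f}")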

  4. Continuum multiple-scattering approach to electron-molecule scattering and molecular photoionization

    International Nuclear Information System (INIS)

    Dehmer, J.L.; Dill, D.

    1979-01-01

    The multiple-scattering approach to the electronic continuum of molecules is described. The continuum multiple-scattering model (CMSM) was developed as a survey tool and, as such, was required to satisfy two requirements. First, it had to have a very broad scope, meaning (i) molecules of arbitrary geometry and complexity containing any atom in the periodic system, (ii) continuum electron energies from 0 to 1000 eV, and (iii) the capability to treat a large range of processes involving both photoionization and electron scattering. Second, the structure of the theory was required to lend itself to a transparent, physical interpretation of major spectral features such as shape resonances. A comprehensive theoretical framework for the continuum multiple-scattering method is presented, as well as its applications to electron-molecule scattering and molecular photoionization. Highlights of recent applications in these two areas are reviewed. The major impact of the resulting studies over the last few years has been to establish the importance of shape resonances in electron collisions and photoionization of practically all (non-hydride) molecules.

  5. Canonical variate regression.

    Science.gov (United States)

    Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun

    2016-07-01

    In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study.
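
    For intuition only, a two-step approximation can be sketched in Python: CCA to extract canonical variates from the two views, then a regression of the outcome on the variates. Note that this is not the authors' unified criterion, which fits both parts jointly:

        import numpy as np
        from sklearn.cross_decomposition import CCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        n = 100
        latent = rng.standard_normal((n, 2))                     # shared structure
        X1 = latent @ rng.standard_normal((2, 10)) + 0.5 * rng.standard_normal((n, 10))
        X2 = latent @ rng.standard_normal((2, 8)) + 0.5 * rng.standard_normal((n, 8))
        y = latent[:, 0] + 0.1 * rng.standard_normal(n)          # outcome

        U, V = CCA(n_components=2).fit(X1, X2).transform(X1, X2) # canonical variates
        Z = np.hstack([U, V])
        print(LinearRegression().fit(Z, y).score(Z, y))          # joint predictive power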

  6. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: "This book is ... an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  7. Logic regression and its extensions.

    Science.gov (United States)

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature.
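
    A toy Python illustration of the core idea (far simpler than real logic regression, which searches full logic trees by simulated annealing): enumerate pairwise AND/OR combinations of binary predictors and keep the term that best explains a binary outcome in a logistic model. The data and the hidden rule are invented:

        import itertools
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        X = rng.integers(0, 2, size=(300, 6))               # binary SNP-like predictors
        y = ((X[:, 0] & X[:, 1]) | X[:, 4]).astype(int)     # hidden logic rule

        best = (-np.inf, None)
        for (i, j), op in itertools.product(itertools.combinations(range(6), 2),
                                            ('and', 'or')):
            term = (X[:, i] & X[:, j]) if op == 'and' else (X[:, i] | X[:, j])
            score = cross_val_score(LogisticRegression(), term.reshape(-1, 1), y,
                                    cv=5).mean()
            if score > best[0]:
                best = (score, f"X{i} {op} X{j}")
        print(best)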

  8. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
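
    One common construction (an assumption for illustration, not necessarily the single best approach the article argues for) is to fit the logistic model on z-scored predictors, so that each coefficient is the change in log-odds per standard deviation of its predictor:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 3)) * [1.0, 10.0, 0.1]   # predictors on mixed scales
        y = (X[:, 0] + 0.2 * X[:, 1] + rng.logistic(size=500) > 0).astype(int)

        Xz = StandardScaler().fit_transform(X)                 # z-score each predictor
        model = LogisticRegression().fit(Xz, y)
        print("standardized coefficients:", model.coef_.ravel())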

  9. Nanotubule and Tour Molecule Based Molecular Electronics: Suggestion for a Hybrid Approach

    Science.gov (United States)

    Srivastava, Deepak; Saini, Subhash (Technical Monitor)

    1998-01-01

    Recent experimental and theoretical attempts and results indicate two distinct broad pathways towards future molecular electronic devices and architectures. The first is the approach via Tour-type ladder molecules and their junctions, which can be fabricated with solution-phase chemical approaches. Second are fullerenes or nanotubules and their junctions, which may have better conductance, switching and amplifying characteristics but cannot be made through well controlled and defined chemical means. A hybrid approach combining the two pathways to take advantage of the characteristics of both is suggested. The dimension and scale of such devices would be somewhere in between isolated-molecule and nanotubule-based devices, but it may be possible to use self-assembly towards larger functional and logical units.

  10. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
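
    A small Python sketch of two concepts the article stresses, an interaction term and a multicollinearity check, using the statsmodels package (an assumed dependency) on synthetic data:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf
        from statsmodels.stats.outliers_influence import variance_inflation_factor

        rng = np.random.default_rng(0)
        df = pd.DataFrame({'x1': rng.standard_normal(200)})
        df['x2'] = 0.8 * df['x1'] + 0.6 * rng.standard_normal(200)  # correlated predictor
        df['y'] = (2 * df['x1'] + df['x2'] + 0.5 * df['x1'] * df['x2']
                   + rng.standard_normal(200))

        fit = smf.ols('y ~ x1 * x2', data=df).fit()   # x1, x2, and the x1:x2 interaction
        print(fit.params)

        exog = sm.add_constant(df[['x1', 'x2']])      # VIFs flag multicollinearity
        print([variance_inflation_factor(exog.values, i) for i in range(1, exog.shape[1])])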

  11. The impact of meteorology on the occurrence of waterborne outbreaks of vero cytotoxin-producing Escherichia coli (VTEC): a logistic regression approach.

    Science.gov (United States)

    O'Dwyer, Jean; Morris Downes, Margaret; Adley, Catherine C

    2016-02-01

    This study analyses the relationship between meteorological phenomena and outbreaks of waterborne-transmitted vero cytotoxin-producing Escherichia coli (VTEC) in the Republic of Ireland over an 8-year period (2005-2012). Data pertaining to the notification of waterborne VTEC outbreaks were extracted from the Computerised Infectious Disease Reporting system, which is administered through the national Health Protection Surveillance Centre as part of the Health Service Executive. Rainfall and temperature data were obtained from the national meteorological office and categorised as cumulative rainfall, heavy rainfall events in the previous 7 days, and mean temperature. Regression analysis was performed using logistic regression (LR). The LR model was significant (p < 0.001), with all independent variables (cumulative rainfall, heavy rainfall and mean temperature) making a statistically significant contribution to the model. The study has found that rainfall, particularly heavy rainfall in the 7 days preceding an outbreak, is a strong statistical indicator of a waterborne outbreak and that temperature also affects the occurrence of waterborne VTEC outbreaks.
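
    The model form described can be reproduced schematically in Python with statsmodels; all values below are invented, not the study's data. Weekly outbreak occurrence (0/1) is regressed on cumulative rainfall, heavy-rainfall events in the prior seven days, and mean temperature:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 400
        X = np.column_stack([rng.gamma(2, 10, n),     # cumulative rainfall (mm)
                             rng.poisson(1.0, n),     # heavy rainfall events, last 7 d
                             rng.normal(10, 4, n)])   # mean temperature (deg C)
        logit_p = -4 + 0.02 * X[:, 0] + 0.8 * X[:, 1] + 0.05 * X[:, 2]
        y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(fit.summary())                          # coefficients with p-values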

  12. Confinement effects on electron and phonon degrees of freedom in nanofilm superconductors: A Green function approach

    Science.gov (United States)

    Saniz, R.; Partoens, B.; Peeters, F. M.

    2013-02-01

    The Green function approach to the Bardeen-Cooper-Schrieffer theory of superconductivity is used to study nanofilms. We go beyond previous models and include effects of confinement on the strength of the electron-phonon coupling as well as on the electronic spectrum and on the phonon modes. Within our approach, we find that in ultrathin films, confinement effects on the electronic screening become very important. Indeed, contrary to what has been advanced in recent years, the sudden increases of the density of states when new bands start to be occupied as the film thickness increases, tend to suppress the critical temperature rather than to enhance it. On the other hand, the increase of the number of phonon modes with increasing number of monolayers in the film leads to an increase in the critical temperature. As a consequence, the superconducting critical parameters in such nanofilms are determined by these two competing effects. Furthermore, in sufficiently thin films, the condensate consists of well-defined subcondensates associated with the occupied bands, each with a distinct coherence length. The subcondensates can interfere constructively or destructively giving rise to an interference pattern in the Cooper pair probability density.

  13. Coupled forward-backward trajectory approach for nonequilibrium electron-ion dynamics

    Science.gov (United States)

    Sato, Shunsuke A.; Kelly, Aaron; Rubio, Angel

    2018-04-01

    We introduce a simple ansatz for the wave function of a many-body system based on coupled forward and backward propagating semiclassical trajectories. This method is primarily aimed at, but not limited to, treating nonequilibrium dynamics in electron-phonon systems. The time evolution of the system is obtained from the Euler-Lagrange variational principle, and we show that this ansatz yields Ehrenfest mean-field theory in the limit that the forward and backward trajectories are orthogonal, and in the limit that they coalesce. We investigate accuracy and performance of this method by simulating electronic relaxation in the spin-boson model and the Holstein model. Although this method involves only pairs of semiclassical trajectories, it shows a substantial improvement over mean-field theory, capturing quantum coherence of nuclear dynamics as well as electron-nuclear correlations. This improvement is particularly evident in nonadiabatic systems, where the accuracy of this coupled trajectory method extends well beyond the perturbative electron-phonon coupling regime. This approach thus provides an attractive route forward to the ab initio description of relaxation processes, such as thermalization, in condensed phase systems.

  14. Oil condition monitoring of gears onboard ships using a regression approach for multivariate T2 control charts

    DEFF Research Database (Denmark)

    Henneberg, Morten; Jørgensen, Bent; Eriksen, René Lynge

    2016-01-01

    In this paper, we present an oil condition and wear debris evaluation method for ship thruster gears using T2 statistics to form control charts from a multi-sensor platform. The proposed method takes into account the different ambient conditions by multiple linear regression on the mean value… only quasi-stationary data are included in phase I of the T2 statistics. Data from two thruster gears onboard two different ships are presented and analyzed, and the selection of the phase I data size is discussed. A graphic overview for quick localization of T2 signaling is also demonstrated using spider plots. Finally, progression and trending of the T2 statistics are investigated using orthogonal polynomials for a fixed-size data window.
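
    A bare-bones Python sketch of the monitoring idea (synthetic data; control limits, phase I data selection, and spider plots are omitted): regress each sensor channel on the ambient conditions, then track the Hotelling T2 statistic of the residuals:

        import numpy as np

        rng = np.random.default_rng(0)
        ambient = rng.random((500, 2))                     # e.g., oil temperature, load
        sensors = ambient @ rng.random((2, 4)) + 0.1 * rng.standard_normal((500, 4))

        # Remove the ambient-condition trend by multiple linear regression
        A = np.column_stack([np.ones(500), ambient])
        beta, *_ = np.linalg.lstsq(A, sensors, rcond=None)
        resid = sensors - A @ beta

        # Phase I: estimate mean and covariance; then T2 for each observation
        mu = resid.mean(axis=0)
        Sinv = np.linalg.inv(np.cov(resid, rowvar=False))
        t2 = np.einsum('ij,jk,ik->i', resid - mu, Sinv, resid - mu)
        print(t2[:5])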

  15. Theoretical study of molecular vibrations in electron momentum spectroscopy experiments on furan: An analytical versus a molecular dynamical approach

    International Nuclear Information System (INIS)

    Morini, Filippo; Deleuze, Michael S.; Watanabe, Noboru; Takahashi, Masahiko

    2015-01-01

    The influence of thermally induced nuclear dynamics (molecular vibrations) in the initial electronic ground state on the valence orbital momentum profiles of furan has been theoretically investigated using two different approaches. The first of these approaches employs the principles of Born-Oppenheimer molecular dynamics, whereas the so-called harmonic analytical quantum mechanical approach resorts to an analytical decomposition of contributions arising from quantized harmonic vibrational eigenstates. In spite of their intrinsic differences, the two approaches enable consistent insights into the electron momentum distributions inferred from new measurements employing electron momentum spectroscopy and an electron impact energy of 1.2 keV. Both approaches point out in particular an appreciable influence of a few specific molecular vibrations of A1 symmetry on the 9a1 momentum profile, which can be unravelled from considerations on the symmetry characteristics of orbitals and their energy spacing.

  16. Electron-impact ionization of oriented molecules using the time-dependent close-coupling approach

    Energy Technology Data Exchange (ETDEWEB)

    Colgan, J [Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Pindzola, M S, E-mail: jcolgan@lanl.gov [Department of Physics, Auburn University, Auburn, AL 36849 (United States)

    2011-04-01

    An overview is given on recent progress on computing triple differential cross sections for electron-impact ionization of the hydrogen molecule using a time-dependent close-coupling approach. Our calculations, when averaged over all molecular orientations, are generally in very good agreement with (e,2e) measurements made on H{sub 2}, where the molecular orientation is unknown, for a range of incident energies and outgoing electron angles and energies. In this paper, we present TDCS for ionization of H{sub 2} at specific molecular orientations. It is hoped that this study will help stimulate future measurements of TDCS from oriented H{sub 2} at medium impact energies.

  17. The electronic structure of molecules by a many-body approach. Pt. 1

    International Nuclear Information System (INIS)

    Niessen, W. von; Cederbaum, L.S.; Kraemer, W.P.

    1976-01-01

    The ionization potentials of benzene are studied by an ab initio many-body approach which includes the effects of electron correlation and reorganization beyond the one-particle approximation. The calculations confirm the assignment of the photoelectron spectrum experimentally proposed by Jonsson and Lindholm: 1e1g(π), 2e2g, 1a2u(π), 2e1u, 1b2u, 1b1u, 2a1g, 1e2g in order of increasing binding energy. To definitively establish the ordering of the ionization potentials in the second band, which has been very controversial, the corresponding vibrational structure has been calculated. A number of one-electron properties are calculated in the one-particle approximation and compared to experimental work and other theoretical calculations. (orig.)

  18. Phase-space description of wave packet approach to electronic transport in nanoscale systems

    International Nuclear Information System (INIS)

    Szydłowski, D; Wołoszyn, M; Spisak, B J

    2013-01-01

    The dynamics of conduction electrons in resonant tunnelling nanosystems is studied within the phase-space approach based on the Wigner distribution function. The time evolution of the distribution function is calculated from the time-dependent quantum kinetic equation for which an effective numerical method is presented. Calculations of the transport properties of a double-barrier resonant tunnelling diode are performed to illustrate the proposed techniques. Additionally, analysis of the transient effects in the nanosystem is carried out and it is shown that for some range of the bias voltage the temporal variations of electronic current can take negative values. The explanation of this effect is based on the analysis of the time changes of the Wigner distribution function. The decay time of the temporal current oscillations in the nanosystem as a function of the bias voltage is determined. (paper)

  19. A real-space stochastic density matrix approach for density functional electronic structure.

    Science.gov (United States)

    Beck, Thomas L

    2015-12-21

    The recent development of real-space grid methods has led to more efficient, accurate, and adaptable approaches for large-scale electrostatics and density functional electronic structure modeling. With the incorporation of multiscale techniques, linear-scaling real-space solvers are possible for density functional problems if localized orbitals are used to represent the Kohn-Sham energy functional. These methods still suffer from high computational and storage overheads, however, due to extensive matrix operations related to the underlying wave function grid representation. In this paper, an alternative stochastic method is outlined that aims to solve directly for the one-electron density matrix in real space. In order to illustrate aspects of the method, model calculations are performed for simple one-dimensional problems that display some features of the more general problem, such as spatial nodes in the density matrix. This orbital-free approach may prove helpful considering a future involving increasingly parallel computing architectures. Its primary advantage is the near-locality of the random walks, allowing for simultaneous updates of the density matrix in different regions of space partitioned across the processors. In addition, it allows for testing and enforcement of the particle number and idempotency constraints through stabilization of a Feynman-Kac functional integral as opposed to the extensive matrix operations in traditional approaches.

  20. A new approach in the development of quality management systems for (micro)electronics

    Science.gov (United States)

    Bacivarov, Ioan C.; Bacivarov, Angelica; Gherghina, Cǎtǎlina

    2016-12-01

    This paper presents a new approach to the analysis of the quality management systems (QMS) of companies, based on the revised standard ISO 9001:2015. In the first part of the paper, QMS based on ISO 9001 certification are introduced, and the changes and updates proposed for the new version, ISO 9001:2015, are critically analyzed on the basis of the documents elaborated by ISO/TC 176. The approach based on ISO 9001:2015 could be considered the "beginning of a new era in the development of quality management systems". A comparison between the "old" standard ISO 9001:2008 and the "new" standard ISO 9001:2015 is made. In the second part of the paper, the steps to be followed in a company to implement this new standard are presented. Particular attention is given to the new concept of risk-based thinking, introduced to support and improve application of the process-based approach. The authors conclude that, by considering risk throughout the organization, the likelihood of achieving stated objectives is improved, output is more consistent, and customers can be confident that they will receive the expected results. Finally, the benefits of the new approach in the development of quality management systems are outlined, as well as how they are reflected in the management of companies in general and those in the electronics field in particular. As demonstrated in this paper, well understood and properly applied, the new approach based on the revised standard ISO 9001:2015 could offer better quality management for companies operating in electronics and beyond.

  1. Understanding logistic regression analysis

    OpenAIRE

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using ex...

  2. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code, written in the Xlisp-Stat language and called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is ava

  3. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest. Nonlinear Regression Analysis and its Applications Douglas M. Bates and Donald G. Watts "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s

  4. Stepwise approach to establishing multiple outreach laboratory information system-electronic medical record interfaces.

    Science.gov (United States)

    Pantanowitz, Liron; Labranche, Wayne; Lareau, William

    2010-05-26

    Clinical laboratory outreach business is changing as more physician practices adopt an electronic medical record (EMR). Physician connectivity with the laboratory information system (LIS) is consequently becoming more important. However, there are no reports available to assist the informatician with establishing and maintaining outreach LIS-EMR connectivity. A four-stage scheme is presented that was successfully employed to establish unidirectional and bidirectional interfaces with multiple physician EMRs. This approach involves planning (step 1), followed by interface building (step 2) with subsequent testing (step 3), and finally ongoing maintenance (step 4). The role of organized project management, software as a service (SaaS), and alternate solutions for outreach connectivity are discussed.

  5. An attempt at solving the problem of autocorrelation associated with use of mean approach for pooling cross-section and time series in regression modelling

    International Nuclear Information System (INIS)

    Nuamah, N.N.N.N.

    1990-12-01

    The paradoxical nature of the results of the mean approach to pooling cross-section and time series data has been identified as being caused by the presence in the normal equations of phenomena such as autocovariances, multicollinear covariances, drift covariances and drift multicollinear covariances. This paper considers the problem of autocorrelation and suggests ways of solving it. (author). 4 refs.

  6. Deep learning-based subdivision approach for large scale macromolecules structure recovery from electron cryo tomograms.

    Science.gov (United States)

    Xu, Min; Chai, Xiaoqi; Muthakana, Hariank; Liang, Xiaodan; Yang, Ge; Zeev-Ben-Mordehai, Tzviya; Xing, Eric P

    2017-07-15

    Cellular Electron CryoTomography (CECT) enables 3D visualization of cellular organization at near-native state and in sub-molecular resolution, making it a powerful tool for analyzing structures of macromolecular complexes and their spatial organizations inside single cells. However, a high degree of structural complexity together with practical imaging limitations makes the systematic de novo discovery of structures within cells challenging. It would likely require averaging and classifying millions of subtomograms potentially containing hundreds of highly heterogeneous structural classes. Although it is no longer difficult to acquire CECT data containing such numbers of subtomograms due to advances in data acquisition automation, existing computational approaches have very limited scalability or discrimination ability, making them incapable of processing such amounts of data. To complement existing approaches, in this article we propose a new approach for subdividing subtomograms into smaller but relatively homogeneous subsets. The structures in these subsets can then be separately recovered using existing computation-intensive methods. Our approach is based on supervised structural feature extraction using deep learning, in combination with unsupervised clustering and reference-free classification. Our experiments show that, compared with existing unsupervised rotation-invariant feature and pose-normalization based approaches, our new approach achieves significant improvements in both discrimination ability and scalability. More importantly, our new approach is able to discover new structural classes and recover structures that do not exist in the training data. Source code freely available at http://www.cs.cmu.edu/~mxu1/software . mxu1@cs.cmu.edu. Supplementary data are available at Bioinformatics online.

  7. Electron cyclotron waves transmission: new approach for the characterization of electron distribution functions in Tokamak hot plasmas

    International Nuclear Information System (INIS)

    Michelot, Y.

    1995-10-01

    Fast electrons are one of the basic ingredients of plasma operation in many existing thermonuclear fusion research devices. However, the understanding of fast electron dynamics during the creation and sustainment of the superthermal electron tail is far from satisfactory. For this reason, the Electron Cyclotron Transmission (ECT) diagnostic was implemented on the Tore Supra tokamak. It consists of a microwave transmission system installed on a vertical chord crossing the plasma center and working in the frequency range 77-109 GHz. Variations of the wave amplitude during propagation across the plasma may be due to refraction and resonant absorption. For the ECT, the most common manifestation of refraction is a reduction of the received power density with respect to the signal detected in vacuum, due to the spreading and deflection of the wave beam. Wave absorption is observed in the vicinity of the electron cyclotron harmonics and may be due both to the thermal plasma and to superthermal electron tails. It has a characteristic frequency dependence due to the relativistic mass variation in the wave-electron resonance condition. This thesis presents the first measurements of: the extraordinary-mode optical depth at the third harmonic, the electron temperature from the width of a cyclotron absorption line, and the relaxation times of the electron distribution during lower hybrid current drive from the ordinary-mode spectral superthermal absorption line at the first harmonic. (J.S.). 175 refs., 110 figs., 9 tabs., 3 annexes

  8. Analytical approach to phonons and electron-phonon interactions in single-walled zigzag carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Kandemir, B S; Keskin, M [Department of Physics, Faculty of Sciences, Ankara University, 06100 Tandogan, Ankara (Turkey)

    2008-08-13

    In this paper, exact analytical expressions for the entire phonon spectra in single-walled carbon nanotubes with zigzag geometry are presented by using a new approach, originally developed by Kandemir and Altanhan. This approach is based on the concept of construction of a classical lattice Hamiltonian of single-walled carbon nanotubes, wherein the nearest and next nearest neighbor and bond bending interactions are all included, then its quantization and finally diagonalization of the resulting second quantized Hamiltonian. Furthermore, within this context, explicit analytical expressions for the relevant electron-phonon interaction coefficients are also investigated for single-walled carbon nanotubes having this geometry, by the phonon modulation of the hopping interaction.

  9. Analytical approach to phonons and electron-phonon interactions in single-walled zigzag carbon nanotubes

    International Nuclear Information System (INIS)

    Kandemir, B S; Keskin, M

    2008-01-01

    In this paper, exact analytical expressions for the entire phonon spectra in single-walled carbon nanotubes with zigzag geometry are presented by using a new approach, originally developed by Kandemir and Altanhan. This approach is based on the concept of construction of a classical lattice Hamiltonian of single-walled carbon nanotubes, wherein the nearest and next nearest neighbor and bond bending interactions are all included, then its quantization and finally diagonalization of the resulting second quantized Hamiltonian. Furthermore, within this context, explicit analytical expressions for the relevant electron-phonon interaction coefficients are also investigated for single-walled carbon nanotubes having this geometry, by the phonon modulation of the hopping interaction

  10. A Study on Technology Architecture and Serving Approaches of Electronic Government System

    Science.gov (United States)

    Liu, Chunnian; Huang, Yiyun; Pan, Qin

    As E-government becomes a very active research area, many solutions to meet citizens' needs are being deployed. This paper describes a technology architecture for E-government systems and approaches to service delivery in public administrations. The proposed electronic system addresses the basic E-government requirements of user friendliness, security, interoperability, transparency and effectiveness in the communication between small and medium-sized public organizations and their citizens, businesses and other public organizations. The paper presents several serving approaches for E-government, including SOA, web services, mobile E-government and public libraries, each with its own characteristics and application scenarios. Still, a number of E-government issues remain for further research on organizational structure change, including research methodology and data collection and analysis.

  11. A new neutron interferometry approach in the determination of the neutron-electron interaction amplitude

    CERN Document Server

    Ioffe, A

    2002-01-01

    A new experimental approach to the determination of the neutron-electron interaction amplitude is proposed. The main idea of this approach is to use a perfect-crystal neutron interferometer as both a sample and a device for the measurement of the extra phase shift caused by the neutron interaction with atoms of Si. Indeed, such a sample (an interferometer blade) has a well-known atomic density and is a priori perfectly aligned with respect to the crystal lattice of the interferometer crystal. This results in the minimization of systematic errors caused by sample alignment and increases the overall experimental accuracy. Some theoretical estimates and details of an experimental setup are discussed. (orig.)

  12. Determining the Relationship between U.S. County-Level Adult Obesity Rate and Multiple Risk Factors by PLS Regression and SVM Modeling Approaches

    Directory of Open Access Journals (Sweden)

    Chau-Kuang Chen

    2015-02-01

    Full Text Available Data from the Centers for Disease Control and Prevention (CDC) have shown that the obesity rate doubled among adults within the past two decades. This upsurge was the result of changes in human behavior and environment. Partial least squares (PLS) regression and support vector machine (SVM) models were developed to determine the relationship between the U.S. county-level adult obesity rate and multiple risk factors. The outcome variable was the adult obesity rate. The 23 risk factors were categorized into four domains of the social ecological model: biological/behavioral factors, socioeconomic status, food environment, and physical environment. Of the 23 risk factors related to adult obesity, the top eight significant risk factors with high normalized importance were identified, including physical inactivity, natural amenity, percent of households receiving SNAP benefits, and percent of all restaurants being fast food. The study results were consistent with those in the literature. The study showed that the adult obesity rate is influenced by biological/behavioral factors, socioeconomic status, food environment, and physical environment, as embedded in social ecological theory. Analyzing multiple risk factors of obesity in communities may lead to more comprehensive and integrated policies and intervention programs to address this population-based problem.
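
    A schematic Python re-creation of the study's two modeling branches (synthetic data with 23 invented risk factors): a PLS regression whose coefficients give a crude importance ranking, and a support vector regression scored by cross-validation:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((300, 23))                      # 23 risk factors (synthetic)
        y = 20 + 15 * X[:, 0] + 8 * X[:, 5] + rng.standard_normal(300)  # obesity rate

        pls = PLSRegression(n_components=5).fit(X, y)
        importance = np.abs(pls.coef_).ravel()         # crude normalized-importance proxy
        print("top factors:", np.argsort(importance)[::-1][:8])
        print("SVR R^2:", cross_val_score(SVR(C=10.0), X, y, cv=5, scoring='r2').mean())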

  13. Deducing Electronic Unit Internal Response During a Vibration Test Using a Lumped Parameter Modeling Approach

    Science.gov (United States)

    Van Dyke, Michael B.

    2014-01-01

    During random vibration testing of electronic boxes there is often a desire to know the dynamic response of certain internal printed wiring boards (PWBs) for the purpose of monitoring the response of sensitive hardware or for post-test forensic analysis in support of anomaly investigation. Due to restrictions on internally mounted accelerometers for most flight hardware there is usually no means to empirically observe the internal dynamics of the unit, so one must resort to crude and highly uncertain approximations. One common practice is to apply Miles Equation, which does not account for the coupled response of the board in the chassis, resulting in significant over- or under-prediction. This paper explores the application of simple multiple-degree-of-freedom lumped parameter modeling to predict the coupled random vibration response of the PWBs in their fundamental modes of vibration. A simple tool using this approach could be used during or following a random vibration test to interpret vibration test data from a single external chassis measurement to deduce internal board dynamics by means of a rapid correlation analysis. Such a tool might also be useful in early design stages as a supplemental analysis to a more detailed finite element analysis to quickly prototype and analyze the dynamics of various design iterations. After developing the theoretical basis, a lumped parameter modeling approach is applied to an electronic unit for which both external and internal test vibration response measurements are available for direct comparison. Reasonable correlation of the results demonstrates the potential viability of such an approach. Further development of the preliminary approach presented in this paper will involve correlation with detailed finite element models and additional relevant test data.
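
    A minimal two-degree-of-freedom version of such a lumped parameter model fits in a few lines of Python (all parameter values are arbitrary placeholders): the chassis is a mass on its mounts, the PWB is a smaller mass on its own stiffness and damping, and the frequency response from base acceleration to board acceleration indicates how an external chassis measurement maps to the internal board response:

        import numpy as np

        m1, m2 = 5.0, 0.2          # chassis and board masses (kg)
        k1, k2 = 5e6, 2e5          # mount and board stiffnesses (N/m)
        c1, c2 = 200.0, 5.0        # damping coefficients (N s/m)

        w = 2 * np.pi * np.linspace(20, 2000, 2000)    # 20-2000 Hz sweep
        H = np.empty_like(w, dtype=complex)
        for n, wn in enumerate(w):
            # Dynamic stiffness matrix for displacements relative to the base
            K = np.array([[k1 + k2 - m1 * wn**2 + 1j * wn * (c1 + c2),
                           -(k2 + 1j * wn * c2)],
                          [-(k2 + 1j * wn * c2),
                           k2 - m2 * wn**2 + 1j * wn * c2]])
            z = np.linalg.solve(K, -np.array([m1, m2]))  # unit base acceleration
            H[n] = 1.0 - wn**2 * z[1]                    # board/base acceleration ratio

        print("peak board transmissibility:", np.abs(H).max())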

  14. A novel approach to electron data background treatment in an online wide-angle spectrometer for laser-accelerated ion and electron bunches

    Science.gov (United States)

    Lindner, F. H.; Bin, J. H.; Englbrecht, F.; Haffa, D.; Bolton, P. R.; Gao, Y.; Hartmann, J.; Hilz, P.; Kreuzer, C.; Ostermayr, T. M.; Rösch, T. F.; Speicher, M.; Parodi, K.; Thirolf, P. G.; Schreiber, J.

    2018-01-01

    Laser-based ion acceleration is driven by electrical fields emerging when target electrons absorb laser energy and subsequently leave the target material. A direct correlation between these electrons and the accelerated ions is thus to be expected and is predicted by theoretical models. We report on a modified wide-angle spectrometer allowing the simultaneous characterization of angularly resolved energy distributions of both ions and electrons. Equipped with online pixel detectors, the RadEye1 detectors, the investigation of this correlation becomes attainable on a single-shot basis. In addition to first insights, we present a novel approach for reliably extracting the primary electron energy distribution from the interfering secondary radiation background. This proves vitally important for the quantitative extraction of average electron energies (temperatures) and emitted total charge.

  15. Producing The New Regressive Left

    DEFF Research Database (Denmark)

    Crone, Christine

    …members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace… the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media- and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired… becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political, strategic resistance…

  16. Practical Approaches to Mitigation of Specimen Charging in High-Resolution Transmission Electron Microscopy

    Directory of Open Access Journals (Sweden)

    Young-Min Kim

    2010-09-01

    Full Text Available Specimen charging associated with electron bombardment of the sample is a practical hindrance to high-resolution transmission electron microscopy (HRTEM) analysis because it causes a severe loss of resolution in either diffraction or image data. Conductive thin-film deposition on an insulating specimen has been proposed as an effective approach to the mitigation of specimen charging; however, this method is generally not useful in HRTEM imaging of materials because the deposited film induces another artifact in the HRTEM image contrast. In this study, we propose practical methods to mitigate the specimen charging that takes place during HRTEM of materials. For bulk-type specimens prepared by either an ion-thinning or focused-ion-beam (FIB) process, a plasma cleaning treatment is significantly effective in eliminating the charging phenomenon. In the case of low-dimensional nanomaterials such as nanowires and nanoparticles, plasma cleaning is not feasible; however, the charging effect can be effectively eliminated by adjusting the electron illumination condition. The proposed methods decrease the buildup of specimen charging, thereby enhancing the quality of high-resolution images significantly.

  17. Entanglement transfer from electrons to photons in quantum dots: an open quantum system approach

    International Nuclear Information System (INIS)

    Budich, Jan C; Trauzettel, Bjoern

    2010-01-01

    We investigate entanglement transfer from a system of two spin-entangled electron-hole pairs, each placed in a separate single mode cavity, to the photons emitted due to cavity leakage. Dipole selection rules and a splitting between the light hole and the heavy hole subbands are the crucial ingredients establishing a one-to-one correspondence between electron spins and circular photon polarizations. To account for the measurement of the photons as well as dephasing effects, we choose a stochastic Schroedinger equation and a conditional master equation approach, respectively. The influence of interactions with the environment as well as asymmetries in the coherent couplings on the photon entanglement is analysed for two concrete measurement schemes. The first one is designed to violate the Clauser-Horne-Shimony-Holt (CHSH) inequality, while the second one employs the visibility of interference fringes to prove the entanglement of the photons. Because of the spatial separation of the entangled electronic system over two quantum dots, a successful verification of entangled photons emitted by this system would imply the detection of nonlocal spin entanglement of massive particles in a solid state structure.

  18. An analytic approach to 2D electronic PE spectra of molecular systems

    International Nuclear Information System (INIS)

    Szoecs, V.

    2011-01-01

    Graphical abstract (image omitted): the three-pulse photon echo (3P-PE) spectra of finite molecular systems, calculated directly from electronic Hamiltonians, allow peak classification from the 3P-PE spectral dynamics. Highlights: an RWA approach to the electronic photon echo; a straightforward calculation of 2D electronic spectrograms in finite molecular systems; the importance of population-time dynamics in relation to inter-site coherent coupling. Abstract: The three-pulse photon echo (3P-PE) spectra of finite molecular systems with simplified line-broadening models are presented. The Fourier picture of a heterodyne-detected three-pulse rephasing PE signal in the δ-pulse limit of the external field is derived in analytic form. The method includes contributions of one- and two-excitonic states and allows direct calculation of the Fourier PE spectrogram from the corresponding Hamiltonian. As an illustration, the proposed treatment is applied to simple systems, e.g. a 2-site two-level system (TLS) and an n-site TLS model of a photosynthetic unit. The importance of the relation between the Fourier picture of 3P-PE dynamics (corresponding to nonzero population time, T) and coherent inter-state coupling is emphasized.

  19. Digital contract approach for consistent and predictable multimedia information delivery in electronic commerce

    Science.gov (United States)

    Konana, Prabhudev; Gupta, Alok; Whinston, Andrew B.

    1997-01-01

    A pure 'technological' solution to network quality problems is incomplete, since any benefits from new technologies are offset by the demand from exponentially growing electronic commerce and data-intensive applications. Since an economic paradigm is implicit in electronic commerce, we propose a 'market-system' approach to improve quality of service. Quality of service for digital products takes on a different meaning, since users view quality of service differently and value information differently. We propose a framework for electronic commerce that is based on an economic paradigm and mass-customization, and works as a wide-area distributed management system. In our framework, surrogate servers act as intermediaries between information providers and end-users, and arrange for consistent and predictable information delivery through 'digital contracts.' These contracts are negotiated and priced based on economic principles. Surrogate servers pre-fetch, through replication, information from many different servers and consolidate it based on demand expectations. In order to recognize users' requirements and process requests accordingly, real-time databases are central to our framework. We also propose that multimedia information be separated into slowly changing and rapidly changing data streams to improve response-time requirements. Surrogate servers perform the task of integrating these data streams, which is transparent to end-users.

  20. Spatiotemporal modeling of ozone levels in Quebec (Canada): a comparison of kriging, land-use regression (LUR), and combined Bayesian maximum entropy-LUR approaches.

    Science.gov (United States)

    Adam-Poupart, Ariane; Brand, Allan; Fournier, Michel; Jerrett, Michael; Smargiassi, Audrey

    2014-09-01

    Ambient air ozone (O3) is a pulmonary irritant that has been associated with respiratory health effects including increased lung inflammation and permeability, airway hyperreactivity, respiratory symptoms, and decreased lung function. Estimation of O3 exposure is a complex task because the pollutant exhibits complex spatiotemporal patterns. To refine the quality of exposure estimation, various spatiotemporal methods have been developed worldwide. We sought to compare the accuracy of three spatiotemporal models in predicting summer ground-level O3 in Quebec, Canada. We developed a land-use mixed-effects regression (LUR) model based on readily available data (air quality and meteorological monitoring data, road network information, latitude), a Bayesian maximum entropy (BME) model incorporating both O3 monitoring station data and the land-use mixed model outputs (BME-LUR), and a kriging model based only on available O3 monitoring station data (BME kriging). We performed leave-one-station-out cross-validation and visually assessed the predictive capability of each model by examining the mean temporal and spatial distributions of the average estimated errors. The BME-LUR was the best predictive model (R2 = 0.653), with the lowest root mean-square error (RMSE = 7.06 ppb), followed by the LUR model (R2 = 0.466, RMSE = 8.747 ppb) and the BME kriging model (R2 = 0.414, RMSE = 9.164 ppb). Our findings suggest that errors of estimation in the interpolation of O3 concentrations with BME can be greatly reduced by incorporating outputs from a LUR model developed with readily available data.
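
    The leave-one-station-out validation scheme used to compare the models is straightforward to reproduce; the Python sketch below (synthetic stations and covariates, with a plain linear regression standing in for all three approaches) shows the mechanics:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict

        rng = np.random.default_rng(0)
        n = 240
        station = np.repeat(np.arange(12), 20)        # 12 stations, 20 days each
        X = rng.random((n, 4))                        # land-use + meteorological covariates
        o3 = 40 * X[:, 0] + 10 * X[:, 1] + 5 * rng.standard_normal(n)

        pred = cross_val_predict(LinearRegression(), X, o3,
                                 cv=LeaveOneGroupOut(), groups=station)
        print(f"LOSO RMSE: {np.sqrt(np.mean((pred - o3) ** 2)):.2f} ppb")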

  1. Multiple linear regression approach for the analysis of the relationships between joints mobility and regional pressure-based parameters in the normal-arched foot.

    Science.gov (United States)

    Caravaggi, Paolo; Leardini, Alberto; Giacomozzi, Claudia

    2016-10-03

    Plantar load can be considered a measure of the foot's ability to transmit forces at the foot/ground or foot/footwear interface during ambulatory activities via the lower limb kinematic chain. While morphological and functional measures have been shown to be correlated with plantar load, no exhaustive data are currently available on the possible relationships between the range of motion of foot joints and plantar load regional parameters. Joint kinematics from a validated multi-segmental foot model were recorded together with plantar pressure parameters in 21 normal-arched healthy subjects during three barefoot walking trials. Plantar pressure maps were divided into six anatomically based regions of interest associated with corresponding foot segments. A stepwise multiple regression analysis was performed to determine the relationships between pressure-based parameters, joint range of motion, and normalized walking speed (speed/subject height). Sagittal- and frontal-plane joint motion were those most correlated to plantar load. Foot joint range of motion and normalized walking speed explained between 6% and 43% of the model variance (adjusted R2) for pressure-based parameters. In general, joints presenting lower mobility during stance were associated with lower vertical force at the forefoot and with larger mean and peak pressure at the hindfoot and forefoot. Normalized walking speed was always positively correlated to mean and peak pressure at the hindfoot and forefoot. While a large portion of the variance in plantar pressure data is still not accounted for by the present models, this study provides statistical corroboration of the close relationship between joint mobility and plantar pressure during stance in the normal healthy foot.
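
    A hedged Python sketch of a forward stepwise procedure of the kind described (synthetic data; variable names are invented): at each step, add the range-of-motion or walking-speed predictor that most improves adjusted R2 for a pressure parameter, and stop when no candidate improves it:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n, names = 63, ['speed', 'rom1', 'rom2', 'rom3', 'rom4']
        data = {k: rng.standard_normal(n) for k in names}
        peak_pressure = 2 * data['speed'] - data['rom2'] + rng.standard_normal(n)

        chosen, pool, best_adj = [], list(names), -np.inf
        while pool:
            scores = {}
            for cand in pool:
                cols = np.column_stack([data[c] for c in chosen + [cand]])
                scores[cand] = sm.OLS(peak_pressure,
                                      sm.add_constant(cols)).fit().rsquared_adj
            cand = max(scores, key=scores.get)
            if scores[cand] <= best_adj:
                break                                  # no further improvement
            best_adj = scores[cand]
            chosen.append(cand)
            pool.remove(cand)
        print(chosen, round(best_adj, 3))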

  2. ANALYSIS OF DOMESTIC AND INTERNATIONAL APPROACHES TO THE ADVANCED EDUCATIONAL PRACTICES IN THE ELECTRONIC NETWORK ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Tatiana N. Noskova

    2016-12-01

    Full Text Available Introduction: human activities related to the use of information are being transformed under the influence of computer technology. Variable solutions to information problems are emerging; demands and requirements for competence are changing in the labour market. Educational practices are destined to form a new learning behaviour for the 21st century, adopting a lifelong learning strategy. The main purpose of the article is to answer the question of how to transform existing pedagogical theory and practice under the current conditions of the electronic environment. Publication of this article is coherent with the concept of the journal Integration of Education, which analyzes Russian and world experience in the development of education systems. This approach is important for dissemination and implementation in practice. This article explores the challenges of information technology and technical support of the educational process in universities and schools. The study of these issues is in the field of view of the journal. Materials and Methods: the paper elaborates on the results of domestic and international educational theory and practice, comparison methods, drawing on a student survey conducted in the framework of international research in the field of e-learning in higher education institutions. Results: the main approaches applied to the formulation of educational practices in the electronic environment were analyzed. The most relevant national approaches include the system, activity, polysubject (dialogical), context, and dialogical ones. Among international approaches, self-directed learning, educational communication strategies, experiential learning, training in partnership, collaborative learning, learning in online communities, and situational training were analyzed. The specifics of electronic educational interactions, with distributed-in-time-and-space activities of teachers and students, create the preconditions for the implementation of new educational

  3. Chemical and engineering approaches to enable organic field-effect transistors for electronic skin applications.

    Science.gov (United States)

    Sokolov, Anatoliy N; Tee, Benjamin C-K; Bettinger, Christopher J; Tok, Jeffrey B-H; Bao, Zhenan

    2012-03-20

    Skin is the body's largest organ and is responsible for the transduction of a vast amount of information. This conformable material simultaneously collects signals from external stimuli that translate into information such as pressure, pain, and temperature. The development of an electronic material, inspired by the complexity of this organ is a tremendous, unrealized engineering challenge. However, the advent of carbon-based electronics may offer a potential solution to this long-standing problem. In this Account, we describe the use of an organic field-effect transistor (OFET) architecture to transduce mechanical and chemical stimuli into electrical signals. In developing this mimic of human skin, we thought of the sensory elements of the OFET as analogous to the various layers and constituents of skin. In this fashion, each layer of the OFET can be optimized to carry out a specific recognition function. The separation of multimodal sensing among the components of the OFET may be considered a "divide and conquer" approach, where the electronic skin (e-skin) can take advantage of the optimized chemistry and materials properties of each layer. This design of a novel microstructured gate dielectric has led to unprecedented sensitivity for tactile pressure events. Typically, pressure-sensitive components within electronic configurations have suffered from a lack of sensitivity or long mechanical relaxation times often associated with elastomeric materials. Within our method, these components are directly compatible with OFETs and have achieved the highest reported sensitivity to date. Moreover, the tactile sensors operate on a time scale comparable with human skin, making them ideal candidates for integration as synthetic skin devices. The methodology is compatible with large-scale fabrication and employs simple, commercially available elastomers. The design of materials within the semiconductor layer has led to the incorporation of selectivity and sensitivity within

  4. Understanding logistic regression analysis.

    Science.gov (United States)

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is the avoidance of confounding effects, achieved by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After a definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.
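
    As a concrete, hedged illustration of the procedure described in this record, the sketch below fits a logistic model with two explanatory variables on simulated data and reads off adjusted odds ratios; the data, variable names and the use of Python's statsmodels are assumptions of the example, not taken from the article.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data: a binary outcome driven by a continuous and a binary covariate.
rng = np.random.default_rng(0)
n = 500
age = rng.normal(50, 10, n)            # continuous explanatory variable
smoker = rng.integers(0, 2, n)         # binary explanatory variable
logit_p = -8.0 + 0.12 * age + 0.9 * smoker
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([age, smoker]))
fit = sm.Logit(y, X).fit(disp=False)

# Exponentiated coefficients are odds ratios, each adjusted for the other
# covariate, which is how logistic regression avoids confounding effects.
print("odds ratios (const, age, smoker):", np.exp(fit.params))
```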

  5. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  6. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  7. Understanding Contrasting Approaches to Nationwide Implementations of Electronic Health Record Systems: England, the USA and Australia

    Directory of Open Access Journals (Sweden)

    Zoe Morrison

    2011-01-01

    Full Text Available As governments commit to national electronic health record (EHR) systems, there is increasing international interest in identifying effective implementation strategies. We draw on Coiera's typology of national programmes - ‘top-down’, ‘bottom-up’ and ‘middle-out’ - to review EHR implementation strategies in three exemplar countries: England, the USA and Australia. In comparing and contrasting these three approaches, we show how different healthcare systems, national policy contexts and anticipated benefits have shaped initial strategies. We reflect on progress and likely developments in the face of continually changing circumstances. Our review shows that, irrespective of the initial strategy, over time there is likely to be convergence on the negotiated, devolved middle-out approach, which aims to balance the interests and responsibilities of local healthcare constituencies and national government to achieve national connectivity. We conclude that, given the current lack of empirical evidence, the flexibility offered by the middle-out approach may make it the best initial national strategy.

  8. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    Science.gov (United States)

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

    The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system - one of the components of an EMR system at a tertiary teaching hospital in Korea - using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). The Rational Unified Process (RUP) and the Unified Modeling Language (UML) were used as the development process and the modeling notation, respectively. Following the scenario and RUP approach, user requirements were formulated into use case sets, and the sequence of activities in the scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.
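
    A minimal, hypothetical sketch of a fragment of such a class model, written here in Python for brevity; all class and attribute names are invented for illustration and do not reproduce the study's actual UML diagrams.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TerminologyConcept:
    """A coded term from the reference nursing terminology (hypothetical)."""
    code: str    # concept identifier
    label: str   # human-readable term

@dataclass
class NursingStatement:
    """One documented observation or intervention, bound to a concept."""
    concept: TerminologyConcept
    value: str

@dataclass
class NursingRecord:
    """A per-patient record aggregating terminology-based statements."""
    patient_id: str
    statements: List[NursingStatement] = field(default_factory=list)

    def add_statement(self, statement: NursingStatement) -> None:
        self.statements.append(statement)

# Usage: record a pain-score observation for one patient.
record = NursingRecord(patient_id="12345")
pain = TerminologyConcept(code="PAIN-01", label="Pain score")
record.add_statement(NursingStatement(concept=pain, value="3/10"))
```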

  9. An Analysis of Knowledge Sharing Approaches for Emerging-technology-based Strategic Alliances in Electronic Industry

    Institute of Scientific and Technical Information of China (English)

    LIU Ju; LI Yong-jian

    2006-01-01

    Emerging technologies are now initiating new industries and transforming old ones with tremendous power. They are a different game from established technologies, with distinctive characteristics of knowledge management in knowledge-based and technological-innovation-based competition. How emerging-technology-based strategic alliances (ETBSAs) can obtain knowledge advantage and enhance competences through knowledge sharing is the concern of this paper. On the basis of our previous work on the distinctive attributes of emerging technologies, we counter the widespread presumption that the primary purpose of strategic alliances is knowledge acquisition by means of learning. We offer new insight into the knowledge sharing approaches of ETBSAs - the knowledge integrating approach, by which each member firm integrates its partner's complementary knowledge base into its products and services while maintaining its own knowledge specialization. ETBSAs should therefore plan and practice their knowledge sharing strategies from the angle of knowledge integrating rather than knowledge acquiring. A four-dimensional framework is developed to analyze the advantages and disadvantages of these two knowledge sharing approaches. Cases from the electronics industry are introduced to illustrate our point of view.

  10. A Cost-Effective Approach for Migrating Enterprise Electronic Mail Systems

    Directory of Open Access Journals (Sweden)

    Emmanuel Omojokun

    2008-02-01

    Full Text Available Electronic mail (e-mail) is one of the most utilized application software systems in modern-day organizations. The major messaging application programs used in the enterprise are IBM Lotus Notes (also known as Domino), Microsoft Exchange Server, and Novell GroupWise. For various reasons - such as the high cost of maintenance, undeliverable e-mail issues and loss of attachments - companies find it necessary either to migrate to newer versions of their messaging software or to entirely different software. In either case, the process must be carefully planned, well designed and properly implemented to avoid disaster. In this paper, we present a cost-effective approach for migrating messaging software. The approach was implemented and tested for the migration from GroupWise 5.5 to Exchange Server 2003. We present our success story and the lessons learned from the case. Post-migration system audits at six weeks and at one year indicated that the organization derived several benefits, including significant cost savings, as a result of this approach. Chief information/technology officers and e-mail administrators will benefit immensely from the "best practice" strategy presented here.

  11. An efficient approach for surveillance of childhood diabetes by type derived from electronic health record data: the SEARCH for Diabetes in Youth Study

    Science.gov (United States)

    Zhong, Victor W; Obeid, Jihad S; Craig, Jean B; Pfaff, Emily R; Thomas, Joan; Jaacks, Lindsay M; Beavers, Daniel P; Carey, Timothy S; Lawrence, Jean M; Dabelea, Dana; Hamman, Richard F; Bowlby, Deborah A; Pihoker, Catherine; Saydah, Sharon H

    2016-01-01

    Objective To develop an efficient surveillance approach for childhood diabetes by type across 2 large US health care systems, using phenotyping algorithms derived from electronic health record (EHR) data. Materials and Methods Presumptive diabetes cases were identified from diabetes-related billing codes, the patient problem list, and outpatient anti-diabetic medications. EHRs of all the presumptive cases were manually reviewed, and true diabetes status and diabetes type were determined. Algorithms for identifying diabetes cases overall and classifying diabetes type were either prespecified or derived from classification and regression tree analysis. A surveillance approach was developed based on the best algorithms identified. Results We developed a stepwise surveillance approach using billing code-based prespecified algorithms and targeted manual EHR review, which efficiently and accurately ascertained and classified diabetes cases by type in both health care systems. The sensitivity and positive predictive values in both systems were approximately ≥90% for ascertaining diabetes cases overall and classifying cases with type 1 or type 2 diabetes. About 80% of the cases with "other" type were also correctly classified. This stepwise surveillance approach resulted in a >70% reduction in the number of cases requiring manual validation compared to traditional surveillance methods. Conclusion EHR data may be used to establish an efficient approach for large-scale surveillance for childhood diabetes by type, although some manual effort is still needed. PMID:27107449
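
    As a rough sketch of the classification-and-regression-tree step mentioned above, the following fits a small decision tree to simulated billing-code features. The feature names, thresholds and scikit-learn tooling are assumptions for illustration; the study's actual variables and derived algorithms are not reproduced here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical EHR-derived features for presumptive diabetes cases.
rng = np.random.default_rng(1)
n = 300
t1d_codes = rng.poisson(3, n)                       # type 1 billing-code count
t2d_codes = rng.poisson(1, n)                       # type 2 billing-code count
insulin_only = (t1d_codes > t2d_codes).astype(int)  # medication flag
X = np.column_stack([t1d_codes, t2d_codes, insulin_only])
y = (t1d_codes + insulin_only > t2d_codes).astype(int)  # 1 = "type 1" label

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["t1d_codes", "t2d_codes", "insulin_only"]))
```

    The printed rules mimic the kind of prespecified, billing-code-based decision steps a stepwise surveillance approach can apply before escalating to manual chart review.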

  12. Communication: Electronic and transport properties of molecular junctions under a finite bias: A dual mean field approach

    International Nuclear Information System (INIS)

    Liu, Shuanglong; Feng, Yuan Ping; Zhang, Chun

    2013-01-01

    We show that when a molecular junction is under an external bias, its properties cannot be uniquely determined by the total electron density in the same manner as the density functional theory for ground state properties. In order to correctly incorporate bias-induced nonequilibrium effects, we present a dual mean field (DMF) approach. The key idea is that the total electron density together with the density of current-carrying electrons are sufficient to determine the properties of the system. Two mean fields, one for current-carrying electrons and the other one for equilibrium electrons can then be derived. Calculations for a graphene nanoribbon junction show that compared with the commonly used ab initio transport theory, the DMF approach could significantly reduce the electric current at low biases due to the non-equilibrium corrections to the mean field potential in the scattering region

  13. Electronics

    Science.gov (United States)

    2001-01-01

    International Acer Incorporated, Hsin Chu, Taiwan Aerospace Industrial Development Corporation, Taichung, Taiwan American Institute of Taiwan, Taipei, Taiwan...Singapore and Malaysia .5 - 4 - The largest market for semiconductor products is the high technology consumer electronics industry that consumes up...Singapore, and Malaysia . A new semiconductor facility costs around $3 billion to build and takes about two years to become operational

  14. A new approach to age-period-cohort analysis using partial least squares regression: the trend in blood pressure in the Glasgow Alumni cohort.

    Directory of Open Access Journals (Sweden)

    Yu-Kang Tu

    2011-04-01

    Full Text Available Due to a problem of identification, how to estimate the distinct effects of age, time period and cohort has been a controversial issue in the analysis of trends in health outcomes in epidemiology. In this study, we propose a novel approach, partial least squares (PLS) analysis, to separate the effects of age, period, and cohort. Our example for illustration is taken from the Glasgow Alumni cohort. A total of 15,322 students (11,755 men and 3,567 women) received medical screening at Glasgow University between 1948 and 1968. The aim is to investigate the secular trends in blood pressure among students born between 1925 and 1950, while taking into account the year of examination and age at examination. We excluded students born before 1925 or aged over 25 years at examination, and those with missing values in confounders, from the analyses, resulting in 12,546 and 12,516 students for the analysis of systolic and diastolic blood pressure, respectively. PLS analysis shows that both systolic and diastolic blood pressure increased with students' age, and that students born later had on average lower blood pressure (SBP: -0.17 mmHg per year [95% confidence interval: -0.19 to -0.15] for men and -0.25 [-0.28 to -0.22] for women; DBP: -0.14 [-0.15 to -0.13] for men and -0.09 [-0.11 to -0.07] for women). PLS also shows a decreasing trend in blood pressure over the examination period. As identification is not a problem for PLS, it provides a flexible modelling strategy for age-period-cohort analysis. More emphasis is then required to clarify the substantive and conceptual issues surrounding the definitions and interpretations of age, period and cohort effects.
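
    The identification problem arises because period = cohort + age exactly, so the design matrix is singular for ordinary least squares. The sketch below shows PLS regression coping with this collinearity on simulated data; the coefficients and scikit-learn implementation are assumptions of the example, not the study's results.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n = 1000
cohort = rng.integers(1925, 1951, n)    # birth year
age = rng.integers(18, 26, n)           # age at examination
period = cohort + age                   # year of examination (exactly collinear)
sbp = 110 + 0.5 * age - 0.17 * (cohort - 1925) + rng.normal(0, 5, n)

# PLS projects the collinear predictors onto latent components first,
# so the rank deficiency that breaks OLS is not a problem here.
X = np.column_stack([age, period, cohort]).astype(float)
pls = PLSRegression(n_components=2).fit(X, sbp)
print("PLS coefficients (age, period, cohort):", pls.coef_.ravel())
```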

  15. Linear-algebraic approach to electron-molecule collisions: General formulation

    International Nuclear Information System (INIS)

    Collins, L.A.; Schneider, B.I.

    1981-01-01

    We present a linear-algebraic approach to electron-molecule collisions based on an integral-equations form with either logarithmic or asymptotic boundary conditions. The introduction of exchange effects does not alter the basic form or order of the linear-algebraic equations for a local potential. In addition to the standard procedure of directly evaluating the exchange integrals by numerical quadrature, we also incorporate exchange effects through a separable-potential approximation. Efficient schemes are developed for reducing the number of points and channels that must be included. The method is applied at the static-exchange level to a number of molecular systems including H2, N2, LiH, and CO2

  16. Electronic health records approaches and challenges: a comparison between Malaysia and four East Asian countries.

    Science.gov (United States)

    Abd Ghani, Mohd Khanapi; Bali, Rajeev K; Naguib, Raouf N G; Marshall, Ian M

    2008-01-01

    An integrated Lifetime Health Record (LHR) is fundamental for achieving seamless and continuous access to patient medical information and for the continuum of care. However, this aim has not yet been fully realised, though efforts are actively progressing around the globe. Every stage of the development of LHR initiatives has presented peculiar challenges, and the best lessons are those drawn from someone else's experiences. This paper presents an overview of the development approaches undertaken by four East Asian countries in implementing a national Electronic Health Record (EHR) in the public health system. The major challenges elicited from the review - including integration efforts, process reengineering, funding, people, and law and regulation - will be presented, compared, discussed and used as lessons learned for the further development of the Malaysian integrated LHR.

  17. Stepwise approach to establishing multiple outreach laboratory information system-electronic medical record interfaces

    Directory of Open Access Journals (Sweden)

    Liron Pantanowitz

    2010-01-01

    Full Text Available The clinical laboratory outreach business is changing as more physician practices adopt an electronic medical record (EMR). Physician connectivity with the laboratory information system (LIS) is consequently becoming more important. However, no reports are available to assist the informatician with establishing and maintaining outreach LIS-EMR connectivity. A four-stage scheme is presented that was successfully employed to establish unidirectional and bidirectional interfaces with multiple physician EMRs. This approach involves planning (step 1), followed by interface building (step 2), subsequent testing (step 3), and finally ongoing maintenance (step 4). The roles of organized project management, software as a service (SaaS), and alternate solutions for outreach connectivity are discussed.

  18. An Electronic Structure Approach to Charge Transfer and Transport in Molecular Building Blocks for Organic Optoelectronics

    Science.gov (United States)

    Hendrickson, Heidi Phillips

    technological design and development. Time dependent perturbation theory, employed by non-equilibrium Green's function formalism, is utilized to study the effect of quantum coherences on electron transport and the effect of symmetry breaking on the electronic spectra of model molecular junctions. The fourth part of this thesis presents the design of a physical chemistry course based on a pedagogical approach called Writing-to-Teach. The nature of inaccuracies expressed in student-generated explanations of quantum chemistry topics, and the ability of a peer review process to engage these inaccuracies, is explored within this context.

  19. A Weibull Approach for Enabling Safety-Oriented Decision-Making for Electronic Railway Signaling Systems

    Directory of Open Access Journals (Sweden)

    Emanuele Pascale

    2018-04-01

    Full Text Available This paper presents the advantages of using Weibull distributions, within the context of railway signaling systems, for enabling safety-oriented decision-making. Failure rates are used to statistically model the basic events of fault-tree analysis, and their value determines the maximum allowable latency of failures that still fulfils the safety target for which the system has been designed. Relying on field-return failure data, Weibull parameters have been calculated for an existing electronic signaling system, and a comparison with existing predictive reliability data, based on the exponential distribution, is provided. Results are discussed in order to draw conclusions on the fulfilment of quantitative targets and on the impact that a wrong hypothesis might have on the choice of a given architecture. Despite the huge amount of information gathered through the after-sales logbook used to build the reliability distribution, several key elements for a reliable estimation of failure rate values are still missing. This affects the uncertainty of the reliability parameters and the effort required to collect all the information. We then present how to intervene when operational failure rates are higher than those predicted by the theoretical approach: by increasing the redundancy of the system or by performing preventive maintenance tasks. Possible consequences of an unjustified adoption of a constant failure rate are presented. Some recommendations are also shared on how to build reliability-oriented logbooks and avoid data-censoring phenomena by enhancing the functions of the electronic boards composing the system.
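
    A minimal sketch of the core calculation, assuming synthetic field-return data and SciPy: fit a two-parameter Weibull to times-to-failure and inspect the shape parameter, whose departure from 1 is what challenges the constant-failure-rate (exponential) hypothesis discussed in this record.

```python
import numpy as np
from scipy import stats

# Illustrative times-to-failure in hours; not the paper's field data.
failures = np.array([1200., 3400., 4100., 5600., 7800., 9100., 12000., 15500.])

# Two-parameter Weibull fit (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(failures, floc=0)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")

# beta ~ 1 supports a constant failure rate (exponential model);
# beta > 1 indicates wear-out, beta < 1 early-life failures.
```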

  20. Renormalization group-theoretic approach to electron localization in disordered systems

    International Nuclear Information System (INIS)

    Kumar, N.; Heinrichs, J.

    1977-06-01

    The localization problem for the Anderson tight-binding model with site-diagonal (Gaussian) disorder is studied, using a previously established analogy between this problem and the statistical mechanics of a zero-component classical field. The equivalent free-energy functional turns out to have complex coefficients in the bilinear terms but involves a real repulsive quartic interaction. The averaged one-electron propagator corresponds to the two-point correlation function of the equivalent statistical problem, and the critical point gives the mobility edge, which is identified with the (real) fixed-point energy of the associated renormalization group. Since for convergence reasons the conventional perturbative treatment of Wilson's formula is invalid, we resort to a non-perturbative approach, which leads to a physical fixed point corresponding to a repulsive quartic interaction. The results for the mobility edge in three dimensions and for the critical disorder for an Anderson transition in two dimensions agree well with previous detailed predictions. The critical indices describing the approach to the transition at the mobility edge of various physical quantities, within the epsilon-expansion, are also discussed. The more general problem, where both diagonal and off-diagonal disorder are present in the Anderson Hamiltonian, is then considered. In this case it is shown that the Hamilton function of the equivalent zero-component classical field model involves an additional biquadratic exchange term. From a simple generalization of Wilson's recursion relation and its non-perturbative solution, explicit expressions for the mobility edges for weak diagonal and off-diagonal disorder in two and three dimensions are obtained. Our treatment casts doubt on the validity of recent conclusions about electron localization based on renormalization group studies of the nm-component spin model

  1. ANALYSIS OF THEORETICAL AND METHODOLOGICAL APPROACHES TO DESIGN OF ELECTRONIC TEXTBOOKS FOR STUDENTS OF HIGHER AGRICULTURAL EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2017-06-01

    Full Text Available The article deals with theoretical and methodological approaches to the design of electronic textbooks, in particular the system, competence, activity, personality-oriented, and technological ones, which together reflect the general trends in the formation of a new educational paradigm. Its distinctive features lie in constructing a heuristic search model of the learning process, focusing on developmental teaching, knowledge integration, skills development for independent information search and processing, and the technification of the learning process. The approach in this study is used in a broad sense, as a synthesis of the basic ideas, views, and principles that determine the overall research strategy. The main provisions of modern approaches to design are not antagonistic; they should be applied in combination, taking into account the advantages of each of them and leveling their shortcomings, to develop an optimal concept of the electronic textbook. The model of electronic textbook design and the components of a methodology for its use based on these approaches are described.

  2. Linear regression crash prediction models : issues and proposed solutions.

    Science.gov (United States)

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions, namely error structure normality ...
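
    A hedged sketch of the workflow this abstract implies (not the report's actual method): fit ordinary least squares to simulated aggregated crash data, then test the error-structure normality assumption on the residuals.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Illustrative aggregated data: crashes per segment vs. traffic volume.
rng = np.random.default_rng(3)
aadt = rng.uniform(1_000, 50_000, 80)
crashes = 0.5 + 0.0004 * aadt + rng.normal(0, 2, 80)

fit = sm.OLS(crashes, sm.add_constant(aadt)).fit()
print("intercept, slope:", fit.params)

# Shapiro-Wilk test on residuals: a small p-value would flag a violation
# of the normality assumption that the aggregation step aims to satisfy.
stat, p = stats.shapiro(fit.resid)
print(f"Shapiro-Wilk p-value: {p:.3f}")
```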

  3. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (the dependent vector variable) is expressed as a function of a number of hypothesized phenomena, realized also as vector variables (independent vector variables) and/or scalar variables, that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of the independent vector variables (explanatory variables) also as vectors; hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
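
    A minimal numerical sketch of the encoding described above: each 2-D vector observation is stored as a complex number, and the least-squares coefficient is itself recovered as a complex number, i.e. a rotation-and-scale acting on the explanatory vector. The synthetic data and NumPy implementation are assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
# Independent vector variable encoded as complex numbers (x + iy).
x = rng.normal(size=n) + 1j * rng.normal(size=n)
beta_true = 1.5 - 0.8j                    # vector coefficient: scales and rotates
noise = 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
y = beta_true * x + noise                 # dependent vector variable

# Complex least squares; lstsq handles complex design matrices directly.
beta_hat, *_ = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)
print("estimated vector coefficient:", beta_hat[0])   # close to 1.5 - 0.8j
```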

  4. Observation of superconducting fluxons by transmission electron microscopy: A Fourier space approach to calculate the electron optical phase shifts and images

    International Nuclear Information System (INIS)

    Beleggia, M.; Pozzi, G.

    2001-01-01

    An approach is presented for the calculation of the electron optical phase shift experienced by high-energy electrons in a transmission electron microscope, when they interact with the magnetic field associated with superconducting fluxons in a thin specimen tilted with respect to the beam. It is shown that by decomposing the vector potential in its Fourier components and by calculating the phase shift of each component separately, it is possible to obtain the Fourier transform of the electron optical phase shift, which can be inverted either analytically or numerically. It will be shown how this method can be used to recover the result, previously obtained by the real-space approach, relative to the case of a straight flux tube perpendicular to the specimen surfaces. Then the method is applied to the case of a London fluxon in a thin film, where the bending and the broadening of the magnetic-field lines due to the finite specimen thickness are now correctly taken into account and not treated approximately by means of a parabolic fit. Finally, it will be shown how simple models for the pancake structure of the fluxon can be analyzed within this framework and the main features of electron transmission images predicted

  5. Multicollinearity and Regression Analysis

    Science.gov (United States)

    Daoud, Jamal I.

    2017-12-01

    In regression analysis, correlation between the response and the predictor(s) is expected, but correlation among the predictors themselves is undesirable. The number of predictors included in the regression model depends on many factors, among them historical data, experience, etc. In the end, the selection of the most important predictors is a judgment left to the researcher. Multicollinearity is a phenomenon in which two or more predictors are correlated; if this happens, the standard errors of the coefficients will increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may not be found to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes, and its consequences for the reliability of the regression model.
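
    To make the diagnosis concrete, the sketch below computes variance inflation factors (VIFs) on simulated data containing two nearly collinear predictors; statsmodels and the common VIF > 10 rule of thumb are conventional choices for the example, not taken from the paper.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)                   # independent predictor

X = sm.add_constant(np.column_stack([x1, x2, x3]))
for i, name in zip(range(1, 4), ["x1", "x2", "x3"]):
    print(f"VIF({name}) = {variance_inflation_factor(X, i):.1f}")
# x1 and x2 show very large VIFs (inflated standard errors); x3 stays near 1.
```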

  6. Functional renormalization group approach to electronic structure calculations for systems without translational symmetry

    Science.gov (United States)

    Seiler, Christian; Evers, Ferdinand

    2016-10-01

    A formalism for electronic-structure calculations is presented that is based on the functional renormalization group (FRG). The traditional FRG has been formulated for systems that exhibit a translational symmetry with an associated Fermi surface, which can provide the organization principle for the renormalization group (RG) procedure. We here advance an alternative formulation, where the RG flow is organized in the energy domain rather than in k space. This has the advantage that it can also be applied to inhomogeneous matter lacking a band structure, such as disordered metals or molecules. The energy-domain FRG (εFRG) presented here accounts for Fermi-liquid corrections to quasiparticle energies and particle-hole excitations. It goes beyond the state of the art GW-BSE, because in εFRG the Bethe-Salpeter equation (BSE) is solved in a self-consistent manner. An efficient implementation of the approach that has been tested against exact diagonalization calculations and calculations based on the density matrix renormalization group is presented. Similar to the conventional FRG, the εFRG is also able to signal the vicinity of an instability of the Fermi-liquid fixed point via runaway flow of the corresponding interaction vertex. Building on this fact, in an application of εFRG to the spinless disordered Hubbard model we calculate its phase boundary in the plane spanned by the interaction and disorder strength. Finally, an extension of the approach to finite temperatures and spin S = 1/2 is also given.

  7. EPR policies for electronics in developing Asia: an adapted phase-in approach.

    Science.gov (United States)

    Akenji, Lewis; Hotta, Yasuhiko; Bengtsson, Magnus; Hayashi, Shiko

    2011-09-01

    The amount of e-waste is growing rapidly in developing countries, and the health and environmental problems resulting from poor management of this waste have become a concern for policy makers. In response to these challenges, a number of Asian developing countries have been inspired by policy developments in OECD countries, and have drafted legislations based on the principle of extended producer responsibility (EPR). However, the experiences from developed countries show that a successful implementation of EPR policies requires adequate institutions and sufficient administrative capacity. Even advanced countries are thus facing difficulties. This paper concludes from existing literature and from the authors' own observations that there seems to be a mismatch between the typical policy responses to e-waste problems in developing Asia and the capacity for successful implementation of such policies. It also notes that the e-waste situation in developing Asian countries is further complicated by a number of additional factors, such as difficulties in identifying producers, import of used electronic products and e-waste (sometimes illegal), and the existence of a strong informal waste sector. Given these challenges, the authors conclude that comprehensive EPR policy schemes of the kind that have been implemented in some advanced countries are not likely to be effective. The paper therefore proposes an alternative phase-in approach whereby developing Asian countries are able to move gradually towards EPR systems. It argues that this approach would be more feasible, and discusses what could be the key building blocks of each implementation stage.

  8. Design of electron beam bending magnet system using three sector magnets for electron and photon therapy: a simulation approach

    International Nuclear Information System (INIS)

    Shahzad, A.A.; Bhoraskar, V.N.; Dhole, S.D.

    2013-01-01

    The 270-degree doubly achromatic beam bending magnet system using three sector magnets has been designed mainly for treating cancer and skin diseases. The main requirements of the design of the three-magnet system are to focus an electron beam to a spot size of less than 3 mm x 3 mm, an energy spread within 3%, and a divergence angle ≤ 3 mrad at the target. To achieve these parameters, the simulation was carried out using the Lorentz-3EM software. The beam spot, divergence angle and energy spread were observed with respect to variations in the angles of the sector magnets and the drift distances. From the simulated results, the optimized design has all three sector magnets at an angle of 62 degrees and a drift distance of 68 mm. It is also observed that at 1637, 2425, 3278, 4165 and 5690 ampere-turns, the optimized design produces 3851, 5754, 7434, 9356 and 11425 gauss of magnetic field at the median plane, required to bend electrons of 6, 9, 12, 15 and 18 MeV respectively for electron therapy. The output parameters of the optimized design are an energy spread of 3%, a divergence angle of ~3 mrad and a spot size of 2.8 mm. Moreover, for 6 MV and 15 MV photon therapy applications, electron beams of 6.5 MeV and 15.5 MeV are extracted from the magnet system and focused on the bremsstrahlung target. For photon therapy, at 1780 and 4456 ampere-turns, the optimized design produces 4148 and 9682 gauss of magnetic field at the median plane, required to bend electrons of 6.5 and 15.5 MeV respectively, which then produce bremsstrahlung in a tungsten target. (author)
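
    For orientation only, the required bending field can be estimated from the magnetic rigidity relation Bρ = p/q. The sketch below evaluates it for the listed electron energies; the bending radius is an assumed value for illustration, whereas the paper optimizes the full three-magnet geometry in Lorentz-3EM.

```python
import numpy as np

M_E = 0.511        # electron rest energy, MeV
RHO = 0.05         # assumed bending radius in metres (illustrative only)

for T in [6.0, 9.0, 12.0, 15.0, 18.0]:     # kinetic energies, MeV
    p = np.sqrt(T**2 + 2.0 * T * M_E)      # relativistic momentum, MeV/c
    B = p / (299.792458 * RHO)             # rigidity: B[T] = p[MeV/c] / (299.8 * rho[m])
    print(f"{T:4.1f} MeV electron -> B = {B:.2f} T ({B * 1e4:.0f} gauss)")
```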

  9. Minimax Regression Quantiles

    DEFF Research Database (Denmark)

    Bache, Stefan Holst

    A new and alternative quantile regression estimator is developed and it is shown that the estimator is root-n-consistent and asymptotically normal. The estimator is based on a minimax ‘deviance function’ and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice, but whether it has theoretical justification is still an open question.
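
    For context, the sketch below runs the usual quantile regression estimator, to which the minimax estimator is shown to be asymptotically equivalent, at several quantile levels on heteroscedastic data; the minimax ‘deviance function’ estimator itself is not implemented here, and statsmodels is an assumed tool.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(6)
n = 500
x = rng.uniform(0, 10, n)
# Heavy-tailed, heteroscedastic noise makes the quantile slopes fan out.
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n) * (1 + 0.3 * x)

X = sm.add_constant(x)
for tau in (0.25, 0.50, 0.75):
    res = QuantReg(y, X).fit(q=tau)
    print(f"tau={tau:.2f}: intercept={res.params[0]:6.2f}, slope={res.params[1]:5.2f}")
```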

  10. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks, a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory-optimized C++ functions with an R interface ... for predicting the covariate-specific absolute risks, their confidence intervals, and their confidence bands based on right-censored time-to-event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by-product ... functionals. The software presented here is implemented in the riskRegression package.

  11. A novel system architecture for the national integration of electronic health records: a semi-centralized approach.

    Science.gov (United States)

    AlJarullah, Asma; El-Masri, Samir

    2013-08-01

    The goal of a national electronic health records integration system is to aggregate the electronic health records concerning a particular patient held at different healthcare providers' systems, to provide a complete medical history of the patient. It holds the promise of addressing the two most crucial challenges facing healthcare systems: improving healthcare quality and controlling costs. Typical approaches for the national integration of electronic health records are a centralized architecture and a distributed architecture. This paper proposes a new approach for the national integration of electronic health records, the semi-centralized approach, an intermediate solution between the centralized and distributed architectures that has the benefits of both. The semi-centralized approach is provided with a clearly defined architecture. The main data elements needed by the system are defined, and the main system modules that are necessary to achieve an effective and efficient functionality of the system are designed. Best practices and essential requirements are central to the evolution of the proposed architecture. The proposed architecture will provide the basis for designing the simplest and most effective systems to integrate electronic health records on a nation-wide basis, systems that maintain integrity and consistency across locations, time and systems, and that meet the challenges of interoperability, security, privacy, maintainability, mobility, availability, scalability, and load balancing.

  12. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.

  13. Hybridization approach to in-line and off-axis (electron) holography for superior resolution and phase sensitivity

    Science.gov (United States)

    Ozsoy-Keskinbora, C.; Boothroyd, C. B.; Dunin-Borkowski, R. E.; van Aken, P. A.; Koch, C. T.

    2014-01-01

    Holography - originally developed for correcting spherical aberration in transmission electron microscopes - is now used in a wide range of disciplines that involve the propagation of waves, including light optics, electron microscopy, acoustics and seismology. In electron microscopy, the two primary modes of holography are Gabor's original in-line setup and an off-axis approach that was developed subsequently. These two techniques are highly complementary, offering superior phase sensitivity at high and low spatial resolution, respectively. All previous investigations have focused on improving each method individually. Here, we show how the two approaches can be combined in a synergetic fashion to provide phase information with excellent sensitivity across all spatial frequencies, low noise and an efficient use of electron dose. The principle is also expected to be widely applicable to holography in light optics, X-ray optics, acoustics, ultrasound, terahertz imaging, etc. PMID:25387480

  14. Multiple linear regression analysis

    Science.gov (United States)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
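
    The FORTRAN source itself is not reproduced in this record; as a hedged illustration of the same stepwise idea, here is a small forward-selection routine in Python that keeps adding the most significant predictor until none passes the chosen confidence level.

```python
import numpy as np
import statsmodels.api as sm

def forward_select(X, y, alpha=0.05):
    """Greedy forward selection: repeatedly add the predictor whose
    coefficient has the smallest p-value, stopping when none is
    significant at the given level."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        pvals = {}
        for j in remaining:
            fit = sm.OLS(y, sm.add_constant(X[:, selected + [j]])).fit()
            pvals[j] = fit.pvalues[-1]    # p-value of the candidate term
        best = min(pvals, key=pvals.get)
        if pvals[best] > alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 1] - 2 * X[:, 4] + rng.normal(size=200)
print("selected columns:", forward_select(X, y))   # expect columns 1 and 4
```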

  15. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an

  16. Linear Regression Analysis

    CERN Document Server

    Seber, George A F

    2012-01-01

    Concise, mathematically clear, and comprehensive treatment of the subject.
    - Expanded coverage of diagnostics and methods of model fitting.
    - Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.
    - More than 200 problems throughout the book plus outline solutions for the exercises.
    - This revision has been extensively class-tested.

  17. Nonlinear Regression with R

    CERN Document Server

    Ritz, Christian; Parmigiani, Giovanni

    2009-01-01

    R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.
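
    A representative example of the book's subject, sketched in Python rather than R to keep a single language across the examples in this document: fitting a Michaelis-Menten saturation curve, a classic nonlinear model from biology and chemistry, by nonlinear least squares on simulated data.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    """Reaction velocity as a function of substrate concentration."""
    return vmax * s / (km + s)

rng = np.random.default_rng(8)
s = np.linspace(0.1, 10, 40)
v = michaelis_menten(s, vmax=2.0, km=1.5) + rng.normal(0, 0.05, s.size)

popt, pcov = curve_fit(michaelis_menten, s, v, p0=[1.0, 1.0])
perr = np.sqrt(np.diag(pcov))   # approximate standard errors
print(f"vmax = {popt[0]:.2f} +/- {perr[0]:.2f}, km = {popt[1]:.2f} +/- {perr[1]:.2f}")
```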

  18. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single-epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Bounded Gaussian process regression

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan

    2013-01-01

    We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We ... with the proposed explicit noise-model extension.
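
    For comparison, the sketch below runs standard GP regression with the noise on the dependent variable modeled explicitly through a white-noise kernel term; the bounded likelihood functions proposed in this record go beyond this baseline, and scikit-learn is an assumed tool.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(9)
X = rng.uniform(0, 10, 40).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.2, 40)

# The WhiteKernel term makes the observation noise an explicit,
# learnable part of the covariance model.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_test = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
for xt, m, s in zip(X_test.ravel(), mean, std):
    print(f"x={xt:4.1f}: mean={m:+.2f}, std={s:.2f}")
```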

  20. and Multinomial Logistic Regression

    African Journals Online (AJOL)

    This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).

  1. Gaussian Process Regression Model in Spatial Logistic Regression

    Science.gov (United States)

    Sofro, A.; Oktaviarina, A.

    2018-01-01

    Spatial analysis has developed very quickly in the last decade. One of the favorite approaches is based on the neighbourhood of the region. Unfortunately, there are some limitations such as difficulty in prediction. Therefore, we offer Gaussian process regression (GPR) to accommodate the issue. In this paper, we will focus on spatial modeling with GPR for binomial data with logit link function. The performance of the model will be investigated. We will discuss the inference of how to estimate the parameters and hyper-parameters and to predict as well. Furthermore, simulation studies will be explained in the last section.

  2. Advances in the MQDT approach of electron/molecular cation reactive collisions: High precision extensive calculations for applications

    Directory of Open Access Journals (Sweden)

    Motapon O.

    2015-01-01

    Full Text Available Recent advances in the stepwise multichannel quantum defect theory approach of electron/molecular cation reactive collisions have been applied to perform computations of cross sections and rate coefficients for dissociative recombination and electron-impact ro-vibrational transitions of H2+, BeH+ and their deuterated isotopomers. At very low energy, rovibronic interactions play a significant role in the dynamics, whereas at high energy, the dissociative excitation strongly competes with all other reactive processes.

  3. Organic solvent wetting properties of UV and plasma treated ZnO nanorods: printed electronics approach

    KAUST Repository

    Sliz, Rafal

    2012-09-13

    Due to low manufacturing costs, printed organic solar cells are on the short-list of renewable and environmentally-friendly energy production technologies of the future. However, electrode materials and each photoactive layer require different techniques and approaches. Printing technologies have attracted considerable attention for organic electronics due to their potentially high volume and low cost processing. A case in point is the interface between the substrate and the solution (ink) drop, which is a particularly critical issue for printing quality. In addition, methods such as UV, oxygen and argon plasma treatments have proven suitable for increasing the hydrophilicity of treated surfaces. Among several methods of measuring the ink-substrate interface, the simplest and most reliable is the contact angle method. In terms of nanoscale device applications, zinc oxide (ZnO) has gained popularity owing to its physical and chemical properties. In particular, there is growing interest in exploiting the unique properties that the so-called nanorod structure exhibits for future 1-dimensional opto-electronic devices. Applications such as photodiodes, thin-film transistors, sensors and photo-anodes in photovoltaic cells have already been demonstrated. This paper presents the wettability properties of ZnO nanorods treated with UV illumination, oxygen and argon plasma for various periods of time. Since this work concentrates on solar cell applications, four of the most common solutions used in organic solar cell manufacture were tested: P3HT:PCBM DCB, P3HT:PCBM CHB, PEDOT:PSS and water. The achieved results prove that different treatments change the contact angle differently. Moreover, solvent behaviour varied uniquely with the applied treatment.

  4. Nuclear-electronic orbital reduced explicitly correlated Hartree-Fock approach: Restricted basis sets and open-shell systems

    International Nuclear Information System (INIS)

    Brorsen, Kurt R.; Sirjoosingh, Andrew; Pak, Michael V.; Hammes-Schiffer, Sharon

    2015-01-01

    The nuclear electronic orbital (NEO) reduced explicitly correlated Hartree-Fock (RXCHF) approach couples select electronic orbitals to the nuclear orbital via Gaussian-type geminal functions. This approach is extended to enable the use of a restricted basis set for the explicitly correlated electronic orbitals and an open-shell treatment for the other electronic orbitals. The working equations are derived and the implementation is discussed for both extensions. The RXCHF method with a restricted basis set is applied to HCN and FHF− and is shown to agree quantitatively with results from RXCHF calculations with a full basis set. The number of many-particle integrals that must be calculated for these two molecules is reduced by over an order of magnitude with essentially no loss in accuracy, and the reduction factor will increase substantially for larger systems. Typically, the computational cost of RXCHF calculations with restricted basis sets will scale in terms of the number of basis functions centered on the quantum nucleus and the covalently bonded neighbor(s). In addition, the RXCHF method with an odd number of electrons that are not explicitly correlated to the nuclear orbital is implemented using a restricted open-shell formalism for these electrons. This method is applied to HCN+, and the nuclear densities are in qualitative agreement with grid-based calculations. Future work will focus on the significance of nonadiabatic effects in molecular systems and the further enhancement of the NEO-RXCHF approach to accurately describe such effects

  5. Nuclear-electronic orbital reduced explicitly correlated Hartree-Fock approach: Restricted basis sets and open-shell systems

    Energy Technology Data Exchange (ETDEWEB)

    Brorsen, Kurt R.; Sirjoosingh, Andrew; Pak, Michael V.; Hammes-Schiffer, Sharon, E-mail: shs3@illinois.edu [Department of Chemistry, University of Illinois at Urbana-Champaign, 600 South Mathews Ave., Urbana, Illinois 61801 (United States)

    2015-06-07

    The nuclear electronic orbital (NEO) reduced explicitly correlated Hartree-Fock (RXCHF) approach couples select electronic orbitals to the nuclear orbital via Gaussian-type geminal functions. This approach is extended to enable the use of a restricted basis set for the explicitly correlated electronic orbitals and an open-shell treatment for the other electronic orbitals. The working equations are derived and the implementation is discussed for both extensions. The RXCHF method with a restricted basis set is applied to HCN and FHF− and is shown to agree quantitatively with results from RXCHF calculations with a full basis set. The number of many-particle integrals that must be calculated for these two molecules is reduced by over an order of magnitude with essentially no loss in accuracy, and the reduction factor will increase substantially for larger systems. Typically, the computational cost of RXCHF calculations with restricted basis sets will scale in terms of the number of basis functions centered on the quantum nucleus and the covalently bonded neighbor(s). In addition, the RXCHF method with an odd number of electrons that are not explicitly correlated to the nuclear orbital is implemented using a restricted open-shell formalism for these electrons. This method is applied to HCN+, and the nuclear densities are in qualitative agreement with grid-based calculations. Future work will focus on the significance of nonadiabatic effects in molecular systems and the further enhancement of the NEO-RXCHF approach to accurately describe such effects.

  6. Electron and photon reconstruction and performance in ATLAS using a dynamical, topological cell clustering-based approach

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    The electron and photon reconstruction in ATLAS has moved towards the use of a dynamical, topological cell-based approach for cluster building, owing to advancements in the calibration procedure which allow such a method to be applied. The move to this new technique allows for improved measurements of electron and photon energies, particularly in situations where an electron radiates a bremsstrahlung photon, or a photon converts to an electron-positron pair. This note details the changes to the ATLAS electron and photon reconstruction software, and assesses its performance under current LHC luminosity conditions using simulated data. Changes to the converted photon reconstruction are also detailed, which improve the reconstruction efficiency of double-track converted photons, as well as reducing the reconstruction of spurious one-track converted photons. The performance of the new reconstruction algorithm is also presented in a number of important topologies relevant to precision Standard Model physics,...

  7. Photoemission in strongly correlated crystalline f-electron systems: A need for a new approach

    International Nuclear Information System (INIS)

    Arko, A.J.; Joyce, J.J.; Sarrao, J.

    1998-01-01

    The unusual properties of heavy fermion (or heavy electron) materials have sparked an avalanche of research over the last two decades aimed at understanding the basic phenomena responsible for these properties. Photoelectron spectroscopy (often referred to as PES in the following sections), the most direct measurement of the electronic structure of a material, should in principle be able to shed considerable light on this matter. In general, the distinction between a localized and a band-like state is trivially observed in band dispersion. Much of the past work was performed on polycrystalline samples, scraped in situ to expose a clean surface for PES. There have since been considerable advances both in the quality of specimens and in experimental resolution, which raise questions regarding earlier conclusions. Much of the past work on polycrystalline samples has been reported in several review articles, most notably Allen et al., and it is not necessary to review those efforts here again, with the exception of subsequent work performed at high resolution. The primary focus of the present review will be on new measurements obtained on single crystals, cleaved or prepared in situ and measured at high resolution, which seem to suggest that agreement with the GS and NCA approximations is less than perfect, and that perhaps the starting models need to be modified, or even that an entirely new approach is called for. Of the promising new models, the Periodic Anderson Model is most closely related to the SIM. Indeed, at high temperatures it reverts to the SIM. However, the charge polaron model of Liu (1997) as well as the two-electron band model of Sheng and Cooper (1995) cannot yet be ruled out. Inasmuch as the bulk of the single crystal work was performed by the Los Alamos group, this review will draw heavily on those results. Moreover, since the GS and NCA approximations represent the most comprehensive and widely accepted treatment of heavy fermion PES, it is only

  8. Dynamical simulation of electron transfer processes in self-assembled monolayers at metal surfaces using a density matrix approach

    Science.gov (United States)

    Prucker, V.; Bockstedte, M.; Thoss, M.; Coto, P. B.

    2018-03-01

    A single-particle density matrix approach is introduced to simulate the dynamics of heterogeneous electron transfer (ET) processes at interfaces. The characterization of the systems is based on a model Hamiltonian parametrized by electronic structure calculations and a partitioning method. The method is applied to investigate ET in a series of nitrile-substituted (poly)(p-phenylene)thiolate self-assembled monolayers adsorbed at the Au(111) surface. The results show a significant dependence of the ET on the orbital symmetry of the donor state and on the molecular and electronic structure of the spacer.

  9. Dynamical simulation of electron transfer processes in self-assembled monolayers at metal surfaces using a density matrix approach.

    Science.gov (United States)

    Prucker, V; Bockstedte, M; Thoss, M; Coto, P B

    2018-03-28

    A single-particle density matrix approach is introduced to simulate the dynamics of heterogeneous electron transfer (ET) processes at interfaces. The characterization of the systems is based on a model Hamiltonian parametrized by electronic structure calculations and a partitioning method. The method is applied to investigate ET in a series of nitrile-substituted (poly)(p-phenylene)thiolate self-assembled monolayers adsorbed at the Au(111) surface. The results show a significant dependence of the ET on the orbital symmetry of the donor state and on the molecular and electronic structure of the spacer.

  10. Time-dependent approach to electron scattering and ionization in the s-wave model

    International Nuclear Information System (INIS)

    Ihra, W.; Draeger, M.; Handke, G.; Friedrich, H.

    1995-01-01

    The time-dependent Schroedinger equation is integrated for continuum states of two-electron atoms in the framework of the s-wave model, in which both electrons are restricted to having vanishing individual orbital angular momenta. The method is suitable for studying the time evolution of correlations in the two-electron wave functions and yields probabilities for elastic and inelastic electron scattering and for electron-impact ionization. The spin-averaged probabilities for electron-impact ionization of hydrogen in the s-wave model reproduce the shape of the experimentally observed integrated ionization cross section remarkably well for energies near and above the maximum

  11. Simulation of electron spin resonance spectroscopy in diverse environments: An integrated approach

    Science.gov (United States)

    Zerbetto, Mirco; Polimeno, Antonino; Barone, Vincenzo

    2009-12-01

    We discuss in this work a new software tool, named E-SpiReS (Electron Spin Resonance Simulations), aimed at the interpretation of dynamical properties of molecules in fluids from electron spin resonance (ESR) measurements. The code implements an integrated computational approach (ICA) for the calculation of relevant molecular properties that are needed in order to obtain spectral lines. The protocol encompasses information from the atomistic level (quantum mechanical) to the coarse-grained level (hydrodynamical), and evaluates ESR spectra for rigid or flexible, single- or multi-labeled paramagnetic molecules in isotropic and ordered phases, based on a numerical solution of a stochastic Liouville equation. E-SpiReS automatically interfaces all the computational methodologies scheduled in the ICA in a way completely transparent to the user, who controls the whole calculation flow via a graphical interface. Parallelized algorithms are employed in order to allow running on calculation clusters, and a Java web applet has been developed with which it is possible to work from any operating system, avoiding problems of recompilation. E-SpiReS has been used in the study of a number of different systems, and two relevant cases are reported to underline the promising applicability of the ICA to complex systems and the importance of similar software tools in handling a laborious protocol.
    Program summary:
    Program title: E-SpiReS
    Catalogue identifier: AEEM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEM_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GPL v2.0
    No. of lines in distributed program, including test data, etc.: 311 761
    No. of bytes in distributed program, including test data, etc.: 10 039 531
    Distribution format: tar.gz
    Programming language: C (core programs) and Java (graphical interface)
    Computer: PC and Macintosh
    Operating system: Unix and Windows
    Has the code been vectorized or

  12. Subset selection in regression

    CERN Document Server

    Miller, Alan

    2002-01-01

Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the second edition: a separate chapter on Bayesian methods; a complete revision of the chapter on estimation; a major example from the field of near infrared spectroscopy; more emphasis on cross-validation; greater focus on bootstrapping; stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible; software available on the Internet for implementing many of the algorithms presented; and more examples. Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting…

  13. THE JEOPARDIZED SITUATION OF ELECTRONIC WASTE IN BANGLADESH: CAN CUSTOMIZED POLICY APPROACH SOLVE THE CHALLENGE?

    Directory of Open Access Journals (Sweden)

    Khalid Md. Bahauddin

    2016-01-01

Full Text Available Electronic waste (e-waste) is one of the fastest-growing pollution problems worldwide, given the presence of a variety of toxic substances which can contaminate the environment and threaten human health if disposal protocols are not meticulously managed. In Bangladesh, almost 2.7 million metric tons of e-waste are generated per year. Of this amount, only 20 to 30 percent is recycled; the rest is released into landfills, rivers, drains, lakes, canals and open spaces, which is very hazardous for health and the environment. Although Bangladesh is in the stream of rapid technological advancement, it has seldom taken the necessary steps to avoid a future jeopardized situation caused by e-waste. The current practices of e-waste management in Bangladesh suffer from a number of drawbacks, such as the difficulty of inventorisation, unhealthy conditions of informal recycling, inadequate legislation and policy, poor awareness, and reluctance on the part of the corporate sector to address the critical issues. The paper highlights the associated issues and strategies to address this emerging problem, and analyses the policy and its gaps. The paper also suggests that e-waste policy development may require a more customized approach where, instead of addressing e-waste in isolation, it is addressed as part of the national development agenda, integrating green economy assessment and strategic environmental assessment into national policy planning. Finally, this work suggests some alternative strategies and approaches to overcome the challenges of e-waste.

  14. Pathological assessment of liver fibrosis regression

    Directory of Open Access Journals (Sweden)

    WANG Bingqiong

    2017-03-01

    Full Text Available Hepatic fibrosis is the common pathological outcome of chronic hepatic diseases. An accurate assessment of fibrosis degree provides an important reference for a definite diagnosis of diseases, treatment decision-making, treatment outcome monitoring, and prognostic evaluation. At present, many clinical studies have proven that regression of hepatic fibrosis and early-stage liver cirrhosis can be achieved by effective treatment, and a correct evaluation of fibrosis regression has become a hot topic in clinical research. Liver biopsy has long been regarded as the gold standard for the assessment of hepatic fibrosis, and thus it plays an important role in the evaluation of fibrosis regression. This article reviews the clinical application of current pathological staging systems in the evaluation of fibrosis regression from the perspectives of semi-quantitative scoring system, quantitative approach, and qualitative approach, in order to propose a better pathological evaluation system for the assessment of fibrosis regression.

  15. Better Autologistic Regression

    Directory of Open Access Journals (Sweden)

    Mark A. Wolters

    2017-11-01

Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding—the two numbers used to represent the two possible states of the variables—might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
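
    The central claim, that coding choices produce distinct probability models, can be checked directly on a toy network. The sketch below (illustrative parameter values, standard uncentered form only) enumerates the joint distribution of a two-node autologistic model under (0, 1) and (minus one, plus one) coding and shows that the same parameters imply different marginal probabilities.

      import itertools
      import numpy as np

      def autologistic_joint(coding, alpha, lam, adjacency):
          # Exact joint pmf of a standard (uncentered) autologistic model:
          # P(z) proportional to exp(alpha*sum_i z_i + lam*sum_{i<j} A_ij z_i z_j).
          n = adjacency.shape[0]
          configs = list(itertools.product(coding, repeat=n))
          weights = []
          for z in configs:
              zv = np.array(z, dtype=float)
              pair = sum(adjacency[i, j] * zv[i] * zv[j]
                         for i in range(n) for j in range(i + 1, n))
              weights.append(np.exp(alpha * zv.sum() + lam * pair))
          w = np.array(weights)
          return configs, w / w.sum()

      # Two connected nodes; parameter values are illustrative only.
      A = np.array([[0, 1], [1, 0]])
      for coding in [(0, 1), (-1, 1)]:
          configs, p = autologistic_joint(coding, alpha=0.3, lam=0.5, adjacency=A)
          p_high = sum(pi for z, pi in zip(configs, p) if z[0] == coding[1])
          print(coding, "P(first variable in the high state) =", round(p_high, 4))

    The two codings yield different marginal probabilities under identical parameters, which is one face of the non-nestedness result reported above.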

  16. Regression in organizational leadership.

    Science.gov (United States)

    Kernberg, O F

    1979-02-01

The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.

  17. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.

  18. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect-great clarity.The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...

  19. Robust Estimation of Electron Density From Anatomic Magnetic Resonance Imaging of the Brain Using a Unifying Multi-Atlas Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Shangjie [Tianjin Key Laboratory of Process Measurement and Control, School of Electrical Engineering and Automation, Tianjin University, Tianjin (China); Department of Radiation Oncology, Stanford University School of Medicine, Palo Alto, California (United States); Hara, Wendy; Wang, Lei; Buyyounouski, Mark K.; Le, Quynh-Thu; Xing, Lei [Department of Radiation Oncology, Stanford University School of Medicine, Palo Alto, California (United States); Li, Ruijiang, E-mail: rli2@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Palo Alto, California (United States)

    2017-03-15

    Purpose: To develop a reliable method to estimate electron density based on anatomic magnetic resonance imaging (MRI) of the brain. Methods and Materials: We proposed a unifying multi-atlas approach for electron density estimation based on standard T1- and T2-weighted MRI. First, a composite atlas was constructed through a voxelwise matching process using multiple atlases, with the goal of mitigating effects of inherent anatomic variations between patients. Next we computed for each voxel 2 kinds of conditional probabilities: (1) electron density given its image intensity on T1- and T2-weighted MR images; and (2) electron density given its spatial location in a reference anatomy, obtained by deformable image registration. These were combined into a unifying posterior probability density function using the Bayesian formalism, which provided the optimal estimates for electron density. We evaluated the method on 10 patients using leave-one-patient-out cross-validation. Receiver operating characteristic analyses for detecting different tissue types were performed. Results: The proposed method significantly reduced the errors in electron density estimation, with a mean absolute Hounsfield unit error of 119, compared with 140 and 144 (P<.0001) using conventional T1-weighted intensity and geometry-based approaches, respectively. For detection of bony anatomy, the proposed method achieved an 89% area under the curve, 86% sensitivity, 88% specificity, and 90% accuracy, which improved upon intensity and geometry-based approaches (area under the curve: 79% and 80%, respectively). Conclusion: The proposed multi-atlas approach provides robust electron density estimation and bone detection based on anatomic MRI. If validated on a larger population, our work could enable the use of MRI as a primary modality for radiation treatment planning.
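
    A schematic, hedged rendering of the fusion step described above: per-voxel conditional densities from image intensity and from spatial location are combined, assuming conditional independence, into a single posterior over electron density. The two stand-in distributions below are invented for illustration and are not the paper's trained models.

      import numpy as np

      # Discretized electron-density axis (Hounsfield-unit-like scale; illustrative).
      rho = np.linspace(-1000.0, 2000.0, 512)
      drho = rho[1] - rho[0]

      def gauss(x, mu, sigma):
          return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

      # (1) p(density | T1/T2 intensity): stand-in likelihood peaked at soft tissue.
      p_intensity = gauss(rho, mu=40.0, sigma=80.0)

      # (2) p(density | spatial location in the composite atlas): stand-in prior
      #     suggesting the voxel sits near bone.
      p_location = 0.7 * gauss(rho, 700.0, 150.0) + 0.3 * gauss(rho, 40.0, 80.0)

      # Unifying posterior (up to normalization), assuming the two information
      # sources are conditionally independent given the true density.
      posterior = p_intensity * p_location
      posterior /= (posterior * drho).sum()

      estimate = (rho * posterior * drho).sum()   # posterior-mean electron density
      print(f"fused estimate: {estimate:.1f}")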

  20. An exciton approach to the excited states of two electron atoms. I Formalism and interpretation

    International Nuclear Information System (INIS)

    Schipper, P.E.

    1985-01-01

    The exciton model is formally applied to a description of the excited states of two electron atoms with the explicit inclusion of exchange. The model leads to a conceptually simple framework for the discussion of the electronic properties of the archetypical atomic electron pair

  1. Approaches and challenges to optimising primary care teams’ electronic health record usage

    Directory of Open Access Journals (Sweden)

    Nancy Pandhi

    2014-07-01

Full Text Available Background: Although the presence of an electronic health record (EHR) alone does not ensure high quality, efficient care, few studies have focused on the work of those charged with optimising use of existing EHR functionality. Objective: To examine the approaches used and challenges perceived by analysts supporting the optimisation of primary care teams’ EHR use at a large U.S. academic health care system. Methods: A qualitative study was conducted. Optimisation analysts and their supervisor were interviewed and data were analysed for themes. Results: Analysts needed to reconcile the tension created by organisational mandates focused on the standardisation of EHR processes with the primary care teams’ demand for EHR customisation. They gained an understanding of health information technology (HIT) leadership’s and primary care teams’ goals through attending meetings, reading meeting minutes and visiting with clinical teams. Within what was organisationally possible, EHR education could then be tailored to fit team needs. Major challenges were related to organisational attempts to standardise EHR use despite varied clinic contexts, personnel readiness and technical issues with the EHR platform. Forcing standardisation upon clinical needs that current EHR functionality could not satisfy was difficult. Conclusions: Dedicated optimisation analysts can add value to health systems through playing a mediating role between HIT leadership and care teams. Our findings imply that EHR optimisation should be performed with an in-depth understanding of the workflow, cognitive and interactional activities in primary care.

  2. Epidemic surveillance using an electronic medical record: an empiric approach to performance improvement.

    Directory of Open Access Journals (Sweden)

    Hongzhang Zheng

Full Text Available Electronic medical records (EMR) form a rich repository of information that could benefit public health. We asked how structured and free-text narrative EMR data should be combined to improve epidemic surveillance for acute respiratory infections (ARI). Eight previously characterized ARI case detection algorithms (CDA) were applied to historical EMR entries to create authentic time series of daily ARI case counts (background). An epidemic model simulated influenza cases (injection). From the time of the injection, cluster-detection statistics were applied daily on paired background+injection (combined) and background-only time series. This cycle was then repeated with the injection shifted to each week of the evaluation year. We computed: (a) the time from injection to the first statistical alarm uniquely found in the combined dataset (Detection Delay); (b) how often alarms originated in the background-only dataset (false-alarm rate, or FAR); and (c) the number of cases found within these false alarms (Caseload). For each CDA, we plotted the Detection Delay as a function of FAR or Caseload, over a broad range of alarm thresholds. CDAs that combined text analyses seeking ARI symptoms in clinical notes with provider-assigned diagnostic codes in order to maximize the precision rather than the sensitivity of case-detection lowered Detection Delay at any given FAR or Caseload. An empiric approach can guide the integration of EMR data into case-detection methods that improve both the timeliness and efficiency of epidemic detection.

  3. Electronic Cigarettes and Indoor Air Quality: A Simple Approach to Modeling Potential Bystander Exposures to Nicotine

    Science.gov (United States)

    Colard, Stéphane; O’Connell, Grant; Verron, Thomas; Cahours, Xavier; Pritchard, John D.

    2014-01-01

    There has been rapid growth in the use of electronic cigarettes (“vaping”) in Europe, North America and elsewhere. With such increased prevalence, there is currently a debate on whether the aerosol exhaled following the use of e-cigarettes has implications for the quality of air breathed by bystanders. Conducting chemical analysis of the indoor environment can be costly and resource intensive, limiting the number of studies which can be conducted. However, this can be modelled reasonably accurately based on empirical emissions data and using some basic assumptions. Here, we present a simplified model, based on physical principles, which considers aerosol propagation, dilution and extraction to determine the potential contribution of a single puff from an e-cigarette to indoor air. From this, it was then possible to simulate the cumulative effect of vaping over time. The model was applied to a virtual, but plausible, scenario considering an e-cigarette user and a non-user working in the same office space. The model was also used to reproduce published experimental studies and showed good agreement with the published values of indoor air nicotine concentration. With some additional refinements, such an approach may be a cost-effective and rapid way of assessing the potential exposure of bystanders to exhaled e-cigarette aerosol constituents. PMID:25547398
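
    As a hedged illustration of the kind of simplified model the abstract describes, the sketch below implements a single-compartment ("well-mixed room") balance between per-puff emission and first-order removal by ventilation; the room volume, air-exchange rate and per-puff nicotine emission are assumed placeholder values, not the paper's empirical emissions data.

      import numpy as np

      # Single-compartment ("well-mixed room") model. All numbers are assumed
      # placeholders, not measured emission or ventilation values.
      V = 50.0             # office volume (m^3)
      ach = 2.0            # air exchanges per hour (dilution/extraction)
      m_puff = 0.003       # nicotine exhaled per puff (mg), assumed
      puffs_per_hour = 15  # assumed vaping intensity

      dt = 1.0 / 60.0      # one-minute time step (hours)
      hours = 8.0
      c = 0.0              # nicotine concentration (mg/m^3)

      t = 0.0
      while t < hours:
          c += (m_puff * puffs_per_hour * dt) / V   # emission, spread over each hour
          c *= np.exp(-ach * dt)                    # first-order removal: dC/dt = -ach*C
          t += dt

      print(f"concentration after {hours:.0f} h: {c * 1000:.3f} ug/m^3")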

  4. Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study

    Science.gov (United States)

    Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh

    2018-03-01

A lack of problem-solving techniques and of cooperation between support groups are two obstacles frequently encountered on actual production lines. Insufficiently detailed analysis and inappropriate problem-solving techniques can cause recurring issues that affect the organization's performance. This study applies a well-structured Six Sigma DMAIC approach, in combination with other problem-solving tools, to a product quality problem in the manufacture of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to the scrap and rework performance. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreases dramatically, while the process capability index improves from 0.75 to 1.67. These results demonstrate that the Six Sigma approach used to tackle the quality problem is substantially effective.
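
    For reference, the process capability index quoted above can be computed as Cpk = min(USL − μ, μ − LSL) / (3σ). A minimal sketch with hypothetical stripping measurements and specification limits (all values invented for illustration):

      import numpy as np

      def cpk(samples, lsl, usl):
          # Process capability index: min(USL - mean, mean - LSL) / (3 * sigma).
          mu, sigma = np.mean(samples), np.std(samples, ddof=1)
          return min(usl - mu, mu - lsl) / (3.0 * sigma)

      # Hypothetical stripping-depth measurements (um) before/after improvement.
      rng = np.random.default_rng(0)
      before = rng.normal(50.0, 2.0, 200)   # wide spread -> low capability
      after = rng.normal(50.0, 0.9, 200)    # reduced variation after DMAIC actions

      print("Cpk before:", round(cpk(before, 45, 55), 2))
      print("Cpk after: ", round(cpk(after, 45, 55), 2))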

  5. A low-cost approach to electronic excitation energies based on the driven similarity renormalization group

    Science.gov (United States)

    Li, Chenyang; Verma, Prakash; Hannon, Kevin P.; Evangelista, Francesco A.

    2017-08-01

    We propose an economical state-specific approach to evaluate electronic excitation energies based on the driven similarity renormalization group truncated to second order (DSRG-PT2). Starting from a closed-shell Hartree-Fock wave function, a model space is constructed that includes all single or single and double excitations within a given set of active orbitals. The resulting VCIS-DSRG-PT2 and VCISD-DSRG-PT2 methods are introduced and benchmarked on a set of 28 organic molecules [M. Schreiber et al., J. Chem. Phys. 128, 134110 (2008)]. Taking CC3 results as reference values, mean absolute deviations of 0.32 and 0.22 eV are observed for VCIS-DSRG-PT2 and VCISD-DSRG-PT2 excitation energies, respectively. Overall, VCIS-DSRG-PT2 yields results with accuracy comparable to those from time-dependent density functional theory using the B3LYP functional, while VCISD-DSRG-PT2 gives excitation energies comparable to those from equation-of-motion coupled cluster with singles and doubles.

  6. Final Report - Composite Fermion Approach to Strongly Interacting Quasi Two Dimensional Electron Gas Systems

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, John

    2009-11-30

Work related to this project introduced the idea of an effective monopole strength Q* that acts as the effective angular momentum of the lowest shell of composite fermions (CF). This allowed us to predict the angular momentum of the lowest band of energy states for any value of the applied magnetic field simply by determining N_QP, the number of quasielectrons (QE) or quasiholes (QH) in a partially filled CF shell, and adding the angular momenta of the N_QP fermion excitations. The approach reported treated the filled CF level as a vacuum state which could support QE and QH excitations. Numerical diagonalization of small systems allowed us to determine the angular momenta, the energy, and the pair interaction energies of these elementary excitations. The spectra of low energy states could then be evaluated in a Fermi-liquid-like picture, treating the much smaller number of quasiparticles and their interactions instead of the larger system of N electrons with Coulomb interactions.

  7. Electronic Cigarettes and Indoor Air Quality: A Simple Approach to Modeling Potential Bystander Exposures to Nicotine

    Directory of Open Access Journals (Sweden)

    Stéphane Colard

    2014-12-01

Full Text Available There has been rapid growth in the use of electronic cigarettes (“vaping”) in Europe, North America and elsewhere. With such increased prevalence, there is currently a debate on whether the aerosol exhaled following the use of e-cigarettes has implications for the quality of air breathed by bystanders. Conducting chemical analysis of the indoor environment can be costly and resource intensive, limiting the number of studies which can be conducted. However, this can be modelled reasonably accurately based on empirical emissions data and using some basic assumptions. Here, we present a simplified model, based on physical principles, which considers aerosol propagation, dilution and extraction to determine the potential contribution of a single puff from an e-cigarette to indoor air. From this, it was then possible to simulate the cumulative effect of vaping over time. The model was applied to a virtual, but plausible, scenario considering an e-cigarette user and a non-user working in the same office space. The model was also used to reproduce published experimental studies and showed good agreement with the published values of indoor air nicotine concentration. With some additional refinements, such an approach may be a cost-effective and rapid way of assessing the potential exposure of bystanders to exhaled e-cigarette aerosol constituents.

  8. Data-driven approach for assessing utility of medical tests using electronic medical records.

    Science.gov (United States)

    Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram

    2015-02-01

To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to a second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, where the greatest gain from the test is 3-4 days after the index surgery. We have defined the information content in a data-driven and information-theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the cases we studied, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
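
    A hedged sketch of the underlying idea: estimate P(outcome | test value) with a kernel density classifier and score the test by the expected entropy reduction relative to the outcome base rate. The CRP-like data and base rate below are synthetic inventions, not taken from the study.

      import numpy as np
      from scipy.stats import gaussian_kde

      def expected_entropy_reduction(test_values, outcomes):
          # Information gain of a test for a binary outcome, with class-conditional
          # densities estimated by Gaussian KDE (a simple kernel density classifier).
          def entropy(p):
              p = np.clip(p, 1e-12, 1 - 1e-12)
              return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

          prior = outcomes.mean()
          kde_pos = gaussian_kde(test_values[outcomes == 1])
          kde_neg = gaussian_kde(test_values[outcomes == 0])

          # Posterior P(outcome = 1 | test value) via Bayes' rule, averaged over
          # the empirical distribution of observed test values.
          num = prior * kde_pos(test_values)
          post = num / (num + (1 - prior) * kde_neg(test_values))
          return entropy(prior) - entropy(post).mean()

      # Synthetic example: CRP-like values, higher in patients with the outcome.
      rng = np.random.default_rng(1)
      y = (rng.random(500) < 0.2).astype(int)
      crp = np.where(y == 1, rng.normal(120, 30, 500), rng.normal(60, 25, 500))
      print(f"information gain: {expected_entropy_reduction(crp, y):.3f} bits")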

  9. Correlated nuclear and electronic dynamics in photoionized systems studied by quantum and mixed quantum-classical approaches

    International Nuclear Information System (INIS)

    Li, Zheng

    2014-09-01

The advent of free electron lasers and high harmonic sources enables the investigation of electronic and nuclear dynamics of molecules and solids with atomic spatial resolution and femtosecond/attosecond time resolution, using bright and ultrashort laser pulses with frequencies from the terahertz to the hard x-ray range. With the help of ultrashort laser pulses, the nuclear and electronic dynamics can be initiated, monitored and actively controlled at the typical time scale in the femtosecond to attosecond realm. Meanwhile, theoretical tools are required to describe the underlying mechanisms. This doctoral thesis focuses on the development of theoretical tools based on the full quantum mechanical multiconfiguration time-dependent Hartree (MCTDH) method and mixed quantum-classical approaches, which can be applied to describe the dynamical behavior of gas phase molecules and strongly correlated solids in the presence of ultrashort laser pulses. In the first part of this thesis, the focus is on the motion of electron holes in gas phase molecular ions created by extreme ultraviolet (XUV) photoionization and monitored by spectroscopic approaches. The XUV photons create an electron hole in the valence orbitals of the molecule by photoionization; the electron hole, as a positively charged quasi-particle, can then interact with the nuclei and the remaining electrons, leading to coupled non-Born-Oppenheimer dynamics. I present our study of electron-hole relaxation dynamics in valence-ionized molecular ions of moderate size, using quantum wave packet and mixed quantum-classical approaches, with the photoionized [H(H2O)n]+ molecular ion as an example. We have shown that the coupled motion of the electron hole and the nuclei can be mapped out with femtosecond resolution by core-level x-ray transient absorption spectroscopy. Furthermore, in specific cases, the XUV photon can create a coherent electron hole that can maintain its coherence to time scales of ≈ 1 picosecond. Employing XUV pump - IR probe

  10. Pseudoclassical approach to electron and ion density correlations in simple liquid metals

    International Nuclear Information System (INIS)

    Vericat, F.; Tosi, M.P.; Pastore, G.

    1986-04-01

    Electron-electron and electron-ion structural correlations in simple liquid metals are treated by using effective pair potentials to incorporate quantal effects into a pseudoclassical description of the electron fluid. An effective pair potential between simultaneous electron density fluctuations is first constructed from known properties of the degenerate jellium model, which are the plasmon sum rule, the Kimball-Niklasson relation and Yasuhara's values of the electron pair distribution function at contact. An analytic expression is thereby obtained in the Debye-Hueckel approximation for the electronic structure factor in jellium over a range of density appropriate to metals, with results which compare favourably with those of fully quantal evaluations. A simple pseudoclassical model is then set up for a liquid metal: this involves a model of charged hard spheres for the ion-ion potential and an empty core model for the electron-ion potential, the Coulombic tails being scaled as required by the relation between the long-wavelength partial structure factors and the isothermal compressibility of the metal. The model is solved analytically by a pseudoclassical linear response treatment of the electron-ion coupling and numerical results are reported for partial structure factors in liquid sodium and liquid beryllium. Contact is made for the latter system with data on the electron-electron structure factor in the crystal from inelastic X-ray scattering experiments of Eisenberger, Marra and Brown. (author)

  11. A partitioned correlation function interaction approach for describing electron correlation in atoms

    International Nuclear Information System (INIS)

    Verdebout, S; Godefroid, M; Rynkun, P; Jönsson, P; Gaigalas, G; Fischer, C Froese

    2013-01-01

    The traditional multiconfiguration Hartree–Fock (MCHF) and configuration interaction (CI) methods are based on a single orthonormal orbital basis. For atoms with many closed core shells, or complicated shell structures, a large orbital basis is needed to saturate the different electron correlation effects such as valence, core–valence and correlation within the core shells. The large orbital basis leads to massive configuration state function (CSF) expansions that are difficult to handle, even on large computer systems. We show that it is possible to relax the orthonormality restriction on the orbital basis and break down the originally very large calculations into a series of smaller calculations that can be run in parallel. Each calculation determines a partitioned correlation function (PCF) that accounts for a specific correlation effect. The PCFs are built on optimally localized orbital sets and are added to a zero-order multireference (MR) function to form a total wave function. The expansion coefficients of the PCFs are determined from a low dimensional generalized eigenvalue problem. The interaction and overlap matrices are computed using a biorthonormal transformation technique (Verdebout et al 2010 J. Phys. B: At. Mol. Phys. 43 074017). The new method, called partitioned correlation function interaction (PCFI), converges rapidly with respect to the orbital basis and gives total energies that are lower than the ones from ordinary MCHF and CI calculations. The PCFI method is also very flexible when it comes to targeting different electron correlation effects. Focusing our attention on neutral lithium, we show that by dedicating a PCF to the single excitations from the core, spin- and orbital-polarization effects can be captured very efficiently, leading to highly improved convergence patterns for hyperfine parameters compared with MCHF calculations based on a single orthogonal radial orbital basis. By collecting separately optimized PCFs to correct the

  12. A partitioned correlation function interaction approach for describing electron correlation in atoms

    Science.gov (United States)

    Verdebout, S.; Rynkun, P.; Jönsson, P.; Gaigalas, G.; Froese Fischer, C.; Godefroid, M.

    2013-04-01

    The traditional multiconfiguration Hartree-Fock (MCHF) and configuration interaction (CI) methods are based on a single orthonormal orbital basis. For atoms with many closed core shells, or complicated shell structures, a large orbital basis is needed to saturate the different electron correlation effects such as valence, core-valence and correlation within the core shells. The large orbital basis leads to massive configuration state function (CSF) expansions that are difficult to handle, even on large computer systems. We show that it is possible to relax the orthonormality restriction on the orbital basis and break down the originally very large calculations into a series of smaller calculations that can be run in parallel. Each calculation determines a partitioned correlation function (PCF) that accounts for a specific correlation effect. The PCFs are built on optimally localized orbital sets and are added to a zero-order multireference (MR) function to form a total wave function. The expansion coefficients of the PCFs are determined from a low dimensional generalized eigenvalue problem. The interaction and overlap matrices are computed using a biorthonormal transformation technique (Verdebout et al 2010 J. Phys. B: At. Mol. Phys. 43 074017). The new method, called partitioned correlation function interaction (PCFI), converges rapidly with respect to the orbital basis and gives total energies that are lower than the ones from ordinary MCHF and CI calculations. The PCFI method is also very flexible when it comes to targeting different electron correlation effects. Focusing our attention on neutral lithium, we show that by dedicating a PCF to the single excitations from the core, spin- and orbital-polarization effects can be captured very efficiently, leading to highly improved convergence patterns for hyperfine parameters compared with MCHF calculations based on a single orthogonal radial orbital basis. By collecting separately optimized PCFs to correct the MR

  13. SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (SMARTE): OVERVIEW AND DEMONSTRATION FOR FINAL PHASE 3 CONFERENCE

    Science.gov (United States)

    The U.S. contingent of the U.S.-German Bilateral Working Group is developing Sustainable Management Approaches and Revitalization Tools-electronic (SMARTe). SMARTe is a web-based, decision support system designed to assist stakeholders in developing and evaluating alternative reu...

  14. Mechanisms before Reactions: A Mechanistic Approach to the Organic Chemistry Curriculum Based on Patterns of Electron Flow

    Science.gov (United States)

    Flynn, Alison B.; Ogilvie, William W.

    2015-01-01

    A significant redesign of the introductory organic chemistry curriculum at the authors' institution is described. There are two aspects that differ greatly from a typical functional group approach. First, organic reaction mechanisms and the electron-pushing formalism are taught before students have learned a single reaction. The conservation of…

  15. Steganalysis using logistic regression

    Science.gov (United States)

    Lubenko, Ivans; Ker, Andrew D.

    2011-02-01

We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing accuracy and speed of SVM and LR classifiers in detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, in three image sets.
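
    The probability-output advantage mentioned above is easy to demonstrate. The sketch below uses scikit-learn on synthetic stand-in features (not the 686-dimensional SPAM set) and is illustrative only.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      # Synthetic stand-in for steganalysis features (cover = 0, stego = 1).
      rng = np.random.default_rng(0)
      n, d = 2000, 20
      X = rng.normal(size=(n, d))
      w = rng.normal(size=d)
      y = (X @ w + rng.normal(scale=2.0, size=n) > 0).astype(int)

      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

      clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
      proba = clf.predict_proba(Xte)[:, 1]   # class probabilities, not just labels
      print("accuracy:", (clf.predict(Xte) == yte).mean().round(3))
      print("first five P(stego):", proba[:5].round(3))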

  16. SEPARATION PHENOMENA LOGISTIC REGRESSION

    Directory of Open Access Journals (Sweden)

    Ikaro Daniel de Carvalho Barreto

    2014-03-01

Full Text Available This paper applies concepts from maximum likelihood estimation of the binomial logistic regression model to the phenomenon of separation. Separation generates bias in the estimation, leads to different interpretations of the estimates under the different statistical tests (Wald, Likelihood Ratio and Score), and yields different estimates under the different iterative methods (Newton-Raphson and Fisher Scoring). We also present an example that demonstrates the direct implications for the validation of the model and of its variables, and the implications for estimates of odds ratios and confidence intervals generated from the Wald statistics. Furthermore, we briefly present the Firth correction, which circumvents the phenomenon of separation.
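
    A minimal illustration of the separation phenomenon: when a covariate perfectly separates the two classes, the unpenalized maximum likelihood estimate does not exist, and the fitted slope diverges as regularization is relaxed. The toy data below are invented; Firth's penalized likelihood (not implemented here) is one remedy.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Perfectly separated toy data: y = 1 exactly when x > 0.
      x = np.array([-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]).reshape(-1, 1)
      y = (x.ravel() > 0).astype(int)

      # As the ridge penalty is relaxed (C grows), the slope estimate diverges:
      # under separation the unpenalized MLE is infinite.
      for C in [1.0, 100.0, 10000.0]:
          clf = LogisticRegression(C=C, max_iter=10000).fit(x, y)
          print(f"C = {C:>8.0f}  slope = {clf.coef_[0][0]:10.3f}")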

  17. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface. … As a by-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical…

  18. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms…
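
    A simplified sketch of the adaptive-metric idea: Nadaraya-Watson kernel regression with one bandwidth per input dimension, with the metric chosen to minimize validation error. A grid search stands in here for the gradient-based minimization of a cross-validation estimate used in the paper, and the data are synthetic.

      import numpy as np
      from itertools import product

      def nw_predict(Xtr, ytr, Xte, bandwidths):
          # Nadaraya-Watson regression with a diagonal input metric
          # (one Gaussian bandwidth per dimension).
          d2 = ((Xte[:, None, :] - Xtr[None, :, :]) / bandwidths) ** 2
          w = np.exp(-0.5 * d2.sum(axis=2))
          return (w @ ytr) / w.sum(axis=1)

      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(300, 2))
      y = np.sin(3 * X[:, 0]) + rng.normal(scale=0.1, size=300)  # dim 2 is irrelevant

      # Adapt the metric by validation over a small bandwidth grid.
      tr, va = np.arange(200), np.arange(200, 300)
      best = min(product([0.05, 0.2, 1.0, 5.0], repeat=2),
                 key=lambda bw: np.mean((nw_predict(X[tr], y[tr], X[va], np.array(bw))
                                         - y[va]) ** 2))
      print("selected bandwidths:", best)  # expect: small for dim 1, large for dim 2

    A large selected bandwidth effectively removes a dimension from the metric, which is why the approach doubles as variable selection, as in the task described above.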

  19. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard…

  20. Primary processes of the electron-protic species coupling in pure aqueous phases: - femtosecond laser spectroscopy study; - quantum approach of the electron-water interaction

    International Nuclear Information System (INIS)

    Pommeret, Stanislas

    1991-01-01

This thesis deals with the coupling mechanisms between an electron and water molecules or protic species (the hydronium ion, the hydroxyl radical). Two complementary studies have been carried out in pure aqueous phases. The first concerns the structural aspect of the hydrated electron, which is studied via a semi-quantum approach (the split-operator method). The results indicate the importance of the second hydration shell in the localisation of an electron at 77 and 300 Kelvin. The second part of this work addresses the dynamics of the primary processes in light or heavy water at room temperature: the ion-molecule reaction, radical pair formation, and the geminate recombination of the hydrated electron with the hydronium ion and the hydroxyl radical. The dynamics of these reactions are studied by time-resolved absorption spectroscopy from the near infrared to the near ultraviolet, with a temporal precision of a few tens of femtoseconds. The analysis of the primary processes takes into account the protic properties of water molecules. (author) [fr]

  1. Photochemical approach to high-barrier films for the encapsulation of flexible laminary electronic devices

    Energy Technology Data Exchange (ETDEWEB)

    Prager, L., E-mail: lutz.prager@iom-leipzig.de [Leibniz-Institut für Oberflächenmodifizierung e.V., Permoserstr. 15, 04318 Leipzig (Germany); Helmstedt, U. [Leibniz-Institut für Oberflächenmodifizierung e.V., Permoserstr. 15, 04318 Leipzig (Germany); Herrnberger, H. [Solarion AG, Pereser Höhe 1, Breitscheidstraße 45, 04442 Zwenkau (Germany); Kahle, O. [Fraunhofer-Einrichtung für Polymermaterialien und Composite PYCO, Kantstraße 55, 14513 Teltow (Germany); Kita, F. [AZ Electronic Materials Germany GmbH, Rheingaustraße 190-196, 65203 Wiesbaden (Germany); Münch, M. [Solarion AG, Pereser Höhe 1, Breitscheidstraße 45, 04442 Zwenkau (Germany); Pender, A.; Prager, A.; Gerlach, J.W. [Leibniz-Institut für Oberflächenmodifizierung e.V., Permoserstr. 15, 04318 Leipzig (Germany); Stasiak, M. [Fraunhofer-Einrichtung für Polymermaterialien und Composite PYCO, Kantstraße 55, 14513 Teltow (Germany)

    2014-11-03

Based on results of preceding research and development, thin gas barriers were made by wet application of perhydropolysilazane solution onto polymer films and its subsequent photo-initiated conversion to dense silica layers applying vacuum ultraviolet irradiation. Compared to the state of the art, these layers were sufficiently improved and characterized by spectroscopic methods, by scanning electron microscopy and by gas permeation measurements. Water vapor transmission rates (WVTR) below 10⁻² g m⁻² d⁻¹ were achieved. In this way, single barrier films were developed and produced on a pilot plant from roll to roll, 250 mm wide, at speeds up to 10 m min⁻¹. Two films were laminated using adhesives curable with ultraviolet (UV) light and evaluated by peel tests, gas permeation measurement and climate testing. It could be shown that the described high-barrier laminates, which exhibit WVTR ≈ 5 × 10⁻⁴ g m⁻² d⁻¹ as determined by the calcium mirror method, are suitable for encapsulation of flexible thin-film photovoltaic modules. Durability of the encapsulated modules could be verified in several climate tests including damp-heat, thermo-cycle (heating, freezing, wetting) and UV exposures which are equivalent to more than 20 years of endurance at outdoor conditions in a temperate climate. In the frame of further research and technical development it seems possible to design a cost-efficient industrial-scale process for the production of encapsulation films for photovoltaic applications. - Highlights: • Dense silica barrier layers were developed by a photochemical approach. • Polymer-based barrier films were laminated, yielding flexible high-barrier films. • Using these laminates, photovoltaic test modules were encapsulated and tested. • A durability of more than 20 years at outdoor conditions could be proved.

  2. A deep convolutional neural network approach to single-particle recognition in cryo-electron microscopy.

    Science.gov (United States)

    Zhu, Yanan; Ouyang, Qi; Mao, Youdong

    2017-07-21

Single-particle cryo-electron microscopy (cryo-EM) has become a mainstream tool for the structural determination of biological macromolecular complexes. However, high-resolution cryo-EM reconstruction often requires hundreds of thousands of single-particle images. Particle extraction from experimental micrographs thus can be laborious and presents a major practical bottleneck in cryo-EM structural determination. Existing computational methods for particle picking often use low-resolution templates for particle matching, making them susceptible to reference-dependent bias. It is critical to develop a highly efficient template-free method for the automatic recognition of particle images from cryo-EM micrographs. We developed a deep learning-based algorithmic framework, DeepEM, for single-particle recognition from noisy cryo-EM micrographs, enabling automated particle picking, selection and verification in an integrated fashion. The kernel of DeepEM is built upon a convolutional neural network (CNN) composed of eight layers, which can be recursively trained to be highly "knowledgeable". Our approach exhibits an improved performance and accuracy when tested on the standard KLH dataset. Application of DeepEM to several challenging experimental cryo-EM datasets demonstrated its ability to avoid the selection of unwanted particles and non-particles even when true particles contain fewer features. The DeepEM methodology, derived from a deep CNN, allows automated particle extraction from raw cryo-EM micrographs in the absence of a template. It demonstrates an improved performance, objectivity and accuracy. Application of this novel method is expected to free the labor involved in single-particle verification, significantly improving the efficiency of cryo-EM data processing.
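
    For orientation, a hedged sketch of a DeepEM-style binary particle/non-particle classifier is given below. The exact DeepEM architecture is not specified in this record, so the layer sizes, kernel widths and patch size are illustrative assumptions only.

      import torch
      import torch.nn as nn

      # Eight-layer CNN (illustrative sizes) mapping a micrograph patch to
      # particle / non-particle scores.
      model = nn.Sequential(
          nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(),   # 1: low-level features
          nn.MaxPool2d(2),                             # 2
          nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(),  # 3
          nn.MaxPool2d(2),                             # 4
          nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),  # 5
          nn.AdaptiveAvgPool2d(4),                     # 6
          nn.Flatten(),
          nn.Linear(64 * 4 * 4, 128), nn.ReLU(),       # 7
          nn.Linear(128, 2),                           # 8: particle vs non-particle
      )

      # One forward pass on a fake batch of 64x64 micrograph patches.
      patches = torch.randn(8, 1, 64, 64)
      probs = torch.softmax(model(patches), dim=1)
      print(probs.shape)   # torch.Size([8, 2])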

  3. Understanding key factors affecting electronic medical record implementation: a sociotechnical approach.

    Science.gov (United States)

    Cucciniello, Maria; Lapsley, Irvine; Nasi, Greta; Pagliari, Claudia

    2015-07-17

    Recent health care policies have supported the adoption of Information and Communication Technologies (ICT) but examples of failed ICT projects in this sector have highlighted the need for a greater understanding of the processes used to implement such innovations in complex organizations. This study examined the interaction of sociological and technological factors in the implementation of an Electronic Medical Record (EMR) system by a major national hospital. It aimed to obtain insights for managers planning such projects in the future and to examine the usefulness of Actor Network Theory (ANT) as a research tool in this context. Case study using documentary analysis, interviews and observations. Qualitative thematic analysis drawing on ANT. Qualitative analyses revealed a complex network of interactions between organizational stakeholders and technology that helped to shape the system and influence its acceptance and adoption. The EMR clearly emerged as a central 'actor' within this network. The results illustrate how important it is to plan innovative and complex information systems with reference to (i) the expressed needs and involvement of different actors, starting from the initial introductory phase; (ii) promoting commitment to the system and adopting a participative approach; (iii) defining and resourcing new roles within the organization capable of supporting and sustaining the change and (iv) assessing system impacts in order to mobilize the network around a common goal. The paper highlights the organizational, cultural, technological, and financial considerations that should be taken into account when planning strategies for the implementation of EMR systems in hospital settings. It also demonstrates how ANT may be usefully deployed in evaluating such projects.

  4. Real-space grids and the Octopus code as tools for the development of new simulation approaches for electronic systems

    Science.gov (United States)

    Andrade, Xavier; Strubbe, David; De Giovannini, Umberto; Larsen, Ask Hjorth; Oliveira, Micael J. T.; Alberdi-Rodriguez, Joseba; Varas, Alejandro; Theophilou, Iris; Helbig, Nicole; Verstraete, Matthieu J.; Stella, Lorenzo; Nogueira, Fernando; Aspuru-Guzik, Alán; Castro, Alberto; Marques, Miguel A. L.; Rubio, Angel

Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage at the time of implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.

  5. Real-space local polynomial basis for solid-state electronic-structure calculations: A finite-element approach

    International Nuclear Information System (INIS)

    Pask, J.E.; Klein, B.M.; Fong, C.Y.; Sterne, P.A.

    1999-01-01

    We present an approach to solid-state electronic-structure calculations based on the finite-element method. In this method, the basis functions are strictly local, piecewise polynomials. Because the basis is composed of polynomials, the method is completely general and its convergence can be controlled systematically. Because the basis functions are strictly local in real space, the method allows for variable resolution in real space; produces sparse, structured matrices, enabling the effective use of iterative solution methods; and is well suited to parallel implementation. The method thus combines the significant advantages of both real-space-grid and basis-oriented approaches and so promises to be particularly well suited for large, accurate ab initio calculations. We develop the theory of our approach in detail, discuss advantages and disadvantages, and report initial results, including electronic band structures and details of the convergence of the method. copyright 1999 The American Physical Society
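
    A one-dimensional, hedged illustration of the ingredients named above: strictly local piecewise-linear (hat) basis functions yield sparse, tridiagonal Hamiltonian and overlap matrices, and energies follow from the generalized eigenproblem Hc = ESc. A particle in a box (hbar = m = 1) stands in for the solid-state problem:

      import numpy as np
      from scipy.linalg import eigh

      # 1D particle in a box on [0, L] with linear finite elements (hat functions).
      L, n_elem = 1.0, 200
      h = L / n_elem
      n = n_elem - 1   # interior nodes (wavefunction pinned to zero at the walls)

      # Standard P1 element matrices: the kinetic (stiffness) and overlap (mass)
      # matrices are sparse and tridiagonal because the basis is strictly local.
      K = (np.diag(np.full(n, 2.0 / h))
           + np.diag(np.full(n - 1, -1.0 / h), 1)
           + np.diag(np.full(n - 1, -1.0 / h), -1))
      S = (np.diag(np.full(n, 4.0 * h / 6.0))
           + np.diag(np.full(n - 1, h / 6.0), 1)
           + np.diag(np.full(n - 1, h / 6.0), -1))
      H = 0.5 * K   # kinetic energy only; V = 0 inside the box

      # Generalized eigenvalue problem H c = E S c.
      E, C = eigh(H, S)
      exact = np.array([(k * np.pi) ** 2 / 2.0 for k in range(1, 4)])
      print("FEM:  ", E[:3].round(4))
      print("exact:", exact.round(4))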

  6. Aid and growth regressions

    DEFF Research Database (Denmark)

    Hansen, Henrik; Tarp, Finn

    2001-01-01

This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross-country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on ‘good’ policy. … investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regression is used for policy purposes.

  7. Communication: Multireference equation of motion coupled cluster: A transform and diagonalize approach to electronic structure

    Czech Academy of Sciences Publication Activity Database

    Nooijen, M.; Demel, Ondřej; Datta, D.; Kong, L.; Shamasundar, K. R.; Lotrich, V.; Huntington, L. M.; Neese, F.

    2014-01-01

    Roč. 140, č. 8 (2014), 081102 ISSN 0021-9606 R&D Projects: GA ČR GPP208/10/P041; GA ČR GAP208/11/2222 Institutional support: RVO:61388955 Keywords : Electronic states * Electronic structure * Equations of motion Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.952, year: 2014

  8. Connecting Print and Electronic Titles: An Integrated Approach at the University of Nebraska-Lincoln

    Science.gov (United States)

    Wolfe, Judith; Konecky, Joan Latta; Boden, Dana W. R.

    2011-01-01

    Libraries make heavy investments in electronic resources, with many of these resources reflecting title changes, bundled subsets, or content changes of formerly print material. These changes can distance the electronic format from its print origins, creating discovery and access issues. A task force was formed to explore the enhancement of catalog…

  9. Regression of environmental noise in LIGO data

    International Nuclear Information System (INIS)

    Tiwari, V; Klimenko, S; Mitselmakher, G; Necula, V; Drago, M; Prodi, G; Frolov, V; Yakushin, I; Re, V; Salemi, F; Vedovato, G

    2015-01-01

    We address the problem of noise regression in the output of gravitational-wave (GW) interferometers, using data from the physical environmental monitors (PEM). The objective of the regression analysis is to predict environmental noise in the GW channel from the PEM measurements. One of the most promising regression methods is based on the construction of Wiener–Kolmogorov (WK) filters. Using this method, the seismic noise cancellation from the LIGO GW channel has already been performed. In the presented approach the WK method has been extended, incorporating banks of Wiener filters in the time–frequency domain, multi-channel analysis and regulation schemes, which greatly enhance the versatility of the regression analysis. Also we present the first results on regression of the bi-coherent noise in the LIGO data. (paper)
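
    A hedged, single-witness sketch of the Wiener-Kolmogorov idea: estimate the witness-to-target transfer function from cross- and auto-spectra, predict the environmental contribution to the target channel, and subtract it. The synthetic coupling filter and noise levels below are invented; the actual analysis uses banks of time-frequency Wiener filters over many PEM channels.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(0)
      fs, n = 1024, 2**16

      # Witness (PEM) channel, and a target containing filtered witness noise
      # plus independent "GW channel" noise.
      witness = rng.normal(size=n)
      b, a = signal.butter(4, 0.1)                 # unknown coupling, to be estimated
      coupled = signal.lfilter(b, a, witness)
      target = coupled + 0.5 * rng.normal(size=n)

      # Wiener-Kolmogorov estimate of the coupling: H(f) = S_wt(f) / S_ww(f).
      f, s_ww = signal.csd(witness, witness, fs=fs, nperseg=1024)
      _, s_wt = signal.csd(witness, target, fs=fs, nperseg=1024)
      H = s_wt / s_ww

      # Apply H to the witness in the frequency domain and subtract the prediction.
      W = np.fft.rfft(witness)
      freqs = np.fft.rfftfreq(n, d=1 / fs)
      H_interp = np.interp(freqs, f, H.real) + 1j * np.interp(freqs, f, H.imag)
      prediction = np.fft.irfft(W * H_interp, n)

      print("original RMS:", np.std(target).round(3))
      print("residual RMS:", np.std(target - prediction).round(3))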

  10. New real space correlated-basis-functions approach for the electron correlations of the semiconductor inversion layer

    International Nuclear Information System (INIS)

    Feng Weiguo; Wang Hongwei; Wu Xiang

    1989-12-01

Based on the real-space Correlated-Basis-Functions theory and the collective oscillation behaviour of the electron gas with effective Coulomb interaction, the many-body wave function is obtained for the quasi-two-dimensional electron system in the semiconductor inversion layer. The pair-correlation function and the correlation energy of the system have been calculated by the integro-differential method in this paper. A comparison with previous theoretical results is also made. The new theoretical approach and its numerical results show that the pair-correlation functions are definitely positive and satisfy the normalization condition. (author). 10 refs, 2 figs

  11. There is No Quantum Regression Theorem

    International Nuclear Information System (INIS)

    Ford, G.W.; OConnell, R.F.

    1996-01-01

    The Onsager regression hypothesis states that the regression of fluctuations is governed by macroscopic equations describing the approach to equilibrium. It is here asserted that this hypothesis fails in the quantum case. This is shown first by explicit calculation for the example of quantum Brownian motion of an oscillator and then in general from the fluctuation-dissipation theorem. It is asserted that the correct generalization of the Onsager hypothesis is the fluctuation-dissipation theorem. copyright 1996 The American Physical Society

  12. Marginal longitudinal semiparametric regression via penalized splines

    KAUST Repository

    Al Kadiri, M.

    2010-08-01

    We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been proposed, a relative simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.
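
    A minimal frequentist sketch of the penalized-spline core (a truncated-line basis with a ridge penalty on the knot coefficients); the marginal longitudinal machinery, working correlation, and the Gibbs/BUGS implementation described above are deliberately omitted, and all data are synthetic.

      import numpy as np

      def penalized_spline_fit(x, y, n_knots=20, lam=1.0):
          # Penalized spline regression: truncated-line basis with a ridge
          # penalty on the knot coefficients (one simple form of P-splines).
          knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
          X = np.column_stack([np.ones_like(x), x] +
                              [np.maximum(x - k, 0.0) for k in knots])
          D = np.diag([0.0, 0.0] + [1.0] * n_knots)  # penalize knot terms only
          beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
          return knots, beta

      def penalized_spline_predict(xnew, knots, beta):
          X = np.column_stack([np.ones_like(xnew), xnew] +
                              [np.maximum(xnew - k, 0.0) for k in knots])
          return X @ beta

      rng = np.random.default_rng(0)
      x = np.sort(rng.uniform(0, 1, 200))
      y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=200)
      knots, beta = penalized_spline_fit(x, y, lam=0.1)
      fit = penalized_spline_predict(x, knots, beta)
      print("max abs error on grid:",
            np.abs(fit - np.sin(2 * np.pi * x)).max().round(2))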

  13. Marginal longitudinal semiparametric regression via penalized splines

    KAUST Repository

    Al Kadiri, M.; Carroll, R.J.; Wand, M.P.

    2010-01-01

    We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been proposed, a relative simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.

  14. An Open Architecture Framework for Electronic Warfare Based Approach to HLA Federate Development

    Directory of Open Access Journals (Sweden)

    HyunSeo Kang

    2018-01-01

Full Text Available A variety of electronic warfare models are developed in the Electronic Warfare Research Center. An Open Architecture Framework for Electronic Warfare (OAFEw) has been developed for the reusability of the various object models participating in electronic warfare simulation and for the extensibility of the electronic warfare simulator. OAFEw is a kind of component-based software (SW) lifecycle management support framework, defined by six components and ten rules. The purpose of this study is to construct a Distributed Simulation Interface Model according to the rules of OAFEw, and to create a Use Case Model of the OAFEw Reference Conceptual Model version 1.0. This is embodied in the OAFEw-FOM (Federate Object Model) for High-Level Architecture (HLA)-based distributed simulation. We therefore design and implement an EW real-time distributed simulation that can work with models in C++ and the MATLAB API (Application Programming Interface). In addition, the OAFEw-FOM, an electronic component model, and an electronic warfare domain scenario were designed through simple scenarios for verification, and real-time distributed simulation between C++ and MATLAB was performed through the OAFEw Distributed Simulation Interface.

  15. Charge transfer dynamics from adsorbates to surfaces with single active electron and configuration interaction based approaches

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Raghunathan, E-mail: r.ramakrishnan@unibas.ch [Institute of Physical Chemistry, National Center for Computational Design and Discovery of Novel Materials (MARVEL), Department of Chemistry, University of Basel, Klingelbergstrasse 80, CH-4056 Basel (Switzerland); Nest, Mathias [Theoretische Chemie, Technische Universität München, Lichtenbergstr. 4, 85747 Garching (Germany)

    2015-01-13

Highlights: • We model electron dynamics across cyano alkanethiolates attached to a gold cluster. • We present electron transfer time scales from TD-DFT and TD-CI based simulations. • Both DFT and CI methods qualitatively predict the trend in time scales. • TD-CI predicts the experimental relative time scale very accurately. - Abstract: We employ wavepacket simulations based on many-body time-dependent configuration interaction (TD-CI) and single active electron theories to predict the ultrafast molecule/metal electron transfer time scales in cyano alkanethiolates bonded to model gold clusters. The initial states represent two excited states in which a valence electron is promoted to one of the two virtual π* molecular orbitals localized on the cyanide fragment. The ratio of the two time scales indicates the efficiency of one charge transfer channel over the other. The ratios from our one- and many-electron simulations agree qualitatively with each other as well as with the previously reported experimental time scales (Blobner et al., 2012), measured for a macroscopic metal surface. We study the effect of cluster size and the description of electron correlation on the charge transfer process.

  16. A Theoretical Approach to Electronic Prescription System: Lesson Learned from Literature Review

    Science.gov (United States)

    Samadbeik, Mahnaz; Ahmadi, Maryam; Hosseini Asanjan, Seyed Masoud

    2013-01-01

    Context The tendency to use advanced technology in healthcare and governmental policies have put forward electronic prescription. Electronic prescription is considered the main solution for overcoming the major drawbacks of paper-based medication prescription, such as transcription errors. This study aims to provide practical information concerning electronic prescription systems to a variety of stakeholders. Evidence Acquisition In this review study, the PubMed, ISI Web of Science, Scopus and EMBASE databases, the Iranian National Library of Medicine (INLM) portal, Google Scholar, Google and Yahoo were searched for relevant English publications concerning the problems of paper-based prescription and the concept, features, levels, benefits, stakeholders and standards of electronic prescription systems. Results There are many problems with the paper prescription system which, according to studies, have jeopardized patients' safety and negatively affected the outcomes of medication therapy. All of these problems are remedied through the implementation of e-prescriptions. Conclusions The sophistication of electronic prescription and its integration with the EHR will become a reality if all its stakeholders collaborate in developing fast and secure electronic prescription systems. The required infrastructure should be provided for the implementation of national integrated electronic prescription systems in countries without such systems. Given the barriers to implementation and use, policymakers should consider multiple strategies and offer incentives to encourage e-prescription initiatives. This will result in widespread adoption of the system. PMID:24693376

  17. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study considers indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and in the presence of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and root mean square error (RMSE).
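
    The abstract defines the traditional regression correlation coefficient as the correlation between Y and the fitted E(Y|X); the authors' modified version is not spelled out in the abstract, so the sketch below computes only the traditional quantity for a Poisson GLM, on synthetic data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 2))
mu = np.exp(0.3 + 0.5 * X[:, 0] - 0.2 * X[:, 1])
y = rng.poisson(mu)

# Fit the Poisson GLM and recover the fitted values, i.e. E(Y|X).
model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
y_hat = model.fittedvalues

# Traditional regression correlation coefficient: corr(Y, E(Y|X)).
r = np.corrcoef(y, y_hat)[0, 1]
print(f"regression correlation coefficient: {r:.3f}")
```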

  18. A combined experimental and analytical approach for interface fracture parameters of dissimilar materials in electronic packages

    International Nuclear Information System (INIS)

    Kay, N.R.; Ghosh, S.; Guven, I.; Madenci, E.

    2006-01-01

    This study concerns the development of a combined experimental and analytical technique to determine the critical values of fracture parameters for interfaces between dissimilar materials in electronic packages. The technique utilizes specimens from post-production electronic packages. The mechanical testing is performed inside a scanning electron microscope while the measurements are achieved by means of digital image correlation. The measured displacements around the crack tip are used as the boundary conditions for the analytical model to compute the energy release rate. The critical energy release rate values obtained from post-production package specimens are found to be lower than those of laboratory specimens.

  19. Quantum algorithm for linear regression

    Science.gov (United States)

    Wang, Guoming

    2017-07-01

    We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Unlike previous algorithms, which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in classical form. So by running it once, one completely determines the fitted model and can then use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model and can handle data sets with nonsparse design matrices. It runs in time poly(log₂(N), d, κ, 1/ε), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ε is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary; thus, our algorithm cannot be significantly improved. Furthermore, we give a quantum algorithm that estimates the quality of the least-squares fit without computing its parameters explicitly. This algorithm runs faster than the one for finding the fit, and can be used to check whether the given data set qualifies for linear regression in the first place.
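
    The quantum algorithm itself cannot be reproduced classically; for orientation only, the snippet below computes the same least-squares fit with a classical solver and reports the condition number κ that enters the stated runtime. Data and dimensions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d = 1000, 5                      # data set size and adjustable parameters
A = rng.normal(size=(N, d))         # design matrix (nonsparse is fine here)
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=N)

# Classical least-squares baseline for the fit the record discusses.
x_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
kappa = np.linalg.cond(A)           # the κ appearing in the quantum runtime
print(f"condition number κ = {kappa:.2f}, "
      f"max coefficient error = {np.abs(x_fit - x_true).max():.4f}")
```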

  20. A two potential embedding approach to the electronic structure of disordered binary alloys

    International Nuclear Information System (INIS)

    Ahmed, M.; Mookerjee, A.

    1988-06-01

    Using an embedding technique introduced in a recent publication by one of us, we study the electronic structure of disordered binary alloys within a pair-cluster coherent potential approximation. (author). 4 refs, 3 figs

  1. Non-relativistic electron transport in metals: a Monte Carlo approach

    International Nuclear Information System (INIS)

    Rahimi, F.; Ghal-Eh, N.

    2001-01-01

    A simple Monte Carlo procedure is described for simulating the multiple scattering and absorption of electrons with incident energies in the range 1-50 keV moving through a slab of uniformly distributed material of given atomic number, density and thickness. The simulation is based on a screened Rutherford cross-section and the Bethe continuous energy-loss equation. A FORTRAN program is written to determine backscattering, transmission and absorption coefficients, providing the user with a graphical output of the electron trajectories. The results of several simulations using various numbers of electrons are presented, showing good agreement with experiment. The program is used to analyze the relation between the energy and the range of electrons in the slab; the backscattering, absorption and transmission coefficients; and the angular distribution.
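
    A compressed sketch in the same spirit (screened-Rutherford elastic scattering plus Bethe continuous slowing-down) is shown below. The parametrizations follow the widely used keV-range single-scattering recipe, and the copper-foil parameters are assumptions for the demo, so this is not the record's FORTRAN program.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative target: a thin copper foil (assumed material and geometry).
Z, A, rho = 29, 63.55, 8.96               # atomic number, g/mol, g/cm^3
thickness = 1.0e-5                         # slab thickness, cm (0.1 micron)
E0, E_cut = 20.0, 0.5                      # incident / cutoff energy, keV
J = (9.76 * Z + 58.5 * Z**-0.19) * 1e-3    # mean ionization potential, keV

def run(n_electrons=2000):
    back = trans = absorbed = 0
    for _ in range(n_electrons):
        E, z = E0, 0.0
        cx, cy, cz = 0.0, 0.0, 1.0         # direction cosines, beam along +z
        while True:
            # Screened-Rutherford elastic cross-section and mean free path.
            alpha = 3.4e-3 * Z**0.67 / E
            sigma = (5.21e-21 * Z**2 / E**2 * 4.0 * np.pi / (alpha * (1 + alpha))
                     * ((E + 511.0) / (E + 1024.0))**2)        # cm^2
            mfp = A / (6.022e23 * rho * sigma)                 # cm
            step = -mfp * np.log(rng.random())
            z += step * cz
            if z < 0.0:
                back += 1; break
            if z > thickness:
                trans += 1; break
            # Bethe continuous energy loss along the step (keV).
            E -= 78500.0 * rho * Z / (A * E) * np.log(1.166 * (E + 0.85 * J) / J) * step
            if E < E_cut:
                absorbed += 1; break
            # Polar scattering angle from the screened-Rutherford law.
            R = rng.random()
            cosT = 1.0 - 2.0 * alpha * R / (1.0 + alpha - R)
            sinT = np.sqrt(max(0.0, 1.0 - cosT**2))
            phi = 2.0 * np.pi * rng.random()
            if abs(cz) > 0.99999:          # rotate the direction cosines
                cx, cy, cz = sinT * np.cos(phi), sinT * np.sin(phi), cosT * np.sign(cz)
            else:
                s = np.sqrt(1.0 - cz * cz)
                cx, cy, cz = (cx * cosT + sinT * (cx * cz * np.cos(phi) - cy * np.sin(phi)) / s,
                              cy * cosT + sinT * (cy * cz * np.cos(phi) + cx * np.sin(phi)) / s,
                              cz * cosT - s * sinT * np.cos(phi))
    n = float(n_electrons)
    return back / n, trans / n, absorbed / n

print("backscatter / transmission / absorption:", run())
```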

  2. Laser wakefield electron acceleration. A novel approach employing supersonic microjets and few-cycle laser pulses

    International Nuclear Information System (INIS)

    Schmid, Karl

    2011-01-01

    This thesis covers the few-cycle laser-driven acceleration of electrons in a laser-generated plasma. This process, known as laser wakefield acceleration (LWFA), relies on strongly driven plasma waves for the generation of accelerating gradients in the vicinity of several hundred GV/m, a value four orders of magnitude larger than that attainable by conventional accelerators. This thesis demonstrates that laser pulses with an ultrashort duration of 8 fs and a peak power of 6 TW allow the production of electron energies up to 50 MeV via LWFA. The special properties of laser-accelerated electron pulses, namely the ultrashort pulse duration, the high brilliance and the high charge density, open up new possibilities in many applications of these electron beams. (orig.)

  3. Optimising waste from electric and electronic equipment collection systems: a comparison of approaches in European countries.

    Science.gov (United States)

    Friege, Henning; Oberdörfer, Michael; Günther, Marko

    2015-03-01

    The first European waste from electric and electronic equipment directive obliged the Member States to collect 4 kg of used devices per inhabitant and year. The target of the amended directive focuses on the ratio between the amount of waste from electric and electronic equipment collected and the mass of electric and electronic devices put on the market in the three foregoing years. The minimum collection target is 45% starting in 2016, increasing to 65% in 2019, or alternatively 85% of the waste from electric and electronic equipment generated. In view of the new target, the question arises of how Member States with 'best practice' organise their collection systems and how they engage the parties in this field. Therefore the waste from electric and electronic equipment schemes of Sweden, Denmark, Switzerland, Germany and the Flemish region of Belgium were investigated, focusing on the categories of IT and telecommunications equipment, consumer equipment such as audio systems, and discharge lamps containing hazardous substances, e.g. mercury. The systems for waste from electric and electronic equipment collection in these countries vary considerably. Recycling yards turned out to be the backbone of waste from electric and electronic equipment collection in most countries studied. For discharge lamps, take-back by retailers seems to be more important. Sampling points such as special containers in shopping centres, lidded waste bins and complementary return of used devices in all retail shops for electric equipment may serve as supplements. High transparency of collection and recycling efforts can encourage ambition among the concerned parties. Though the results from the study cannot be transferred in a simplistic manner, they serve as an indication of best practice methods for waste from electric and electronic equipment collection.

  4. Quantum inelastic electron-vibration scattering in molecular wires: Landauer-like versus Green's function approaches and temperature effects

    International Nuclear Information System (INIS)

    Ness, H

    2006-01-01

    In this paper, we consider the problem of inelastic electron transport in molecular systems in which both electronic and vibrational degrees of freedom are treated on the quantum level. The electronic transport properties of the corresponding molecular nanojunctions are obtained by means of a non-perturbative Landauer-like multi-channel inelastic scattering technique. The connections between this approach and other Green's function techniques that are useful in particular cases are studied in detail. The validity of the wide-band approximation, the effects of the lead self-energy and the dynamical polaron shift are also studied for a wide range of parameters. As a practical application of the method, we consider the effects of temperature on the conductance properties of molecular break junctions in relation to recent experiments.

  5. Inventory of electronic money as method of its control: process approach

    Directory of Open Access Journals (Sweden)

    A.Р. Semenets

    2016-09-01

    Full Text Available The extent of legal regulation of the inventory of electronic money in the company is considered. The absence of developed valuation techniques, as well as of rules for reflecting electronic money in the accounts, is shown to result in distorted financial statement indicators. The author develops organizational and methodical provisions for the inventory of electronic money, staged so as to ensure the avoidance of misstatements in the financial statements and to provide users with more reliable information about the amounts and balances of electronic money held by the company at the balance sheet date. The effects of accounting policies, provisions for the organization of accounting, and job descriptions on the control system for transactions with electronic money, including their inventory, are determined. The author identifies typical violations that occur when transactions with electronic money are reflected in accounting; their early detection will enable appropriate adjustments and help avoid misstatements of the information provided in the financial statements of the company.

  6. Lie algebraic approach to valence bond theory of π-electron systems: a preliminary study of excited states

    Science.gov (United States)

    Paldus, J.; Li, X.

    1992-10-01

    Following a brief outline of various developments and exploitations of the unitary group approach (UGA), and of its extension referred to as Clifford algebra UGA (CAUGA), in molecular electronic structure calculations, we present a summary of a recently introduced implementation of CAUGA for the valence bond (VB) method based on the Pariser-Parr-Pople (PPP)-type Hamiltonian. The existing applications of this PPP-VB approach have been limited to ground states of various π-electron systems or, at any rate, to the lowest states of a given multiplicity. In this paper the method is applied to the low-lying excited states of several archetypal models, namely cyclobutadiene and benzene, representing antiaromatic and aromatic systems; hexatriene, representing linear polyenic systems; and, finally, naphthalene, representing polyacenes.

  7. Dimer and cluster approach for the evaluation of electronic couplings governing charge transport: Application to two pentacene polymorphs

    International Nuclear Information System (INIS)

    Canola, Sofia; Pecoraro, Claudia; Negri, Fabrizia

    2016-01-01

    Hole transport properties are modeled for two polymorphs of pentacene: the single-crystal polymorph and the thin-film polymorph relevant for organic thin-film transistor applications. Electronic couplings are evaluated in the standard dimer approach but also in a cluster approach in which the central molecule is surrounded by a large number of quantum-chemically described molecules. The effective electronic couplings suitable for the parametrization of a tight-binding model are derived either from the orthogonalization scheme limited to HOMO orbitals or from the orthogonalization of the full basis of molecular orbitals. The angular-dependent mobilities estimated for the two polymorphs using the predicted pattern of couplings display different anisotropy characteristics, as suggested by experimental investigations.

  8. Dimer and cluster approach for the evaluation of electronic couplings governing charge transport: Application to two pentacene polymorphs

    Energy Technology Data Exchange (ETDEWEB)

    Canola, Sofia; Pecoraro, Claudia; Negri, Fabrizia

    2016-10-20

    Hole transport properties are modeled for two polymorphs of pentacene: the single-crystal polymorph and the thin-film polymorph relevant for organic thin-film transistor applications. Electronic couplings are evaluated in the standard dimer approach but also in a cluster approach in which the central molecule is surrounded by a large number of quantum-chemically described molecules. The effective electronic couplings suitable for the parametrization of a tight-binding model are derived either from the orthogonalization scheme limited to HOMO orbitals or from the orthogonalization of the full basis of molecular orbitals. The angular-dependent mobilities estimated for the two polymorphs using the predicted pattern of couplings display different anisotropy characteristics, as suggested by experimental investigations.

  9. High-resolution electron microscope image analysis approach for superconductor YBa2Cu3O7-x

    International Nuclear Information System (INIS)

    Xu, J.; Lu, F.; Jia, C.; Hua, Z.

    1991-01-01

    In this paper, an HREM (high-resolution electron microscope) image analysis approach has been developed. Image filtering, segmentation and particle extraction, based on gray-scale mathematical morphological operations, are performed on the original HREM image. The final image is a pseudocolor image with the background removed, relatively uniform brightness, filtered slanting elongation, a regular shape for every kind of particle, and particle boundaries that no longer touch each other, so that the superconducting material structure can be shown clearly.
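
    The pipeline described (morphological filtering, background removal, segmentation, particle extraction) can be sketched with standard tools; the specific operators and threshold below are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np
from scipy import ndimage

def extract_particles(image, size=3):
    """Minimal sketch of the described pipeline: morphological filtering,
    segmentation, and particle extraction on a grayscale image."""
    # 1. Grayscale morphological opening suppresses small bright noise.
    opened = ndimage.grey_opening(image, size=(size, size))
    # 2. Remove the slowly varying background (top-hat style).
    background = ndimage.grey_opening(opened, size=(size * 7, size * 7))
    flat = opened - background
    # 3. Threshold and label connected components ("particles").
    binary = flat > flat.mean() + flat.std()
    labels, n = ndimage.label(binary)
    return labels, n

# Synthetic test image with two bright blobs standing in for particles.
rng = np.random.default_rng(4)
img = rng.normal(100, 5, (128, 128))
img[30:34, 40:44] += 60
img[80:84, 90:94] += 60
labels, n = extract_particles(img)
print("particles found:", n)
```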

  10. The effectiveness of selected feed and water additives for reducing Salmonella spp. of public health importance in broiler chickens: a systematic review, meta-analysis, and meta-regression approach.

    Science.gov (United States)

    Totton, Sarah C; Farrar, Ashley M; Wilkins, Wendy; Bucher, Oliver; Waddell, Lisa A; Wilhelm, Barbara J; McEwen, Scott A; Rajić, Andrijana

    2012-10-01

    Eating inappropriately prepared poultry meat is a major cause of foodborne salmonellosis. Our objectives were to determine the efficacy of feed and water additives (other than competitive exclusion and antimicrobials) in reducing Salmonella prevalence or concentration in broiler chickens using systematic review and meta-analysis, and to explore sources of heterogeneity found in the meta-analysis through meta-regression. Six electronic databases were searched (Current Contents (1999-2009), Agricola (1924-2009), MEDLINE (1860-2009), Scopus (1960-2009), Centre for Agricultural Bioscience (CAB) (1913-2009), and CAB Global Health (1971-2009)), five topic experts were contacted, and the bibliographies of review articles and a topic-relevant textbook were manually searched to identify all relevant research. Study inclusion criteria comprised English-language primary research investigating the effects of feed and water additives on Salmonella prevalence or concentration in broiler chickens. Data extraction and study methodological assessment were conducted by two reviewers independently using pretested forms. Seventy challenge studies (n=910 unique treatment-control comparisons), seven controlled studies (n=154), and one quasi-experiment (n=1) met the inclusion criteria. Compared to an assumed control group prevalence of 44 of 1000 broilers, random-effects meta-analysis indicated that Salmonella cecal colonization in groups with prebiotics (fructooligosaccharide, lactose, whey, dried milk, lactulose, lactosucrose, sucrose, maltose, mannanoligosaccharide) added to feed or water was 15 out of 1000 broilers; with lactose added to feed or water it was 10 out of 1000 broilers; with experimental chlorate product (ECP) added to feed or water it was 21 out of 1000. For ECP the concentration of Salmonella in the ceca was decreased by 0.61 log(10) cfu/g in the treated group compared to the control group. Significant heterogeneity (Cochran's Q-statistic p≤0.10) was observed
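
    The random-effects pooling behind estimates like these is commonly the DerSimonian-Laird method; a generic sketch follows, using hypothetical effect sizes rather than the study's extracted trial data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird), the standard machinery
    behind random-effects meta-analyses such as the one summarized above."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                            # fixed-effect weights
    theta_fixed = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - theta_fixed) ** 2)   # Cochran's Q statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                  # between-study variance
    w_star = 1.0 / (variances + tau2)
    theta = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return theta, se, Q, tau2

# Hypothetical log odds ratios and variances from a handful of trials.
theta, se, Q, tau2 = dersimonian_laird([-1.2, -0.8, -1.5, -0.3],
                                       [0.10, 0.08, 0.20, 0.15])
print(f"pooled effect {theta:.2f} ± {1.96*se:.2f}, Q={Q:.2f}, tau^2={tau2:.3f}")
```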

  11. A correlative optical microscopy and scanning electron microscopy approach to locating nanoparticles in brain tumors.

    Science.gov (United States)

    Kempen, Paul J; Kircher, Moritz F; de la Zerda, Adam; Zavaleta, Cristina L; Jokerst, Jesse V; Mellinghoff, Ingo K; Gambhir, Sanjiv S; Sinclair, Robert

    2015-01-01

    The growing use of nanoparticles in biomedical applications, including cancer diagnosis and treatment, demands the capability to exactly locate them within complex biological systems. In this work a correlative optical and scanning electron microscopy technique was developed to locate and observe multi-modal gold core nanoparticle accumulation in brain tumor models. Entire brain sections from mice containing orthotopic brain tumors injected intravenously with nanoparticles were imaged using both optical microscopy to identify the brain tumor, and scanning electron microscopy to identify the individual nanoparticles. Gold-based nanoparticles were readily identified in the scanning electron microscope using backscattered electron imaging as bright spots against a darker background. This information was then correlated to determine the exact location of the nanoparticles within the brain tissue. The nanoparticles were located only in areas that contained tumor cells, and not in the surrounding healthy brain tissue. This correlative technique provides a powerful method to relate the macro- and micro-scale features visible in light microscopy with the nanoscale features resolvable in scanning electron microscopy.

  12. Surface electronic transport measurements: A micro multi-point probe approach

    DEFF Research Database (Denmark)

    Barreto, Lucas

    2014-01-01

    This work is mostly focused on the study of the electronic transport properties of two-dimensional materials, in particular graphene and topological insulators. To study these, we have improved a unique micro multi-point probe instrument used to perform transport measurements. Not only the experimental ... quantities are extracted, such as conductivity, carrier density and carrier mobility. • A method to electrically insulate epitaxial graphene grown on metals, based on a stepwise intercalation methodology, is developed, and transport measurements are performed in order to test the insulation. • We show ... a direct measurement of the surface electronic transport on a bulk topological insulator. The surface state conductivity and mobility are obtained. Apart from transport properties, we also investigate the atomic structure of the Bi2Se3(111) surface via surface x-ray diffraction and low-energy electron diffraction.

  13. Effects of surface functionalization on the electronic and structural properties of carbon nanotubes: A computational approach

    Science.gov (United States)

    Ribeiro, M. S.; Pascoini, A. L.; Knupp, W. G.; Camps, I.

    2017-12-01

    Carbon nanotubes (CNTs) have important electronic, mechanical and optical properties. These features may differ between a pristine nanotube and one with a functionalized surface. These changes can be exploited in areas of research and application such as the construction of nanodevices that act as sensors and filters. Following this idea, in the current work we present the results of a systematic study of CNT surfaces functionalized with hydroxyl and carboxyl groups. Using the entropy as selection criterion, we filtered a library of 10k stochastically generated complexes for each functional concentration (5, 10, 15, 20 and 25%). The structurally related parameters (root-mean-square deviation, entropy, and volume/area) have a monotonic relationship with functionalization concentration. In contrast, the electronic parameters (frontier molecular orbital energies, electronic gap, molecular hardness, and electrophilicity index) present an oscillatory behavior. For a set of concentrations, the nanotubes present spin-polarized properties that can be used in spintronics.

  14. Forbidden transitions in excitation by electron impact in Co3+: an R-matrix approach

    International Nuclear Information System (INIS)

    Stancalie, V

    2011-01-01

    Collision strengths for the electron-impact excitation of forbidden transitions between the 136 terms arising from the 3d⁶, 3d⁵4s and 3d⁵4p configurations of Co³⁺ have been calculated using the R-matrix method. The accuracy of a series of models for the target terms, which form the basis for the R-matrix collision calculations, was considered. The importance of including configuration interaction wave functions both in the target-state expansion and in the (N+1)-electron quadratically integrable function expansion is discussed. Collision strengths were calculated for incident electron energies up to 6 Ryd. These results are believed to be the first such values for this system and will be important for plasma modelling.

  15. Space-group approach to two-electron states in unconventional superconductors

    International Nuclear Information System (INIS)

    Yarzhemsky, V. G.

    2008-01-01

    The direct application of space-group representation theory makes it possible to obtain limitations on the symmetry of the superconducting order parameter (SOP) on lines and planes of symmetry in the one-electron Brillouin zone. In the case of highly symmetric UPt₃, only the theoretical nodal structure of the irreducible representation (IR) E₂ᵤ is in agreement with all the experimental results. On the other hand, in the case of high-Tc superconductors the two-electron description of Cooper pairs in D₂ₕ symmetry is not sufficient to describe the experimental nodal structure. It was shown that in this case the nodal structure is the result of underlying interactions between two-electron states and the hidden symmetry D₄ₕ. (author)

  16. Approaching an organic semimetal: Electron pockets at the Fermi level for a p-benzoquinonemonoimine zwitterion

    Energy Technology Data Exchange (ETDEWEB)

    Rosa, Luis G.; Velev, Julian [Department of Physics and Electronics, University of Puerto Rico, Humacao (United States); Institute for Functional Nanomaterials, University of Puerto Rico, San Juan (United States); Department of Physics and Astronomy, Nebraska Center for Materials and Nanoscience, University of Nebraska-Lincoln, NE (United States); Zhang, Zhengzheng [Department of Physics, University of Puerto Rico, Rio Piedras, San Juan (United States); Alvira, Jose; Vega, Omar; Diaz, Gerson [Department of Physics and Electronics, University of Puerto Rico, Humacao (United States); Routaboul, Lucie; Braunstein, Pierre [Laboratoire de Chimie de Coordination, Institut de Chimie (UMR 7177 CNRS), Universite de Strasbourg (France); Doudin, Bernard [Institut de Physique, Applique de Physique et Chimie des Materiaux de Strasbourg, Universite Louis Pasteur Strasbourg (France); Losovyj, Yaroslav B. [Institute for Functional Nanomaterials, University of Puerto Rico, San Juan (United States); J. Bennett Johnston Sr. Center for Advanced Microstructures and Devices, Louisiana State Univ., Baton Rouge, LA (United States); Dowben, Peter A. [Institute for Functional Nanomaterials, University of Puerto Rico, San Juan (United States)

    2012-08-15

    There is compelling evidence of electron pockets, at the Fermi level, in the band structure for an organic zwitterion molecule of the p-benzoquinonemonoimine type. The electronic structure of the zwitterion molecular film has a definite, although small, density of states evident at the Fermi level as well as a nonzero inner potential and thus is very different from a true insulator. In spite of a small Brillouin zone, significant band width is observed in the intermolecular band dispersion. The results demonstrate that Bloch's theorem applies to the wave vector dependence of the electronic band structure formed from the molecular orbitals of adjacent molecules in a molecular thin film of a p-benzoquinonemonoimine type zwitterion.

  17. X-ray Microscopy as an Approach to Increasing Accuracy and Efficiency of Serial Block-face Imaging for Correlated Light and Electron Microscopy of Biological Specimens

    OpenAIRE

    Bushong, Eric A.; Johnson, Donald D.; Kim, Keun-Young; Terada, Masako; Hatori, Megumi; Peltier, Steven T.; Panda, Satchidananda; Merkle, Arno; Ellisman, Mark H.

    2014-01-01

    The recently developed three-dimensional electron microscopic (EM) method of serial block-face scanning electron microscopy (SBEM) has rapidly established itself as a powerful imaging approach. Volume EM imaging with this scanning electron microscopy (SEM) method requires intense staining of biological specimens with heavy metals to allow sufficient back-scatter electron signal and also to render specimens sufficiently conductive to control charging artifacts. These more extreme heavy metal s...

  18. New approach for the electronic energies of the hydrogen molecular ion

    International Nuclear Information System (INIS)

    Scott, Tony C.; Aubert-Frecon, Monique; Grotendorst, Johannes

    2006-01-01

    Herein, we present analytical solutions for the electronic energy eigenvalues of the hydrogen molecular ion H₂⁺, namely the one-electron two-fixed-center problem. These are given for the homonuclear case for the countable infinity of discrete states when the magnetic quantum number m is zero, i.e., for ²Σ⁺ states. In this case, these solutions are the roots of a set of two coupled three-term recurrence relations. The eigensolutions are obtained from an application of experimental mathematics using Computer Algebra as its principal tool and are vindicated by numerical and algebraic demonstrations. Finally, the mathematical nature of the eigenenergies is identified.

  19. Total-dielectric-function approach to electron and phonon response in solids

    International Nuclear Information System (INIS)

    Penn, D.R.; Lewis, S.P.; Cohen, M.L.

    1995-01-01

    The interaction between two test charges, the response of a solid to an external field, and the normal modes of the solid can be determined from a total dielectric function that includes both electronic and lattice polarizabilities as well as local-field effects. In this paper we examine the relationship between superconductivity and the stability of a solid and derive sum rules for the electronic part of the dielectric function. It is also shown that there are negative eigenvalues of the total static dielectric function, implying the possibility of an attractive interaction between test charges. An attractive interaction is required for superconductivity

  20. Electron and Nucleon Localization Functions of Oganesson: Approaching the Thomas-Fermi Limit

    Science.gov (United States)

    Jerabek, Paul; Schuetrumpf, Bastian; Schwerdtfeger, Peter; Nazarewicz, Witold

    2018-02-01

    Fermion localization functions are used to discuss electronic and nucleonic shell structure effects in the superheavy element oganesson, the heaviest element discovered to date. Spin-orbit splitting in the 7p electronic shell becomes so large (~10 eV) that Og is expected to show uniform-gas-like behavior in the valence region with a rather large dipole polarizability compared to the lighter rare gas elements. The nucleon localization in Og is also predicted to undergo a transition to Thomas-Fermi gas behavior in the valence region. This effect, particularly strong for neutrons, is due to the high density of single-particle orbitals.

  1. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of polynomial functions and deduces their parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function is derived, exploiting the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram, in accordance with the authors' real work. (authors)
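
    A compact version of the procedure: least-squares estimation of the polynomial coefficients followed by the overall F-test of the regression function. The data below are synthetic stand-ins for the decay-heat measurements.

```python
import numpy as np
from scipy import stats

def poly_regression_ftest(x, y, degree):
    """Least-squares polynomial fit plus the overall F-test of the
    regression function, mirroring the procedure outlined above."""
    X = np.vander(x, degree + 1)              # columns x^d ... x^0
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    n, p = len(y), degree                     # p slope terms plus intercept
    ssr = np.sum((y_hat - y.mean()) ** 2)     # explained sum of squares
    sse = np.sum((y - y_hat) ** 2)            # residual sum of squares
    F = (ssr / p) / (sse / (n - p - 1))
    p_value = stats.f.sf(F, p, n - p - 1)
    return beta, F, p_value

# Synthetic decay-heat-like data (illustrative only, not the authors' data).
t = np.linspace(0, 10, 50)
rng = np.random.default_rng(5)
power = 5.0 - 0.8 * t + 0.05 * t**2 + rng.normal(0, 0.2, t.size)
beta, F, p = poly_regression_ftest(t, power, degree=2)
print(f"F = {F:.1f}, p-value = {p:.3g}")
```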

  2. Recursive Algorithm For Linear Regression

    Science.gov (United States)

    Varanasi, S. V.

    1988-01-01

    Order of model determined easily. The linear-regression algorithm includes recursive equations for the coefficients of a model of increased order. The algorithm eliminates duplicative calculations and facilitates the search for the minimum order of linear-regression model that fits a set of data satisfactorily.
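
    The record's order-recursive equations are not given in the abstract; as a hedged illustration, the sketch below shows the closely related sample-recursive least-squares update, which likewise avoids refitting from scratch as data arrive.

```python
import numpy as np

def recursive_least_squares(X, y, lam=1.0):
    """Standard sample-by-sample recursive least squares. The record
    describes recursion over *model order*; this sketch recurses over
    samples instead, as a generic stand-in for the idea."""
    n_features = X.shape[1]
    theta = np.zeros(n_features)
    P = np.eye(n_features) * 1e6          # large initial covariance
    for x_t, y_t in zip(X, y):
        k = P @ x_t / (lam + x_t @ P @ x_t)   # gain vector
        theta = theta + k * (y_t - x_t @ theta)
        P = (P - np.outer(k, x_t @ P)) / lam
    return theta

rng = np.random.default_rng(6)
X = np.column_stack([np.ones(300), rng.normal(size=(300, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + 0.1 * rng.normal(size=300)
print(recursive_least_squares(X, y))      # approx. [1.0, 2.0, -0.5]
```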

  3. Integration of electronic nose technology with spirometry: validation of a new approach for exhaled breath analysis

    NARCIS (Netherlands)

    de Vries, R.; Brinkman, P.; van der Schee, M. P.; Fens, N.; Dijkers, E.; Bootsma, S. K.; de Jongh, F. H. C.; Sterk, P. J.

    2015-01-01

    New 'omics'-technologies have the potential to better define airway disease in terms of pathophysiological and clinical phenotyping. The integration of electronic nose (eNose) technology with existing diagnostic tests, such as routine spirometry, can bring this technology to 'point-of-care'. We

  4. FELIX: a high-throughput network approach for interfacing to front end electronics for ATLAS upgrades

    NARCIS (Netherlands)

    Anderson, J.; Borga, A.; Boterenbrood, H.; Chen, H.; Chen, K.; Drake, G.; Francis, D.; Gorini, B.; Lanni, F.; Lehmann Miotto, G.; Levinson, L.; Narevicius, J.; Plessl, C.; Roich, A.; Ryu, S.; Schreuder, F.; Schumacher, J.; Vandelli, W.; Vermeulen, J.; Zhang, J.

    2015-01-01

    The ATLAS experiment at CERN is planning full deployment of a new unified optical link technology for connecting detector front end electronics on the timescale of the LHC Run 4 (2025). It is estimated that roughly 8000 GBT (GigaBit Transceiver) links, with transfer rates up to 10.24 Gbps, will

  5. A new approach to nuclear microscopy: The ion-electron emission microscope

    International Nuclear Information System (INIS)

    Doyle, B.L.; Vizkelethy, G.; Walsh, D.S.; Senftinger, B.; Mellon, M.

    1998-01-01

    A new multidimensional high-lateral-resolution ion beam analysis technique, Ion-Electron Emission Microscopy (IEEM), is described. Using MeV energy ions, IEEM is shown to be capable of Ion Beam Induced Charge Collection (IBICC) measurements in semiconductors. IEEM should also be capable of microscopically and multidimensionally mapping the surface and bulk composition of solids. As such, IEEM has nearly identical capabilities to traditional nuclear microprobe analysis, with the advantage that the ion beam does not have to be focused. The technique is based on determining the position where an individual ion enters the surface of the sample by projection secondary electron emission microscopy. The x-y origination point of a secondary electron, and hence the impact coordinates of the corresponding incident ion, is recorded with a position-sensitive detector connected to a standard photoemission electron microscope (PEEM). These signals are then used to establish coincidence with IBICC, atomic, or nuclear-reaction-induced ion beam analysis signals simultaneously caused by the incident ion.

  6. Direct Methanol Fuel Cell systems in portable electronics - a metrics-based conceptualization approach

    NARCIS (Netherlands)

    Flipsen, S.F.J.

    2010-01-01

    It is impossible to imagine life without portable electronics like the laptop computer and cell phone. All these products are powered by a battery, granting them grid independence and all-round portability. Connectivity to the internet and an increase in functionality demand a better battery.

  7. Single-electron transfer living radical copolymerization of SWCNT-g-PMMA via graft from approach

    Czech Academy of Sciences Publication Activity Database

    Jaisankar, S. N.; Haridharan, N.; Murali, A.; Ponyrko, Sergii; Špírková, Milena; Mandal, A. B.; Matějka, Libor

    2014-01-01

    Vol. 55, No. 13 (2014), pp. 2959-2966 ISSN 0032-3861 R&D Projects: GA ČR GAP108/12/1459 Institutional support: RVO:61389013 Keywords: single electron transfer * single-walled carbon nanotubes * controlled radical polymerization Subject RIV: CD - Macromolecular Chemistry Impact factor: 3.562, year: 2014

  8. Developing a Systematic Architecture Approach for Designing an Enhanced Electronic Medical Record (EEMR) System

    Science.gov (United States)

    Aldukheil, Maher A.

    2013-01-01

    The Healthcare industry is characterized by its complexity in delivering care to the patients. Accordingly, healthcare organizations adopt and implement Information Technology (IT) solutions to manage complexity, improve quality of care, and transform to a fully integrated and digitized environment. Electronic Medical Records (EMR), which is…

  9. Computer Series, 98. Electronics for Scientists: A Computer-Intensive Approach.

    Science.gov (United States)

    Scheeline, Alexander; Mork, Brian J.

    1988-01-01

    Reports the design for a principles-before-details presentation of electronics for an instrumental analysis class. Uses computers for data collection and simulations. Requires one semester with two 2.5-hour periods and two lectures per week. Includes lab and lecture syllabi. (MVL)

  10. Backscattered electron imaging at low emerging angles: A physical approach to contrast in LVSEM

    Energy Technology Data Exchange (ETDEWEB)

    Cazaux, J., E-mail: jacques.cazaux@univ-reims.fr [LISM, EA 4695 Faculty of Sciences, BP 1039, 51687 Reims Cedex 2 (France); Kuwano, N. [Malaysia–Japan International Institute of Technology, Universiti Teknologi Malaysia, Jalan Semarak, 54100 Kuala Lumpur (Malaysia); Sato, K. [Steel Research Laboratory, JFE Steel Corporation, 1 Kawasaki-cho, Chuo-ku, Chiba 260-0835 (Japan)

    2013-12-15

    Due to the influence of refraction effects on the escape probability of the back-scattered electrons (BSE), an expression for the fraction of these BSE is given as a function of the beam energy E° and the emission angle α (with respect to the normal). It has been shown that these effects are very sensitive to a local change of the work function, in particular for low emerging angles. This sensitivity suggests a new type of contrast in Low Voltage Scanning Electron Microscopy (LVSEM, for E° < 2 keV): the work function contrast. Involving the change of the work function ϕ with crystalline orientation, this possibility is supported by a new interpretation of a few published images. Some other correlated contrasts are also suggested: topographical contrasts, or contrasts due to subsurface particles and cracks. Practical considerations of the detection system and its optimization are indicated. - Highlights: • Refraction effects experienced by back-scattered electrons at sample/vacuum interfaces are evaluated as a function of energy and angles. • Sensitive to local work function changes with crystalline orientation, these effects concern mainly keV electrons at low emerging angles. • A new type of contrast in SEM is thus deduced and illustrated. • Some other correlated contrasts, topographical or due to subsurface particles and cracks, are also suggested.
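
    The generic planar-barrier refraction model conveys why low emerging angles are so sensitive: escape requires the normal energy component to exceed the surface barrier U (inner potential plus work function), and the transverse momentum is conserved. The record's own expression is not reproduced; this is the textbook model with assumed numbers.

```python
import numpy as np

def emerging_angle(E_eV, theta_in_deg, U_eV):
    """Planar surface-barrier refraction for an electron leaving a solid
    (generic textbook model, not the record's exact expression)."""
    theta_in = np.radians(theta_in_deg)
    if E_eV * np.cos(theta_in) ** 2 <= U_eV:
        return None                   # total internal reflection: no escape
    # Transverse momentum conservation: sqrt(E) sin(in) = sqrt(E-U) sin(out).
    sin_out = np.sin(theta_in) * np.sqrt(E_eV / (E_eV - U_eV))
    return np.degrees(np.arcsin(sin_out))

# Near grazing emergence, a 1 eV change in the barrier (work function)
# decides whether a 500 eV electron escapes at all:
for U in (4.0, 5.0):
    print(f"U = {U} eV ->", emerging_angle(500.0, 84.5, U))
```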

  11. Development of an Electronic Portfolio System Success Model: An Information Systems Approach

    Science.gov (United States)

    Balaban, Igor; Mu, Enrique; Divjak, Blazenka

    2013-01-01

    This research has two main goals: to develop an instrument for assessing Electronic Portfolio (ePortfolio) success and to build a corresponding ePortfolio success model using DeLone and McLean's information systems success model as the theoretical framework. For this purpose, we developed an ePortfolio success measurement instrument and structural…

  12. FELIX: A high-throughput network approach for interfacing to front end electronics for ATLAS upgrades

    CERN Document Server

    Anderson, John Thomas; The ATLAS collaboration; Boterenbrood, Hendrik; Chen, Hucheng; Chen, Kai; Drake, Gary; Francis, David; Gorini, Benedetto; Lanni, Francesco; Lehmann Miotto, Giovanna; Levinson, Lorne; Narevicius, Julia; Christian Plessl; Roich, Alexander; Schreuder, Frans Philip; Schumacher, Jorn; Vandelli, Wainer; Vermeulen, Jos; Zhang, Jinlong

    2015-01-01

    The ATLAS experiment at CERN is planning full deployment of a new unified link technology for connecting detector front end electronics on the timescale of the LHC Run 4 (2025). It is estimated that roughly 8000 GBT (GigaBit Transceiver) links, with transfer rates probably up to 9.6 Gbps, will replace existing links used for readout, detector control and distribution of timing and trigger information. In particular the links used for readout are often detector-specific. Already in Run 3 this technology will be deployed in conjunction with new muon detectors, additional muon first-level triggering electronics and new on-detector and off-detector liquid argon calorimeter electronics to be used for first level triggering. A total of roughly 2000 GBT links or GBT-like links (for connecting to off-detector trigger electronics) will be needed. A new class of devices will need to be developed to interface many GBT links to the rest of the trigger, data-acquisition and detector control systems. In this paper we prese...

  13. Trials and Tribulations: Student Approaches and Difficulties with Proposing Mechanisms Using the Electron-Pushing Formalism

    Science.gov (United States)

    Bhattacharyya, Gautam

    2014-01-01

    The skill of proposing mechanisms of reactions using the electron-pushing formalism (EPF) is not only of value to practicing organic chemists but it is also emphasized to students enrolled in organic chemistry courses at all levels. Several research studies in the past decade have documented the difficulties that undergraduate, and even graduate…

  14. On mixed electron-photon radiation therapy optimization using the column generation approach.

    Science.gov (United States)

    Renaud, Marc-André; Serban, Monica; Seuntjens, Jan

    2017-08-01

    Despite a considerable increase in the number of degrees of freedom handled by recent radiotherapy optimisation algorithms, treatments are still typically delivered using a single modality. Column generation is an iterative method for solving large optimisation problems. It is well suited to mixed-modality (e.g., photon-electron) optimisation, as the aperture shaping and modality selection problem can be solved rapidly and the performance of the algorithm scales favourably with increasing degrees of freedom. We demonstrate that the column generation method applied to mixed photon-electron planning can efficiently generate treatment plans, and we investigate its behaviour under different aperture addition schemes. Column generation was applied to the problem of mixed-modality treatment planning for a chest wall case and a leg sarcoma case. 6 MV beamlets (100 cm SAD) were generated for the photon components along with 5 energies for electron beamlets (6, 9, 12, 16 and 20 MeV), simulated as shortened-SAD (80 cm) beams collimated with a photon MLC. For the chest wall case, IMRT-only, modulated electron radiation therapy (MERT)-only, and mixed electron-photon (MBRT) treatment plans were created using the same planning criteria. For the sarcoma case, MBRT and MERT plans were created to study the behaviour of the algorithm under two different sets of planning criteria designed to favour specific modalities. Finally, the efficiency and plan quality of four different aperture addition schemes were analysed by creating chest wall MBRT treatment plans which incorporate more than a single aperture per iteration of the column generation loop, based on a heuristic aperture ranking scheme. MBRT plans produced superior target coverage and homogeneity relative to IMRT and MERT plans created using the same optimisation criteria, all the while preserving the normal tissue-sparing advantages of electron therapy. Adjusting the planning criteria to favour a specific modality in the sarcoma
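
    Column generation itself is generic: solve a restricted master problem, read off dual prices, and let a pricing subproblem propose the best new column until none has negative reduced cost. The sketch below runs the loop on a toy cutting-stock instance (a standard stand-in), not on the paper's aperture-shaping pricing problem.

```python
import numpy as np
from scipy.optimize import linprog   # needs scipy >= 1.7 for HiGHS duals

# Toy cutting-stock instance: the restricted master problem selects among
# known columns ("patterns"); pricing builds the best new column from duals.
L = 100                                   # stock length
sizes = np.array([45, 36, 31, 14])        # piece lengths
demand = np.array([97, 610, 395, 211])    # required pieces

def price(duals):
    """Unbounded integer knapsack: most profitable pattern under the duals."""
    best = np.zeros(L + 1)
    choice = [None] * (L + 1)
    for cap in range(1, L + 1):
        for i, w in enumerate(sizes):
            if w <= cap and best[cap - w] + duals[i] > best[cap]:
                best[cap] = best[cap - w] + duals[i]
                choice[cap] = i
    pattern, cap = np.zeros(len(sizes)), L
    while choice[cap] is not None:
        pattern[choice[cap]] += 1
        cap -= sizes[choice[cap]]
    return best[L], pattern

# Start with one trivial column per piece type.
columns = [np.eye(len(sizes))[i] * (L // s) for i, s in enumerate(sizes)]
while True:
    A = np.column_stack(columns)
    res = linprog(np.ones(A.shape[1]), A_ub=-A, b_ub=-demand,
                  bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals        # prices of the demand constraints
    value, pattern = price(duals)
    if value <= 1.0 + 1e-9:               # no column with negative reduced cost
        break
    columns.append(pattern)

print(f"LP bound: {res.fun:.2f} rolls using {len(columns)} generated columns")
```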

  15. Single electron ionization and electron capture cross sections for (C⁶⁺, H₂O) interaction within the Classical Trajectory Monte Carlo (CTMC) approach

    Energy Technology Data Exchange (ETDEWEB)

    Tran, H.N., E-mail: tranngochoang@tdt.edu.vn [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Dao, D.D. [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Incerti, S. [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Université de Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); Bernal, M.A. [Instituto de Física Gleb Wataghin, Universidade Estadual de Campinas, SP (Brazil); Karamitros, M. [CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Université de Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); Nhan Hao, T.V. [Center of Research and Development, Duy Tan University, K7/25 Quang Trung, Danang (Viet Nam); Center for Theoretical and Computational Physics, College of Education, Hue University, 34 Le Loi Street, Hue City (Viet Nam); Dang, T.M. [VNUHCM-University of Science (Viet Nam); Francis, Z. [Saint Joseph University, Beyrouth (Lebanon)

    2016-01-01

    In this work, we present a derivation of cross sections for single ionization and electron capture processes within the Classical Trajectory Monte Carlo (CTMC) approach. Specifically, we have used a potential stemming from an ab initio calculation in Green et al.'s framework to describe the dynamics of the water molecule system. Proposing a modified version of the Classical Over-Barrier (COB) potential, we have found that a cut-off of roughly 28 a.u. on the initial distance of the projectile produces reasonable accuracy. Global agreement has been obtained between our calculations and experimental as well as other theoretical results for C⁶⁺ ion energies ranging from 10 keV/u to 10 MeV/u.

  16. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach.

    Science.gov (United States)

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Implementing information technology in the best possible way can bring many advantages such as applying electronic services and facilitating tasks. Therefore, assessment of service providing systems is a way to improve the quality and elevate these systems, including e-commerce, e-government, e-banking, and e-learning. This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aim to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. A non-parametric test was used to assess the quality of electronic services. The assessment of propositions was based on the Aqual model and they were prioritized using the AHP approach. The AHP approach was used because it directly applies experts' deductions in the model and leads to more objective results in the analysis and in prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Non-parametric tests and the AHP approach using Expert Choice software were employed. The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and details of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected. The best solutions were selected by EC software. According to the results, the solution "controlling and improving the process in handling users' complaints" is of the utmost importance, and authorities have to have it on the website and place great importance on updating this process.

  17. Quality assessment of Isfahan Medical Faculty web site electronic services and prioritizing solutions using analytic hierarchy process approach

    Science.gov (United States)

    Hajrahimi, Nafiseh; Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nargess; Sarmadi, Sima

    2014-01-01

    Context: Implementing information technology in the best possible way can bring many advantages such as applying electronic services and facilitating tasks. Therefore, assessment of service providing systems is a way to improve the quality and elevate these systems, including e-commerce, e-government, e-banking, and e-learning. Aims: This study aimed to evaluate the electronic services on the website of Isfahan University of Medical Sciences in order to propose solutions to improve them. Furthermore, we aim to rank the solutions based on the factors that enhance the quality of electronic services by using the analytic hierarchy process (AHP) method. Materials and Methods: A non-parametric test was used to assess the quality of electronic services. The assessment of propositions was based on the Aqual model and they were prioritized using the AHP approach. The AHP approach was used because it directly applies experts' deductions in the model and leads to more objective results in the analysis and in prioritizing the risks. After evaluating the quality of the electronic services, a multi-criteria decision-making framework was used to prioritize the proposed solutions. Statistical Analysis Used: Non-parametric tests and the AHP approach using Expert Choice software. Results: The results showed that students were satisfied with most of the indicators. Only a few indicators received low satisfaction from students, including design attractiveness, the amount of explanation and details of information, honesty and responsiveness of authorities, and the role of e-services in the user's relationship with the university. After interviews with Information and Communications Technology (ICT) experts at the university, measurement criteria and solutions to improve the quality were collected. The best solutions were selected by EC software. According to the results, the solution "controlling and improving the process in handling users' complaints" is of the utmost importance and authorities

  18. Combining Alphas via Bounded Regression

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-11-01

    Full Text Available We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications there is typically insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or a skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on the alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
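
    The computational core, regression with explicit bounds on the weights, can be sketched with a bounded least-squares solver; the principal-component construction and the bound level below are illustrative assumptions, not the paper's exact recipe.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Minimal sketch of bounded regression over SCM principal components.
rng = np.random.default_rng(7)
n_periods, n_alphas = 60, 200                 # short history, many alphas
R = rng.normal(size=(n_periods, n_alphas))    # alpha stream returns

scm = np.cov(R, rowvar=False)                 # singular: rank <= n_periods
eigval, eigvec = np.linalg.eigh(scm)
F = eigvec[:, -20:]                           # 20 leading principal components

mu = R.mean(axis=0)                           # expected returns per alpha
# Regress within the principal-component subspace, but solve directly for
# per-alpha weights with explicit bounds to force diversification.
res = lsq_linear(F @ F.T, mu, bounds=(-0.02, 0.02))
w = res.x
print(f"bounded weights lie in [{w.min():.4f}, {w.max():.4f}]")
```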

  19. Regression in autistic spectrum disorders.

    Science.gov (United States)

    Stefanatos, Gerry A

    2008-12-01

    A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously acquired skills. This may involve a loss of speech or social responsivity, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.

  20. A practical approach to temperature effects in dissociative electron attachment cross sections using local complex potential theory

    International Nuclear Information System (INIS)

    Sugioka, Yuji; Takayanagi, Toshiyuki

    2012-01-01

    Highlights: ► Dissociative electron attachment cross sections for polyatomic molecules are calculated by a simple theoretical approach. ► Temperature effects can be reasonably reproduced with the present model. ► All the degrees of freedom are taken into account in the present dynamics approach. -- Abstract: We propose a practical computational scheme to obtain the temperature dependence of dissociative electron attachment cross sections to polyatomic molecules within a local complex potential theory formalism. First we perform quantum path-integral molecular dynamics simulations on the potential energy surface for the neutral molecule in order to sample initial nuclear configurations as well as momenta. Classical trajectories are subsequently integrated on the potential energy surface for the anionic state and survival probabilities are simultaneously calculated along the obtained trajectories. We have applied this simple scheme to dissociative electron attachment processes to H₂O and CF₃Cl, for which several previous studies are available from both the experimental and theoretical sides.

  1. A practical approach to temperature effects in dissociative electron attachment cross sections using local complex potential theory

    Energy Technology Data Exchange (ETDEWEB)

    Sugioka, Yuji [Department of Chemistry, Saitama University, 255 Shimo-Okubo, Sakura-ku, Saitama City, Saitama 338-8570 (Japan); Takayanagi, Toshiyuki, E-mail: tako@mail.saitama-u.ac.jp [Department of Chemistry, Saitama University, 255 Shimo-Okubo, Sakura-ku, Saitama City, Saitama 338-8570 (Japan)

    2012-09-11

    Highlights: ► Dissociative electron attachment cross sections for polyatomic molecules are calculated by a simple theoretical approach. ► Temperature effects can be reasonably reproduced with the present model. ► All the degrees of freedom are taken into account in the present dynamics approach. -- Abstract: We propose a practical computational scheme to obtain the temperature dependence of dissociative electron attachment cross sections to polyatomic molecules within a local complex potential theory formalism. First we perform quantum path-integral molecular dynamics simulations on the potential energy surface for the neutral molecule in order to sample initial nuclear configurations as well as momenta. Classical trajectories are subsequently integrated on the potential energy surface for the anionic state and survival probabilities are simultaneously calculated along the obtained trajectories. We have applied this simple scheme to dissociative electron attachment processes to H₂O and CF₃Cl, for which several previous studies are available from both the experimental and theoretical sides.

  2. A statistical approach to inelastic electron tunneling spectroscopy on fullerene-terminated molecules

    DEFF Research Database (Denmark)

    Fock, Jeppe; Sørensen, Jakob Kryger; Lörtscher, Emanuel

    2011-01-01

    We report on the vibrational fingerprint of single C(60)-terminated molecules in a mechanically controlled break junction (MCBJ) setup, using a novel statistical approach: manipulating the junction mechanically to address different molecular configurations and to monitor the corresponding vibration...

  3. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval
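
    A toy version of the split the abstract describes (stochastic search over model structure, linear regression for the constants) can be written with plain simulated annealing over subsets of basis functions; the scalarized complexity penalty below stands in loosely for the paper's Pareto treatment, and all data and constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

# Candidate basis functions; an "expression" is a subset of these, with
# constants estimated by linear regression at every annealing step.
basis = [lambda x: x, lambda x: x**2, np.sin, np.cos,
         np.exp, lambda x: np.log(np.abs(x) + 1.0)]

x = np.linspace(-2, 2, 100)
y = 1.5 * np.sin(x) + 0.5 * x**2 + rng.normal(0, 0.05, x.size)

def fitness(mask):
    cols = [f(x) for f, m in zip(basis, mask) if m]
    if not cols:
        return np.inf
    X = np.column_stack([np.ones_like(x)] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # constants by OLS
    err = np.mean((X @ beta - y) ** 2)
    return err + 0.01 * mask.sum()                 # crude complexity penalty

mask = rng.integers(0, 2, len(basis))
best = fitness(mask)
for step in range(2000):                     # simulated annealing loop
    T = 0.1 * (1 - step / 2000) + 1e-4       # linear cooling schedule
    cand = mask.copy()
    cand[rng.integers(len(basis))] ^= 1      # flip one basis function
    f_cand = fitness(cand)
    if f_cand < best or rng.random() < np.exp((best - f_cand) / T):
        mask, best = cand, f_cand

print("selected basis mask:", mask, "score:", round(best, 4))
```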

  4. Electron probe micro-analysis of gas bubbles in solids: a novel approach

    International Nuclear Information System (INIS)

    Verwerft, M.; Vos, B.

    1999-01-01

    The local analysis of retained noble gas in nuclear fuel is inherently difficult since the physical form in which it is stored varies from atomically dispersed to bubbles with a diameter of several hundred nanometers. One of the techniques that has been applied for more than twenty years is EPMA. Although many important results have been obtained with this technique, its application to the analysis of highly inhomogeneous materials is limited. The EPMA technique is indeed difficult to apply to samples that are not homogeneous on the scale of the electron-solid interaction volume. The paper discusses the development of a method to analyse a system of gas bubbles distributed in a solid matrix. This method is based on a multiple-voltage EPMA measurement combined with a scanning electron microscopic analysis of the bubble size distribution.

  5. Gradient ascent pulse engineering approach to CNOT gates in donor electron spin quantum computing

    International Nuclear Information System (INIS)

    Tsai, D.-B.; Goan, H.-S.

    2008-01-01

    In this paper, we demonstrate how gradient ascent pulse engineering (GRAPE) optimal control methods can be implemented on donor electron spin qubits in semiconductors with an architecture complementary to the original Kane proposal. We focus on the high-fidelity controlled-NOT (CNOT) gate and explicitly find the digitized control sequences for a controlled-NOT gate by optimizing its fidelity, using the effective, reduced donor electron spin Hamiltonian with external controls over the hyperfine A and exchange J interactions. We then simulate the CNOT-gate sequence with the full spin Hamiltonian and find that it has an error of 10⁻⁶, below the error threshold of 10⁻⁴ required for fault-tolerant quantum computation. The CNOT gate operation time of 100 ns is also about three times faster than the 297 ns of the proposed global control scheme.
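
    GRAPE discretizes the control fields into piecewise-constant slices and ascends the gradient of the gate fidelity with respect to the slice amplitudes. A minimal sketch for a toy single-qubit NOT gate (not the donor-spin CNOT of the paper; the Hamiltonian is invented and the gradient is taken by finite differences rather than the analytic GRAPE expression):

        import numpy as np
        from scipy.linalg import expm

        SX = np.array([[0, 1], [1, 0]], dtype=complex)
        SZ = np.array([[1, 0], [0, -1]], dtype=complex)
        TARGET = SX            # NOT gate
        N_SLICES, DT = 20, 0.1

        def propagator(u):
            """Total evolution for piecewise-constant control amplitudes u[k]."""
            U = np.eye(2, dtype=complex)
            for uk in u:
                H = 1.0 * SZ + uk * SX     # drift term plus control term
                U = expm(-1j * H * DT) @ U
            return U

        def fidelity(u):
            """Phase-insensitive gate fidelity |Tr(V† U)|² / d²."""
            U = propagator(u)
            d = U.shape[0]
            return abs(np.trace(TARGET.conj().T @ U))**2 / d**2

        def grape(n_iter=500, lr=1.0, eps=1e-6, seed=0):
            rng = np.random.default_rng(seed)
            u = 0.1 * rng.standard_normal(N_SLICES)
            for _ in range(n_iter):
                grad = np.zeros(N_SLICES)
                for k in range(N_SLICES):    # finite-difference gradient
                    du = np.zeros(N_SLICES); du[k] = eps
                    grad[k] = (fidelity(u + du) - fidelity(u - du)) / (2 * eps)
                u += lr * grad               # ascend the fidelity landscape
            return u, fidelity(u)

        u_opt, f = grape()
        print(f"final fidelity: {f:.6f}")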

  6. Linear regression in astronomy. I

    Science.gov (United States)

    Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh

    1990-01-01

    Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
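
    All five estimators are closed-form functions of the centered sums Sxx, Syy and Sxy. A minimal sketch following the slope formulas of Isobe et al. (the uncertainty estimates given in the paper are omitted):

        import numpy as np

        def five_slopes(x, y):
            """OLS(Y|X), OLS(X|Y), OLS bisector, orthogonal, reduced major-axis."""
            xm, ym = x.mean(), y.mean()
            sxx = np.sum((x - xm)**2)
            syy = np.sum((y - ym)**2)
            sxy = np.sum((x - xm) * (y - ym))
            b1 = sxy / sxx                   # OLS regression of Y on X
            b2 = syy / sxy                   # OLS regression of X on Y
            bis = (b1 * b2 - 1.0 + np.sqrt((1 + b1**2) * (1 + b2**2))) / (b1 + b2)
            beta = b2 - 1.0 / b1
            orth = 0.5 * (beta + np.sign(sxy) * np.sqrt(4.0 + beta**2))
            rma = np.sign(sxy) * np.sqrt(b1 * b2)
            slopes = {"OLS(Y|X)": b1, "OLS(X|Y)": b2, "bisector": bis,
                      "orthogonal": orth, "RMA": rma}
            return {k: (b, ym - b * xm) for k, b in slopes.items()}  # (slope, intercept)

        rng = np.random.default_rng(0)
        x = rng.normal(size=100)
        y = 2.0 * x + rng.normal(scale=1.0, size=100)
        for name, (slope, icpt) in five_slopes(x, y).items():
            print(f"{name:>10s}: slope={slope:.3f}, intercept={icpt:.3f}")

    On symmetric problems the bisector slope lies between the two OLS slopes, which is the behavior the paper recommends exploiting.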

  7. Liquidity dynamics in an electronic open limit order book: An event study approach

    OpenAIRE

    Gomber, Peter; Schweickert, Uwe; Theissen, Erik

    2011-01-01

    We analyze the dynamics of liquidity in Xetra, an electronic open limit order book. We use the Exchange Liquidity Measure (XLM), a measure of the cost of a roundtrip trade of given size V. This measure captures the price and the quantity dimension of liquidity. We present descriptive statistics, analyze the cross-sectional determinants of the XLM measure and document its intraday pattern. Our main contribution is an analysis of the dynamics of the XLM measure around liquidity shocks. We use i...
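
    The exact XLM formula is proprietary to Deutsche Börse, so the sketch below implements a generic roundtrip-cost measure in the same spirit: the cost of buying and immediately selling a volume V against the standing book, expressed relative to the midpoint. The book levels are hypothetical:

        def side_cost(levels, volume):
            """Average execution price for `volume` walked through (price, qty) levels."""
            remaining, cost = volume, 0.0
            for price, qty in levels:
                take = min(remaining, qty)
                cost += take * price
                remaining -= take
                if remaining <= 0:
                    return cost / volume
            raise ValueError("book too thin for requested volume")

        def roundtrip_cost_bps(asks, bids, volume):
            """Cost of a buy-then-sell roundtrip of `volume`, in basis points of midpoint."""
            mid = (asks[0][0] + bids[0][0]) / 2.0
            buy = side_cost(asks, volume)    # pay the ask side
            sell = side_cost(bids, volume)   # hit the bid side
            return (buy - sell) / mid * 1e4

        asks = [(100.02, 300), (100.04, 500), (100.07, 1000)]  # hypothetical book
        bids = [(99.98, 200), (99.95, 600), (99.90, 1000)]
        for v in (100, 500, 1500):
            print(v, round(roundtrip_cost_bps(asks, bids, v), 2), "bps")

    Larger volumes walk deeper into the book, so the measure rises with V, which is exactly the price-plus-quantity dimension of liquidity the abstract describes.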

  8. A Qualitative Approach to Understanding Real-World Electronic Cigarette Use: Implications for Measurement and Regulation

    OpenAIRE

    Maria Cooper, PhD; Melissa B. Harrell, PhD; Cheryl L. Perry, PhD

    2016-01-01

    Introduction: An understanding of the real-world use of electronic cigarettes (e-cigarettes) is needed to inform surveillance efforts and future state and federal regulation. This study investigates the behavioral aspects of e-cigarette use. Methods: We used qualitative methods to examine salient characteristics of e-cigarette use. The lead investigator (M.C.) conducted in-depth, semistructured individual interviews to explore patterns and behaviors associated with e-cigarette use a...

  9. An intramolecular inverse electron demand Diels–Alder approach to annulated α-carbolines

    Directory of Open Access Journals (Sweden)

    Zhiyuan Ma

    2012-06-01

    Intramolecular inverse electron demand cycloadditions of isatin-derived 1,2,4-triazines with acetylenic dienophiles tethered by amidations or transesterifications proceed in excellent yields to produce lactam- or lactone-fused α-carbolines. Beginning with various isatins and alkynyl dienophiles, a pilot-scale library of eighty-eight α-carbolines was prepared by using this robust methodology for biological evaluation.

  10. Finite Algorithms for Robust Linear Regression

    DEFF Research Database (Denmark)

    Madsen, Kaj; Nielsen, Hans Bruun

    1990-01-01

    The Huber M-estimator for robust linear regression is analyzed. Newton type methods for solution of the problem are defined and analyzed, and finite convergence is proved. Numerical experiments with a large number of test problems demonstrate efficiency and indicate that this kind of approach may...
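
    The Newton-type algorithms of the paper are not reproduced in the record, but the estimator itself is easy to state: residuals below a threshold contribute quadratically to the objective, larger ones linearly. A minimal sketch solving the problem by iteratively reweighted least squares, a common alternative to the Newton methods analyzed here:

        import numpy as np

        def huber_regression(X, y, k=1.345, n_iter=50, tol=1e-8):
            """Huber M-estimate of beta in y ~ [1, X] @ beta, solved by IRLS."""
            A = np.column_stack([np.ones(len(y)), X])    # prepend an intercept column
            beta = np.linalg.lstsq(A, y, rcond=None)[0]  # ordinary LS starting point
            for _ in range(n_iter):
                r = y - A @ beta
                # Robust scale estimate (median absolute deviation)
                scale = max(np.median(np.abs(r - np.median(r))) / 0.6745, 1e-12)
                absr = np.maximum(np.abs(r), 1e-12)
                # Huber weights: quadratic loss inside k*scale, linear outside
                w = np.where(absr <= k * scale, 1.0, k * scale / absr)
                WA = A * w[:, None]
                beta_new = np.linalg.solve(A.T @ WA, WA.T @ y)
                if np.max(np.abs(beta_new - beta)) < tol:
                    return beta_new
                beta = beta_new
            return beta

        rng = np.random.default_rng(0)
        x = np.linspace(0, 10, 60)
        y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=60)
        y[::10] += 15.0                          # contaminate every tenth point
        print(huber_regression(x[:, None], y))   # near [1.0, 2.0] despite outliers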

  11. An improved approach to identify irradiated spices using electronic nose, FTIR, and EPR spectroscopy.

    Science.gov (United States)

    Sanyal, Bhaskar; Ahn, Jae-Jun; Maeng, Jeong-Hwan; Kyung, Hyun-Kyu; Lim, Ha-Kyeong; Sharma, Arun; Kwon, Joong-Ho

    2014-09-01

    Changes in cumin and chili powder from India resulting from electron-beam irradiation were investigated using three analytical methods: electronic nose (E-nose), Fourier transform infrared (FTIR) spectroscopy, and electron paramagnetic resonance (EPR) spectroscopy. The spices had been exposed to the 6 to 14 kGy doses recommended for microbial decontamination. The E-nose measured a clear difference in the flavor patterns of the irradiated spices in comparison with the nonirradiated samples. Principal component analysis further showed a dose-dependent variation. FTIR spectra of the samples showed strong absorption bands at 3425, 3007 to 2854, and 1746 cm⁻¹. However, both nonirradiated and irradiated spice samples had comparable patterns without any noteworthy changes in functional groups. EPR spectroscopy of the irradiated samples showed a radiation-specific triplet signal at g = 2.006 with a hyperfine coupling constant of 3 mT, confirming the results obtained with the E-nose technique. Thus, the E-nose was found to be a potential tool for identifying irradiated spices. © 2014 Institute of Food Technologists®
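
    The dose-dependent PCA pattern reported for the E-nose can be reproduced in outline: each sample is a vector of sensor responses, and the leading principal components are inspected for clustering by dose. A minimal sketch on synthetic sensor data (all values hypothetical):

        import numpy as np

        def pca_scores(X, n_components=2):
            """Project rows of X onto the leading principal components via SVD."""
            Xc = X - X.mean(axis=0)              # center each sensor channel
            U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
            return Xc @ Vt[:n_components].T      # sample scores in the PC basis

        rng = np.random.default_rng(0)
        doses = np.repeat([0.0, 6.0, 10.0, 14.0], 5)   # kGy, 5 replicates each
        n_sensors = 12
        # Hypothetical E-nose responses: a dose-dependent drift plus noise
        response = (np.outer(doses, rng.uniform(0.5, 1.5, n_sensors))
                    + rng.normal(scale=0.8, size=(doses.size, n_sensors)))
        scores = pca_scores(response)
        for d, (pc1, pc2) in zip(doses, scores):
            print(f"dose {d:4.1f} kGy -> PC1 {pc1:7.2f}, PC2 {pc2:6.2f}")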

  12. Low-energy-electron interactions with DNA: approaching cellular conditions with atmospheric experiments

    International Nuclear Information System (INIS)

    Alizadeh, E.; Sanche, L.

    2014-01-01

    A novel technique has been developed to investigate low-energy electron (LEE)-DNA interactions in the presence of small biomolecules (e.g., N2, O2, H2O) found near DNA in the cell nucleus, in order to simulate cellular conditions. In this technique, LEEs are emitted from a metallic surface exposed to soft X-rays and interact with DNA thin films at standard ambient temperature and pressure (SATP). Whereas atmospheric N2 had little effect on the yields of LEE-induced single and double strand breaks, both O2 and H2O considerably modified and increased such damage. The highest yields were obtained when DNA was embedded in a combined O2 and H2O atmosphere. In this case, the amount of additional double strand breaks was super-additive. The effect of modifying the chemical and physical stability of DNA by platinum-based chemotherapeutic agents (Pt-drugs), including cisplatin, carboplatin and oxaliplatin, was also investigated with this technique. The results obtained provide information on the role played by subexcitation-energy electrons and dissociative electron attachment in the radiosensitization of DNA by Pt-drugs, which is an important step to unravel the mechanisms of radiosensitization of these agents in chemo-radiation cancer therapy. (authors)

  13. Path integral approach for electron transport in disturbed magnetic field lines

    Energy Technology Data Exchange (ETDEWEB)

    Kanno, Ryutaro; Nakajima, Noriyoshi; Takamaru, Hisanori

    2002-05-01

    A path integral method is developed to investigate the statistical properties of electron transport, described as a Langevin equation, in a statically disturbed magnetic field line structure; in particular, the transition probability of electrons strongly tied to field lines is considered. The path integral method has the advantages that 1) it does not intrinsically accumulate the growing numerical error of an orbit that is caused by evolving the Langevin equation with finite calculation accuracy in a chaotic field line structure, and 2) it gives a way of understanding the qualitative content of the Langevin equation and helps one anticipate the statistical properties of the transport. Monte Carlo calculations of the electron distributions under the combined effects of chaotic field lines and collisions are demonstrated to illustrate these advantages through some examples. The mathematical techniques are useful for studying the statistical properties of various phenomena described by Langevin equations in general. By using parallel random-number generators, the Monte Carlo scheme for calculating a transition probability is well suited to parallel computation. (author)
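
    The transition probability that the path integral method targets can also be estimated by direct Monte Carlo integration of a Langevin equation, which is the comparison drawn in the paper. A minimal sketch for a generic overdamped model (the drift and diffusion coefficients are placeholders, not the field-line geometry of the paper):

        import numpy as np

        def transition_probability(x0, target, drift, diff, dt=1e-3,
                                   n_steps=2000, n_samples=20000, seed=0):
            """Estimate P(x(T) > target | x(0) = x0) by Euler-Maruyama Monte Carlo."""
            rng = np.random.default_rng(seed)
            x = np.full(n_samples, x0)
            sqdt = np.sqrt(dt)
            for _ in range(n_steps):
                # dx = drift(x) dt + sqrt(2 D(x)) dW  (Euler-Maruyama step)
                x += (drift(x) * dt
                      + np.sqrt(2.0 * diff(x)) * sqdt * rng.standard_normal(n_samples))
            return np.mean(x > target)

        # Placeholder coefficients standing in for chaotic field-line transport
        drift = lambda x: -0.5 * x        # weak restoring drift
        diff = lambda x: 0.1 + 0.0 * x    # constant diffusion coefficient
        print(transition_probability(x0=0.0, target=0.5, drift=drift, diff=diff))

    Each sample evolves independently, so the loop parallelizes trivially across workers given independent random-number streams, as the record notes.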

  14. Low-energy-electron interactions with DNA: approaching cellular conditions with atmospheric experiments

    Science.gov (United States)

    Alizadeh, Elahe; Sanche, Léon

    2014-04-01

    A novel technique has been developed to investigate low energy electron (LEE)-DNA interactions in the presence of small biomolecules (e.g., N2, O2, H2O) found near DNA in the cell nucleus, in order to simulate cellular conditions. In this technique, LEEs are emitted from a metallic surface exposed to soft X-rays and interact with DNA thin films at standard ambient temperature and pressure (SATP). Whereas atmospheric N2 had little effect on the yields of LEE-induced single and double strand breaks, both O2 and H2O considerably modified and increased such damage. The highest yields were obtained when DNA was embedded in a combined O2 and H2O atmosphere. In this case, the amount of additional double strand breaks was super-additive. The effect of modifying the chemical and physical stability of DNA by platinum-based chemotherapeutic agents (Pt-drugs), including cisplatin, carboplatin and oxaliplatin, was also investigated with this technique. The results obtained provide information on the role played by subexcitation-energy electrons and dissociative electron attachment in the radiosensitization of DNA by Pt-drugs, which is an important step to unravel the mechanisms of radiosensitization of these agents in chemoradiation cancer therapy.

  15. Effect of random vacancies on the electronic properties of graphene and T graphene: a theoretical approach

    Science.gov (United States)

    Sadhukhan, B.; Nayak, A.; Mookerjee, A.

    2017-12-01

    In this communication we present together four distinct techniques for the study of the electronic structure of solids: tight-binding linear muffin-tin orbitals, the real-space and augmented-space recursions, and the modified exchange-correlation. Using these we investigate the effect of random vacancies on the electronic properties of the hexagonal carbon allotrope, graphene, and the non-hexagonal allotrope, planar T graphene. We have inserted random vacancies at different concentrations to simulate disorder in pristine graphene and planar T graphene sheets. The resulting disorder, both on-site (diagonal disorder) and in the hopping integrals (off-diagonal disorder), introduces sharp peaks in the vicinity of the Dirac point built up from localized states for both the hexagonal and non-hexagonal structures. These peaks become resonances with increasing vacancy concentration. We find that in the presence of vacancies, graphene-like linear dispersion appears in planar T graphene, and the cross points form a loop in the first Brillouin zone, similar to buckled T graphene, that originates from the π and π* bands without regular hexagonal symmetry. We also calculate the single-particle relaxation time τ(q) of the q-labeled quantum electronic states, which originates from scattering due to the presence of vacancies, causing quantum-level broadening.
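
    The qualitative effect, vacancy-induced states near the Dirac point, already shows up in a nearest-neighbour tight-binding model of a finite graphene flake with randomly deleted sites. A minimal sketch using exact diagonalization with Gaussian broadening (a toy stand-in for the TB-LMTO and recursion machinery of the paper):

        import numpy as np

        def graphene_flake(n=8, bond=1.0):
            """Site positions of an n x n honeycomb patch (two-atom basis)."""
            a1 = bond * np.array([1.5, np.sqrt(3) / 2])
            a2 = bond * np.array([1.5, -np.sqrt(3) / 2])
            basis = [np.zeros(2), bond * np.array([1.0, 0.0])]
            return np.array([i * a1 + j * a2 + b
                             for i in range(n) for j in range(n) for b in basis])

        def dos_with_vacancies(vac_frac=0.05, t=2.7, n=8, seed=0,
                               energies=np.linspace(-3, 3, 301), eta=0.08):
            rng = np.random.default_rng(seed)
            pos = graphene_flake(n)
            keep = rng.random(len(pos)) >= vac_frac        # delete sites at random
            pos = pos[keep]
            d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
            H = np.where((d > 0.1) & (d < 1.1), -t, 0.0)   # nearest-neighbour hopping
            eig = np.linalg.eigvalsh(H)
            # Gaussian-broadened density of states
            dos = np.exp(-((energies[:, None] - eig[None, :])**2)
                         / (2 * eta**2)).sum(axis=1)
            return energies, dos / (len(pos) * eta * np.sqrt(2 * np.pi))

        e, dos = dos_with_vacancies(vac_frac=0.05)
        print("DOS near E=0:", dos[np.abs(e) < 0.1])

    Raising vac_frac sharpens and strengthens the zero-energy feature, mirroring the growth of the resonances reported in the abstract.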

  16. Learning centred approach for developing the electronic information search processes of students.

    Science.gov (United States)

    Shanahan, Madeleine

    2009-11-01

    Undergraduate students of the twenty-first century are widely regarded as 'technologically savvy' and have embraced the electronic information world. The literature, however, describes undergraduate students as using a limited range of electronic information sources and not critically evaluating the information they retrieve from internet searches. The aim was to evaluate a purposefully designed intervention that sought to expand the information search and evaluation practices of undergraduate students. The intervention scaffolded an independent learning activity in the form of a group-based project. Survey methodology was used to collect data pre- and post-intervention for two cohorts of students who undertook the intervention in 2005 and 2006, involving a total of 71 students. Percentages were used to describe survey findings, and chi-square analysis and Fisher's exact test examined differences between groups. Questionnaires were completed by 59 students (response rate 83%) pre-intervention and 49 students (response rate 69%) post-intervention. Post-intervention there were positive and statistically significant differences in students' database searching behaviour (p = 0.000), use of Google Scholar (p = 0.035) and number of criteria used to evaluate information retrieved from the internet (p = 0.000). By positively reshaping the electronic information search and evaluation practices of students we are helping students to find informed information sources as they engage in independent learning activities at university and as future health professionals.

  17. Strongly correlated electrons at high pressure: an approach by inelastic X-Ray scattering

    International Nuclear Information System (INIS)

    Rueff, J.P.

    2007-06-01

    Inelastic X-ray scattering (IXS) and associated methods have turned out to be a powerful alternative for high-pressure physics. IXS is an all-photon technique, fully compatible with high-pressure environments and applicable to a vast range of materials. Standard X-ray focusing, in the range of 100 microns, matches the typical sample size in a pressure cell. Our main aim is to provide an overview of experimental results obtained by IXS under high pressure in two classes of materials that have been at the origin of the renewal of condensed matter physics: strongly correlated transition metal oxides and rare-earth compounds. Under pressure, d- and f-electron materials show behaviors far more complex than what would be expected from a simplistic band picture of electron delocalization. These spectroscopic studies have revealed unusual phenomena in the electronic degrees of freedom, brought about by the increased density, the changes in charge-carrier concentration, the overlapping between orbitals, and hybridization under high-pressure conditions. In particular, we discuss pressure-induced magnetic collapse and metal-insulator transitions in 3d compounds and valence fluctuation phenomena in 4f and 5f compounds. Thanks to its superior penetration depth, chemical selectivity and resonant enhancement, resonant inelastic X-ray scattering has proved extremely well suited to high-pressure physics in strongly correlated materials. (A.C.)

  18. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
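
    The method of least squares described here reduces to two closed-form expressions: the slope is the ratio of the centered cross-sum to the centered sum of squares in x, and the intercept follows from the means. A minimal sketch with a small invented dataset:

        def simple_linear_regression(x, y):
            """Least-squares fit y = a + b*x; returns (intercept, slope, r_squared)."""
            n = len(x)
            xm, ym = sum(x) / n, sum(y) / n
            sxx = sum((xi - xm)**2 for xi in x)
            sxy = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
            syy = sum((yi - ym)**2 for yi in y)
            b = sxy / sxx                 # slope
            a = ym - b * xm               # intercept
            r2 = sxy**2 / (sxx * syy)     # coefficient of determination
            return a, b, r2

        # Hypothetical clinical-style example: dose (mg) versus response
        dose = [1, 2, 3, 4, 5, 6]
        response = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]
        a, b, r2 = simple_linear_regression(dose, response)
        print(f"response ~ {a:.2f} + {b:.2f} * dose   (R^2 = {r2:.3f})")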

  19. Electron acceleration by an obliquely propagating electromagnetic wave in the regime of validity of the Fokker-Planck-Kolmogorov approach

    Science.gov (United States)

    Hizanidis, Kyriakos; Vlahos, L.; Polymilis, C.

    1989-01-01

    The relativistic motion of an ensemble of electrons in an intense monochromatic electromagnetic wave propagating obliquely in a uniform external magnetic field is studied. The problem is formulated from the viewpoint of Hamiltonian theory and the Fokker-Planck-Kolmogorov approach analyzed by Hizanidis (1989), leading to a one-dimensional diffusive acceleration along paths of constant zeroth-order generalized Hamiltonian. For values of the wave amplitude and the propagation angle inside the analytically predicted stochastic region, the numerical results suggest that the diffusion process proceeds in stages. In the first stage, the electrons are accelerated to relatively high energies by sampling the first few overlapping resonances one by one. During that stage, the ensemble-averaged square deviations of the variables involved scale quadratically with time. During the second stage, they scale linearly with time. For much longer times, deviation from linear scaling slowly sets in.
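
    The staged scaling can be tested with a standard diagnostic: fit the log-log slope of the ensemble-averaged square deviation, which is 2 for the quadratic (ballistic) stage and 1 for the linear (diffusive) stage. A minimal sketch in which a generic correlated-velocity toy model stands in for the Hamiltonian dynamics of the paper:

        import numpy as np

        def fit_exponent(t_seg, msd_seg):
            """Slope of log(MSD) vs log(t): 2 for ballistic, 1 for diffusive growth."""
            alpha, _ = np.polyfit(np.log(t_seg), np.log(msd_seg), 1)
            return alpha

        rng = np.random.default_rng(0)
        n_traj, n_steps, dt = 4000, 400, 0.05
        tau_c, sigma = 1.0, 1.0     # velocity correlation time and kick strength
        t = dt * np.arange(n_steps)
        v = np.empty((n_traj, n_steps))
        v[:, 0] = rng.standard_normal(n_traj)
        for k in range(1, n_steps):  # Ornstein-Uhlenbeck velocity process
            v[:, k] = (v[:, k - 1] * (1 - dt / tau_c)
                       + sigma * np.sqrt(dt) * rng.standard_normal(n_traj))
        x = np.cumsum(v * dt, axis=1)
        msd = np.mean((x - x[:, :1])**2, axis=0)   # ensemble-averaged square deviation
        early, late = slice(1, 10), slice(200, 400)
        print("early exponent:", fit_exponent(t[early], msd[early]))  # ~2, ballistic
        print("late exponent:", fit_exponent(t[late], msd[late]))    # ~1, diffusive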

  20. Electronic Excitations in Solution: The Interplay between State Specific Approaches and a Time-Dependent Density Functional Theory Description.

    Science.gov (United States)

    Guido, Ciro A; Jacquemin, Denis; Adamo, Carlo; Mennucci, Benedetta

    2015-12-08

    We critically analyze the performance of continuum solvation models when coupled to time-dependent density functional theory (TD-DFT) to predict solvent effects on both the absorption and emission energies of chromophores in solution. Different polarization schemes of the polarizable continuum model (PCM), such as linear response (LR) and three different state-specific (SS) approaches, are considered and compared. We show the necessity of introducing an SS model in cases where large electron-density rearrangements are involved in the excitations, such as charge-transfer transitions in both twisted and quadrupolar compounds, and underline the very delicate interplay between the selected polarization method and the chosen exchange-correlation functional. This interplay originates in the different descriptions of the transition and ground/excited-state multipolar moments by the different functionals. As a result, the choice of both the DFT functional and the solvent polarization scheme has to be consistent with the nature of the studied electronic excitation.