Abbring, J.H.
2009-01-01
We study mixed hitting-time models, which specify durations as the first time a Lévy process (a continuous-time process with stationary and independent increments) crosses a heterogeneous threshold. Such models are of substantial interest because they can be reduced from optimal-stopping models with
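As a minimal illustration of the basic ingredient of such models, the sketch below simulates the first time a drifted Brownian motion (the simplest Lévy process with continuous paths) crosses a fixed threshold. The Euler step size, drift, volatility, and threshold are illustrative choices, not taken from the paper; for positive drift μ the theoretical mean crossing time is threshold/μ.

```python
import math
import random

def first_hitting_time(threshold, drift=1.0, sigma=1.0, dt=1e-3, t_max=50.0):
    """Simulate the first time a drifted Brownian motion crosses `threshold`.

    Returns the crossing time, or None if no crossing occurs before t_max.
    """
    x, t = 0.0, 0.0
    while t < t_max:
        x += drift * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        t += dt
        if x >= threshold:
            return t
    return None

random.seed(0)
times = [first_hitting_time(1.0) for _ in range(2000)]
times = [t for t in times if t is not None]
mean_t = sum(times) / len(times)
# For drift mu = 1 and threshold 1, E[T] = 1.0 up to discretization error.
print(round(mean_t, 2))
```

A heterogeneous threshold, as in the paper, would amount to drawing `threshold` from a distribution per simulated unit.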
Quantum walks with infinite hitting times
International Nuclear Information System (INIS)
Krovi, Hari; Brun, Todd A.
2006-01-01
Hitting times are the average time it takes a walk to reach a given final vertex from a given starting vertex. The hitting time for a classical random walk on a connected graph will always be finite. We show that, by contrast, quantum walks can have infinite hitting times for some initial states. We seek criteria to determine if a given walk on a graph will have infinite hitting times, and find a sufficient condition, which for discrete time quantum walks is that the degeneracy of the evolution operator be greater than the degree of the graph. The set of initial states which give an infinite hitting time forms a subspace. The phenomenon of infinite hitting times is in general a consequence of the symmetry of the graph and its automorphism group. Using the irreducible representations of the automorphism group, we derive conditions such that quantum walks defined on this graph must have infinite hitting times for some initial states. In the case of the discrete walk, if this condition is satisfied the walk will have infinite hitting times for any choice of a coin operator, and we give a class of graphs with infinite hitting times for any choice of coin. Hitting times are not very well defined for continuous time quantum walks, but we show that the idea of infinite hitting-time walks naturally extends to the continuous time case as well.
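The classical baseline in the abstract, that hitting times on a connected graph are always finite, is easy to check by Monte Carlo. The 4-cycle graph and sample size below are illustrative; on a cycle of n nodes the expected hitting time between vertices at distance d is d(n−d).

```python
import random

def hitting_time(adj, start, target, rng):
    """One realization of the classical hitting time on a graph given as an
    adjacency list: number of steps until the walk first reaches `target`."""
    v, steps = start, 0
    while v != target:
        v = rng.choice(adj[v])   # uniform step to a neighbor
        steps += 1
    return steps

# 4-cycle 0-1-2-3-0: expected hitting time from 0 to the opposite vertex 2
# is d * (n - d) = 2 * 2 = 4.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
rng = random.Random(1)
est = sum(hitting_time(cycle, 0, 2, rng) for _ in range(20000)) / 20000
print(round(est, 2))
```

The quantum analogue has no such simulation guarantee: for the initial states identified in the paper, the corresponding quantity diverges.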
Hitting your foothills target the first time
Energy Technology Data Exchange (ETDEWEB)
Ewanek, J. [MI Drilling Fluids Canada, Calgary, AB (Canada); Young, S. [M-I L.L.C., Calgary, AB (Canada)
2001-07-01
As the demand for gas increases, operators are exploring for more long-term gas reserves in the foothills and in more complex structural traps and reservoirs. The high tectonic activity in the foothills has rendered the structural geology complex, making it difficult to hit an exploration target the first time. Costly sidetracking operations are common. The use of oil-based fluids is often necessary for drilling in such technically challenging environments. However, dip/structural evaluation tools such as the Formation Micro Imager (FMI) and the GeoVision 675 Logging While Drilling (LWD) tool cannot be used because of the non-conductive nature of oil-based fluids. Drawing on recent advances in oil-based mud technology, a conductive oil-based fluid was therefore developed and is now available. This new conductive oil-based fluid allows LWD tools to transmit structural information in real time and FMI logs to give detailed structural information while wireline logging the hole. The combination of LWD and FMI data plus a conductive oil-based fluid makes it possible to gather better structural information while drilling. This minimizes sidetracks and leads to a better understanding of the structural geology in that field. It was concluded that the use of this technology will enable better pre-planning of future well sites and will make it possible to reduce costs associated with drilling and oilfield operations in the foothills. 9 refs., 2 tabs., 13 figs.
Time-adaptive quantile regression
DEFF Research Database (Denmark)
Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik
2008-01-01
An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
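The building block of any quantile regression, static or adaptive, is the pinball (check) loss. The toy scan below, with made-up data, shows that its minimizer at tau = 0.5 is the sample median and is robust to an outlier; this illustrates only the underlying loss, not the authors' simplex-based updating algorithm.

```python
def pinball_loss(q, ys, tau):
    """Average pinball (check) loss of a candidate quantile q at level tau."""
    return sum((tau if y >= q else tau - 1) * (y - q) for y in ys) / len(ys)

ys = [1.0, 2.0, 3.0, 4.0, 100.0]  # contains a heavy outlier
# Scan candidate values: the minimizer at tau = 0.5 is the median (3.0),
# unaffected by the outlier, unlike the mean (22.0).
candidates = [x / 10 for x in range(0, 1100)]
best = min(candidates, key=lambda q: pinball_loss(q, ys, 0.5))
print(best)
```

Changing `tau` to, say, 0.9 moves the minimizer to the corresponding upper quantile, which is how quantile regression produces prediction intervals for wind power.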
Multi-hit time-to-amplitude CAMAC module (MTAC)
International Nuclear Information System (INIS)
Kang, H.
1980-10-01
A Multi-Hit Time-to-Amplitude Module (MTAC) for the SLAC Mark III drift chamber system has been designed to measure drift time by converting time-proportional chamber signals into analog levels, and converting the analog data by slow readout via a semi-autonomous controller in a CAMAC crate. The single width CAMAC module has 16 wire channels, each with a 4-hit capacity. An externally generated common start initiates an internal precision ramp voltage which is then sampled using a novel shift register gating scheme and CMOS sampling switches. The detailed design and performance specifications are described
Concentrated Hitting Times of Randomized Search Heuristics with Variable Drift
DEFF Research Database (Denmark)
Lehre, Per Kristian; Witt, Carsten
2014-01-01
Drift analysis is one of the state-of-the-art techniques for the runtime analysis of randomized search heuristics (RSHs) such as evolutionary algorithms (EAs), simulated annealing etc. The vast majority of existing drift theorems yield bounds on the expected value of the hitting time for a target...
Adiabatic condition and the quantum hitting time of Markov chains
International Nuclear Information System (INIS)
Krovi, Hari; Ozols, Maris; Roland, Jeremie
2010-01-01
We present an adiabatic quantum algorithm for the abstract problem of searching marked vertices in a graph, or spatial search. Given a random walk (or Markov chain) P on a graph with a set of unknown marked vertices, one can define a related absorbing walk P′ where outgoing transitions from marked vertices are replaced by self-loops. We build a Hamiltonian H(s) from the interpolated Markov chain P(s) = (1−s)P + sP′ and use it in an adiabatic quantum algorithm to drive an initial superposition over all vertices to a superposition over marked vertices. The adiabatic condition implies that, for any reversible Markov chain and any set of marked vertices, the running time of the adiabatic algorithm is given by the square root of the classical hitting time. This algorithm therefore demonstrates a novel connection between the adiabatic condition and the classical notion of hitting time of a random walk. It also significantly extends the scope of previous quantum algorithms for this problem, which could only obtain a full quadratic speedup for state-transitive reversible Markov chains with a unique marked vertex.
Directory of Open Access Journals (Sweden)
Sharma Neha Gupta
2015-12-01
The J-PET detector being developed at the Jagiellonian University is a positron emission tomograph composed of long strips of polymer scintillators. At the same time, it is a detector system that will be used for studies of the decays of positronium atoms. The shape of the photomultiplier signals depends on the hit time and hit position of the gamma quantum. In order to take advantage of this fact, dedicated sampling front-end electronics, which enables sampling of signals in the voltage domain with a time precision of about 20 ps, and a novel reconstruction method, based on comparison of the examined signal with model signals stored in a library, have been developed. As a measure of similarity, we use the Mahalanobis distance. The achievable position and time resolution depend on the number and values of the threshold levels at which the signal is sampled. The reconstruction method as well as preliminary results are presented and discussed.
Dynamic travel time estimation using regression trees.
2008-10-01
This report presents a methodology for travel time estimation by using regression trees. The dissemination of travel time information has become crucial for effective traffic management, especially under congested road conditions. In the absence of c...
Parameter inference from hitting times for perturbed Brownian motion
Czech Academy of Sciences Publication Activity Database
Tamborrino, M.; Ditlevsen, S.; Lánský, Petr
2015-01-01
Roč. 21, č. 3 (2015), s. 331-352 ISSN 1380-7870 Institutional support: RVO:67985823 Keywords: first passage times * maximum likelihood estimation * Wiener process * degradation process * effect of intervention * survival analysis Subject RIV: BD - Theory of Information Impact factor: 0.810, year: 2015
Parameter inference from hitting times for perturbed Brownian motion
DEFF Research Database (Denmark)
Tamborrino, Massimiliano; Ditlevsen, Susanne; Lansky, Peter
2015-01-01
A latent internal process describes the state of some system, e.g. the social tension in a political conflict, the strength of an industrial component or the health status of a person. When this process reaches a predefined threshold, the process terminates and an observable event occurs, e.g. the political conflict finishes, the industrial component breaks down or the person dies. Imagine an intervention, e.g., a political decision, maintenance of a component or a medical treatment, is initiated to the process before the event occurs. How can we evaluate whether the intervention had an effect? To answer this question we describe the effect of the intervention through parameter changes of the law governing the internal process. Then, the time interval between the start of the process and the final event is divided into two subintervals: the time from the start to the instant of intervention...
An orthogonal-polynomial approach to first-hitting times of birth-death processes
van Doorn, Erik A.
In a recent paper in this journal, Gong, Mao and Zhang, using the theory of Dirichlet forms, extended Karlin and McGregor's classical results on first-hitting times of a birth-death process on the nonnegative integers by establishing a representation for the Laplace transform E[exp(−sT_ij)] of the
An orthogonal-polynomial approach to first-hitting times of birth-death processes
van Doorn, Erik A.
In a recent paper [J. Theor. Probab. 25 (2012) 950-980] Gong, Mao and Zhang, using the theory of Dirichlet forms, extended Karlin and McGregor's classical results on first-hitting times of a birth-death process on the nonnegative integers by establishing a representation for the Laplace transform
Energy Technology Data Exchange (ETDEWEB)
Yokoyama, Akihito, E-mail: yokoyama.akihito@jaea.go.jp [Graduate School of Science and Technology, Gunma University, 1-5-1 Tenjin-cho, Kiryu, Gunma 376-8515 (Japan); Takasaki Advanced Radiation Research Institute (TARRI), Japan Atomic Energy Agency (JAEA), 1233 Watanuki-machi, Takasaki, Gunma 370-1292 (Japan); Kada, Wataru [Graduate School of Science and Technology, Gunma University, 1-5-1 Tenjin-cho, Kiryu, Gunma 376-8515 (Japan); Satoh, Takahiro; Koka, Masashi [Takasaki Advanced Radiation Research Institute (TARRI), Japan Atomic Energy Agency (JAEA), 1233 Watanuki-machi, Takasaki, Gunma 370-1292 (Japan); Shimada, Keisuke; Yokoata, Yuya; Miura, Kenta; Hanaizumi, Osamu [Graduate School of Science and Technology, Gunma University, 1-5-1 Tenjin-cho, Kiryu, Gunma 376-8515 (Japan)
2016-03-15
In this paper, we propose and test a real-time detection system for single-ion hits using mega-electronvolt (MeV) heavy ions. The system was constructed using G2000 and G9 glass scintillators, as well as an electron-multiplying charge-coupled device (EMCCD) camera combined with an inverted microscope with a 10× objective lens. Commercially available G2000 and G9 glass scintillators, which have been reported to exhibit strong photoluminescence at 489, 543, 585, and 622 nm as a result of the Tb³⁺ f-f transition, were employed for highly accurate ionized-particle detection. The EMCCD camera had a resolution of 512 × 512 pixels, each with a size of 16 μm × 16 μm, and a maximum linear gain of 8 × 10⁵ electrons. For 260-MeV Ne, 3 ion hits/s were detected by our system. The intensity of the ionoluminescence (IL) peak induced by the heavy ions was 140 times the noise intensity. The luminous diameter at full width at half maximum (FWHM) in both the horizontal and vertical directions was calculated to be approximately 4.5 μm. These results suggest that our detection system can accurately detect single-ion hits with a diameter of the order of 1 μm.
Hitting times of local and global optima in genetic algorithms with very high selection pressure
Directory of Open Access Journals (Sweden)
Eremeev Anton V.
2017-01-01
The paper is devoted to upper bounds on the expected first hitting times of the sets of local or global optima for non-elitist genetic algorithms with very high selection pressure. The results of this paper extend the range of situations where the upper bounds on the expected runtime are known for genetic algorithms and apply, in particular, to the Canonical Genetic Algorithm. The obtained bounds do not require the probability of fitness-decreasing mutation to be bounded by a constant which is less than one.
A double hit model for the distribution of time to AIDS onset
Chillale, Nagaraja Rao
2013-09-01
Incubation time is a key epidemiologic descriptor of an infectious disease. In the case of HIV infection this is a random variable and is probably the longest one. The probability distribution of incubation time is the major determinant of the relation between the incidence of HIV infection and its manifestation as AIDS. This is also one of the key factors used for accurate estimation of AIDS incidence in a region. The present article i) briefly reviews the work done, points out uncertainties in the estimation of AIDS onset time and stresses the need for its precise estimation, ii) highlights some of the modelling features of the onset distribution, including the immune-failure mechanism, and iii) proposes a 'Double Hit' model for the distribution of time to AIDS onset in the cases of (a) independent and (b) dependent time variables of the two markers, and examines the applicability of a few standard probability models.
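The abstract does not specify the component distributions of the two markers. As a hedged illustration of the independent case only, the sketch below takes both hit times to be exponential, so that onset occurs at the maximum of the two hits; for independent exponentials the mean of the maximum has the closed form 1/r1 + 1/r2 − 1/(r1 + r2), which the simulation can be checked against.

```python
import random

def double_hit_time(rate1, rate2, rng):
    """Illustrative 'double hit' onset time: the event occurs only once both
    markers have been hit, i.e. at the maximum of two independent hit times."""
    return max(rng.expovariate(rate1), rng.expovariate(rate2))

rng = random.Random(3)
n = 50000
mean_onset = sum(double_hit_time(0.5, 0.25, rng) for _ in range(n)) / n
# Closed form: 1/0.5 + 1/0.25 - 1/0.75 = 4.67 (approximately).
print(round(mean_onset, 2))
```

The dependent case (b) in the article would require a joint law for the two hit times and has no comparably simple formula.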
Linear Regression Based Real-Time Filtering
Directory of Open Access Journals (Sweden)
Misel Batmend
2013-01-01
This paper introduces a real-time filtering method based on a least-squares fitted line. The method can be used when the underlying signal is linear, a constraint that narrows the band of potential applications. An advantage over the Kalman filter is that it is computationally less expensive. The paper further deals with the application of the introduced method to filtering data used to evaluate the position of engraved material with respect to the engraving machine. The filter was implemented in the control system of a CNC engraving machine. Experiments showing its performance are included.
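A minimal sketch of the idea, assuming a sliding window and a per-window least-squares line evaluated at the newest sample; the window length and data below are illustrative choices, not taken from the paper.

```python
def line_fit_filter(samples, window=5):
    """Filter a stream by fitting a least-squares line to the last `window`
    samples and returning the line's value at the newest point."""
    out = []
    for i in range(len(samples)):
        ys = samples[max(0, i - window + 1): i + 1]
        n = len(ys)
        xs = list(range(n))
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        denom = n * sxx - sx * sx
        if denom == 0:                # window of one sample: pass through
            out.append(ys[-1])
            continue
        slope = (n * sxy - sx * sy) / denom
        intercept = (sy - slope * sx) / n
        out.append(intercept + slope * xs[-1])   # fitted value at newest sample
    return out

noisy = [0.1, 1.05, 1.9, 3.2, 3.9, 5.1, 6.0, 6.9]  # roughly y = x plus noise
print([round(v, 2) for v in line_fit_filter(noisy)])
```

Each output costs O(window) arithmetic with no matrix operations, which is the computational advantage over a Kalman filter that the abstract points to.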
Drift chamber electronics with multi-hit capability for time and current division measurements
Energy Technology Data Exchange (ETDEWEB)
Manarin, A; Pregernig, L; Rabany, M; Saban, R; Vismara, G
1983-11-15
Drift chambers have been installed for luminosity measurements in intersection 5 of the SPS accelerator operating in proton-antiproton colliding mode. The required electronics is described. The system is able to process up to 16 hits per wire with a double-pulse resolution of 40 ns; drift time and current division are recorded with 1.25 ns and 1.6% resolution, respectively. Transconductance preamplifiers and discriminators are mounted directly on the chamber; 160 m of twisted-pair cable bring the signals to the digitizer unit. Coarse time is measured using RAM techniques, while fine time is obtained by means of a microstrip delay associated with a 100K ECL priority encoder. Current division uses a single 50 MHz flash ADC which allows a 26 dB dynamic range with 6-bit resolution. First operational results are reported.
International Nuclear Information System (INIS)
Jafri, Y.Z.; Kamal, L.
2007-01-01
Various statistical techniques were applied to five-year (1998-2002) data of average humidity, rainfall, and maximum and minimum temperatures. Relationships for regression analysis of time series (RATS) were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination, as a measure of goodness of fit, for our polynomial regression analysis of time series (PRATS). Correlations for multiple linear regression (MLR) and multiple linear regression analysis of time series (MLRATS) were also developed to decipher the interdependence of the weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our polynomial regression (PR) fit. The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on the five-year data of rainfall and humidity, which showed that the variances in the rainfall data were not homogeneous, while those in the humidity data were. Our results on regression and regression analysis of time series show the best fit for prediction modelling of the climatic data of Quetta, Pakistan. (author)
A method to increase optical timing spectra measurement rates using a multi-hit TDC
International Nuclear Information System (INIS)
Moses, W.W.
1993-01-01
A method is presented for using a modern time to digital converter (TDC) to increase the data collection rate for optical timing measurements such as scintillator decay times. It extends the conventional delayed coincidence method, where a synchronization signal ''starts'' a TDC and a photomultiplier tube (PMT) sampling the optical signal ''stops'' the TDC. Data acquisition rates are low with the conventional method because ε, the light collection efficiency of the ''stop'' PMT, is artificially limited to ε∼0.01 photons per ''start'' signal to reduce the probability of detecting more than one photon during the sampling period. With conventional TDCs, these multiple photon events bias the time spectrum since only the first ''stop'' pulse is digitized. The new method uses a modern TDC to detect whether additional ''stop'' signals occur during the sampling period, and actively reject these multiple photon events. This allows ε to be increased to almost 1 photon per ''start'' signal, which maximizes the data acquisition rate at a value nearly 20 times higher. Multi-hit TDCs can digitize the arrival times of n ''stop'' signals per ''start'' signal, which allows ε to be increased to ∼3n/4. While overlap of the ''stop'' signals prevents the full gain in data collection rate from being realized, significant improvements are possible for most applications. (orig.)
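The bias and its removal can be reproduced in a toy Monte Carlo: photons per ''start'' are Poisson with mean ε, decay times are exponential, the conventional method digitizes only the first ''stop'', and the multi-hit method rejects events with more than one photon. All numerical choices below (ε = 0.9, decay constant τ = 10) are illustrative, not from the paper.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for a Poisson-distributed photon count."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def timing_means(eps, n_starts, tau=10.0, seed=7):
    """Mean measured 'stop' time with (a) conventional first-stop timing and
    (b) rejection of multi-photon events. The true decay mean is tau."""
    rng = random.Random(seed)
    first_only, single_only = [], []
    for _ in range(n_starts):
        k = poisson(eps, rng)
        if k == 0:
            continue                              # no photon detected
        arrivals = sorted(rng.expovariate(1.0 / tau) for _ in range(k))
        first_only.append(arrivals[0])            # conventional: first stop only
        if k == 1:
            single_only.append(arrivals[0])       # multi-hit TDC: clean events
    mean = lambda xs: sum(xs) / len(xs)
    return mean(first_only), mean(single_only)

biased, unbiased = timing_means(eps=0.9, n_starts=20000)
print(round(biased, 1), round(unbiased, 1))
```

The first-stop mean comes out well below τ (early times are over-represented), while the single-photon selection recovers a mean close to τ, which is the point of the rejection scheme.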
Real-time regression analysis with deep convolutional neural networks
Huerta, E. A.; George, Daniel; Zhao, Zhizhen; Allen, Gabrielle
2018-01-01
We discuss the development of novel deep learning algorithms to enable real-time regression analysis for time series data. We showcase the application of this new method with a timely case study, and then discuss the applicability of this approach to tackle similar challenges across science domains.
Forecasting daily meteorological time series using ARIMA and regression models
Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir
2018-04-01
The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters methods: the seasonal autoregressive integrated moving-average, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
BOX-COX REGRESSION METHOD IN TIME SCALING
Directory of Open Access Journals (Sweden)
ATİLLA GÖKTAŞ
2013-06-01
Box-Cox regression with power transformations λj, for j = 1, 2, ..., k, can be used when the dependent variable and the error term of a linear regression model do not satisfy the continuity and normality assumptions. We discuss how the smallest mean squared error is obtained when the optimal power transformation λj of Y is chosen. The Box-Cox regression method is especially appropriate for adjusting for skewness or heteroscedasticity of the error terms in a nonlinear functional relationship between the dependent and explanatory variables. In this study, the advantages and disadvantages of using the Box-Cox regression method in differentiation and differential analysis of the time scale concept are discussed.
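For reference, the Box-Cox family itself is simple to state in code; the sample data below are made up to show how the log case (λ = 0) pulls in a right-skewed tail.

```python
import math

def box_cox(y, lam):
    """Box-Cox power transform of a positive observation y.

    lam = 0 is defined as the natural log, the continuous limit of the family.
    """
    if y <= 0:
        raise ValueError("Box-Cox requires positive data")
    return math.log(y) if lam == 0 else (y ** lam - 1) / lam

ys = [1.0, 2.0, 4.0, 8.0, 64.0]  # right-skewed sample
print([round(box_cox(y, 0.0), 2) for y in ys])   # log transform
print([round(box_cox(y, 0.5), 2) for y in ys])   # square-root-like transform
```

In a Box-Cox regression, λ would be chosen (e.g. by profile likelihood or by minimizing mean squared error, as the abstract discusses) rather than fixed in advance.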
DEFF Research Database (Denmark)
Backe, Hans-Joachim
2017-01-01
Daniel Cermak-Sassenrath, Max Alexander Wrighton, Hans-Joachim Backe. Hit Parade. Installation. Kulturnatten 2017, ITU, Copenhagen, DK, Oct 13, 2017.
DEFF Research Database (Denmark)
Cermak, Daniel; Wrighton, Max Alexander; Backe, Hans-Joachim
2016-01-01
Daniel Cermak-Sassenrath, Max Alexander Wrighton, Hans-Joachim Backe. Hit Parade. Installation. Demo Night, ITU, Copenhagen, DK, Oct 5, 2016.
DEFF Research Database (Denmark)
Cermak, Daniel; Wrighton, Max Alexander; Backe, Hans-Joachim
2016-01-01
Daniel Cermak-Sassenrath, Max Alexander Wrighton, Hans-Joachim Backe. Hit Parade. Installation. Kulturnatten 2016, Danish Science Ministry, Copenhagen, DK, Oct 14, 2016.
Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.
Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie
2016-12-01
An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated, real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise-ratio was first increased by wavelet denoising. Action potentials were detected, and features were extracted (spike count, mean absolute values, entropy, and combination of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was made based on a combination of features rather than on a single feature. However, the duration of the window length was not statistically significant (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment compared to environments commonly applied within invasive BCI research, and they showed the feasibility of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.
Outlier detection algorithms for least squares time series regression
DEFF Research Database (Denmark)
Johansen, Søren; Nielsen, Bent
We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator Sat...
Marginal regression analysis of recurrent events with coarsened censoring times.
Hu, X Joan; Rosychuk, Rhonda J
2016-12-01
Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.
Time series regression model for infectious disease and weather.
Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro
2015-10-01
Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
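One of the proposed modifications, using the logarithm of lagged case counts as regressors to control autocorrelation from true contagion, can be sketched as a feature-construction step. The log(y + 1) zero-handling convention below is a common assumption on my part, not stated in the abstract.

```python
import math

def lag_log_features(cases, max_lag=2):
    """Build regression rows of log-transformed lagged case counts,
    log(y + 1) to handle zero counts, for each usable time point."""
    rows = []
    for t in range(max_lag, len(cases)):
        rows.append([math.log(cases[t - k] + 1) for k in range(1, max_lag + 1)])
    return rows

cases = [0, 2, 5, 9, 4, 1]           # illustrative weekly counts
print(lag_log_features(cases))       # one feature row per modelable week
```

These columns would enter a quasi-Poisson or negative binomial regression alongside weather terms and seasonality adjustments, as the article discusses.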
Clarke, Louise
2010-01-01
Issue 9 of the Saatchi Gallery Magazine: Art&Music is dedicated to Sex. The article Dirty Hits invited a cross-section of contemporary artists and musicians to answer: What makes a dirty hit? As one of the artists invited, I wrote an autobiographical piece to reveal how these fumbling, feral sexual experiences of my childhood landscape, along with irrational superstition and folklore, inform my life and underpin my work. The article also included an artwork: Louise Clarke, Sip (2009)
Physics constrained nonlinear regression models for time series
International Nuclear Information System (INIS)
Majda, Andrew J; Harlim, John
2013-01-01
A central issue in contemporary science is the development of data driven statistical nonlinear dynamical models for time series of partial observations of nature or a complex physical model. It has been established recently that ad hoc quadratic multi-level regression (MLR) models can have finite-time blow up of statistical solutions and/or pathological behaviour of their invariant measure. Here a new class of physics constrained multi-level quadratic regression models are introduced, analysed and applied to build reduced stochastic models from data of nonlinear systems. These models have the advantages of incorporating memory effects in time as well as the nonlinear noise from energy conserving nonlinear interactions. The mathematical guidelines for the performance and behaviour of these physics constrained MLR models as well as filtering algorithms for their implementation are developed here. Data driven applications of these new multi-level nonlinear regression models are developed for test models involving a nonlinear oscillator with memory effects and the difficult test case of the truncated Burgers–Hopf model. These new physics constrained quadratic MLR models are proposed here as process models for Bayesian estimation through Markov chain Monte Carlo algorithms of low frequency behaviour in complex physical data. (paper)
Time-localized wavelet multiple regression and correlation
Fernández-Macho, Javier
2018-02-01
This paper extends wavelet methodology to handle comovement dynamics of multivariate time series via moving weighted regression on wavelet coefficients. The concept of wavelet local multiple correlation is used to produce one single set of multiscale correlations along time, in contrast with the large number of wavelet correlation maps that need to be compared when using standard pairwise wavelet correlations with rolling windows. Also, the spectral properties of weight functions are investigated and it is argued that some common time windows, such as the usual rectangular rolling window, are not satisfactory on these grounds. The method is illustrated with a multiscale analysis of the comovements of Eurozone stock markets during this century. It is shown how the evolution of the correlation structure in these markets has been far from homogeneous both along time and across timescales featuring an acute divide across timescales at about the quarterly scale. At longer scales, evidence from the long-term correlation structure can be interpreted as stable perfect integration among Euro stock markets. On the other hand, at intramonth and intraweek scales, the short-term correlation structure has been clearly evolving along time, experiencing a sharp increase during financial crises which may be interpreted as evidence of financial 'contagion'.
Bayesian Travel Time Inversion adopting Gaussian Process Regression
Mauerberger, S.; Holschneider, M.
2017-12-01
A major application in seismology is the determination of seismic velocity models. Travel time measurements put an integral constraint on the velocity between source and receiver. We provide insight into travel time inversion from a correlation-based Bayesian point of view. The concept of Gaussian process regression is adopted to estimate a velocity model. The non-linear travel time integral is approximated by a first-order Taylor expansion. A heuristic covariance describes correlations among observations and the a priori model. That approach enables us to assess a proxy of the Bayesian posterior distribution at ordinary computational cost. No multidimensional numerical integration or extensive sampling is necessary. Instead of stacking the data, we suggest progressively building the posterior distribution. Incorporating only a single evidence at a time accounts for the deficit of linearization. As a result, the most probable model is given by the posterior mean, whereas uncertainties are described by the posterior covariance. As a proof of concept, a synthetic, purely 1-D model is addressed: a single source accompanied by multiple receivers is considered on top of a model comprising a discontinuity. We consider travel times of both phases, direct and reflected wave, corrupted by noise. Left and right of the interface are assumed independent, where the squared exponential kernel serves as covariance.
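A toy version of the underlying machinery, Gaussian process regression in one dimension with a squared exponential kernel, can be written in a few lines. It omits the travel-time linearization and the sequential (one-evidence-at-a-time) updating that the abstract describes; data, length scale, and noise level are illustrative.

```python
import math

def sqexp(a, b, length=1.0):
    """Squared exponential covariance kernel."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior_mean(xs, ys, x_star, noise=1e-6):
    """Posterior mean of a zero-mean GP at x_star given noisy observations."""
    K = [[sqexp(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)              # weights K^{-1} y
    return sum(sqexp(x_star, a) * w for a, w in zip(xs, alpha))

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 0.0]
print(round(gp_posterior_mean(xs, ys, 1.0), 3))  # near-interpolation at a data point
```

With near-zero noise the posterior mean interpolates the observations; the posterior covariance (not computed here) is what the paper uses to quantify model uncertainty.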
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can differ across observation periods of different lengths. If we study the whole period through a sliding window of a short period, the change in the linear regression parameters is a process of dynamic transmission over time. We present a simple and efficient computational scheme, a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of transmission can be captured. The statistical results for weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
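The node/edge construction above can be sketched in a few lines: fit a regression in each sliding window, discretise it into a pattern label, and count transitions between adjacent windows as weighted directed edges. The window size, the (slope sign, significance) pattern bins and the |t| > 2 significance rule are illustrative choices, not the paper's exact intervals.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)
n, w = 400, 30                       # series length, sliding-window size
x = rng.standard_normal(n)
y = 0.8 * x + 0.5 * rng.standard_normal(n)

def window_pattern(xw, yw):
    """Discretise a window's regression into a (slope sign, significance) label."""
    b, a = np.polyfit(xw, yw, 1)             # slope, intercept
    resid = yw - (a + b * xw)
    se = np.sqrt(resid.var(ddof=2) / (len(xw) * xw.var()))
    sig = abs(b / se) > 2.0                  # crude t-test at |t| > 2
    return ('pos' if b >= 0 else 'neg', 'sig' if sig else 'insig')

patterns = [window_pattern(x[i:i + w], y[i:i + w]) for i in range(n - w)]
# transitions between adjacent patterns become weighted directed edges
edges = Counter(zip(patterns[:-1], patterns[1:]))
```

Each key of `edges` is a directed edge (from-pattern, to-pattern) and its count is the edge weight, from which out-degree and betweenness statistics could then be computed.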
Simultaneous hit finding and timing method for pulse shape analysis of drift chamber signals
Energy Technology Data Exchange (ETDEWEB)
Schaile, D; Schaile, O; Schwarz, J
1986-01-01
An algorithm for the analysis of the digitized signal waveform of drift chamber pulses is described which yields a good multihit resolution and an accurate drift time determination with little processing time. The method has been tested and evaluated with measured pulse shapes from the full size prototype of the OPAL central detector which were digitized by 100 MHz FADCs. (orig.).
Accelerated failure time regression for backward recurrence times and current durations
DEFF Research Database (Denmark)
Keiding, N; Fine, J P; Hansen, O H
2011-01-01
Backward recurrence times in stationary renewal processes and current durations in dynamic populations observed at a cross-section may yield estimates of underlying interarrival times or survival distributions under suitable stationarity assumptions. Regression models have been proposed for these situations, but accelerated failure time models have the particularly attractive feature that they are preserved when going from the backward recurrence times to the underlying survival distribution of interest. This simple fact has recently been noticed in a sociological context and is here illustrated by a study of current duration of time to pregnancy.
Prospective versus predictive control in timing of hitting a falling ball.
Katsumata, Hiromu; Russell, Daniel M
2012-02-01
Debate exists as to whether humans use prospective or predictive control to intercept an object falling under gravity (Baurès et al. in Vis Res 47:2982-2991, 2007; Zago et al. in Vis Res 48:1532-1538, 2008). Prospective control involves using continuous information to regulate action. τ, the ratio of the size of the gap to the rate of gap closure, has been proposed as the information used in guiding interceptive actions prospectively (Lee in Ecol Psychol 10:221-250, 1998). This form of control is expected to generate movement modulation, where variability decreases over the course of an action based upon more accurate timing information. In contrast, predictive control assumes that a pre-programmed movement is triggered at an appropriate criterion timing variable. For a falling object it is commonly argued that an internal model of gravitational acceleration is used to predict the motion of the object and determine movement initiation. This form of control predicts fixed duration movements initiated at consistent time-to-contact (TTC), either across conditions (constant criterion operational timing) or within conditions (variable criterion operational timing). The current study sought to test predictive and prospective control hypotheses by disrupting continuous visual information of a falling ball and examining consistency in movement initiation and duration, and evidence for movement modulation. Participants (n = 12) batted a ball dropped from three different heights (1, 1.3 and 1.5 m), under both full-vision and partial occlusion conditions. In the occlusion condition, only the initial ball drop and the final 200 ms of ball flight to the interception point could be observed. The initiation of the swing did not occur at a consistent TTC, τ, or any other timing variable across drop heights, in contrast with previous research. However, movement onset was not impacted by occluding the ball flight for 280-380 ms. This finding indicates that humans did not
Tracking time-varying parameters with local regression
DEFF Research Database (Denmark)
Joensen, Alfred Karsten; Nielsen, Henrik Aalborg; Nielsen, Torben Skov
2000-01-01
This paper shows that the recursive least-squares (RLS) algorithm with forgetting factor is a special case of a varying-coefficient model, and a model which can easily be estimated via simple local regression. This observation allows us to formulate a new method which retains the RLS algorithm but extends it by including polynomial approximations. Simulation results are provided, which indicate that this new method is superior to the classical RLS method if the parameter variations are smooth.
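A minimal sketch of the baseline algorithm the paper builds on, RLS with a forgetting factor, tracking a smoothly drifting slope; the data-generating process and the value of the forgetting factor are illustrative assumptions.

```python
import numpy as np

def rls(X, y, lam=0.97, delta=1000.0):
    """Recursive least squares with forgetting factor lam."""
    p = X.shape[1]
    theta = np.zeros(p)
    P = delta * np.eye(p)                      # large initial covariance
    history = []
    for xt, yt in zip(X, y):
        k = P @ xt / (lam + xt @ P @ xt)       # gain
        theta = theta + k * (yt - xt @ theta)  # estimate update
        P = (P - np.outer(k, xt @ P)) / lam    # covariance update
        history.append(theta.copy())
    return np.array(history)

# slope drifts smoothly from 1 to 2; RLS tracks it with an
# effective memory of about 1 / (1 - lam) observations
rng = np.random.default_rng(3)
n = 500
slope = np.linspace(1.0, 2.0, n)
x = rng.standard_normal(n)
y = slope * x + 0.1 * rng.standard_normal(n)
est = rls(x[:, None], y)
```

The paper's extension replaces this locally constant approximation of the parameter path with local polynomial approximations.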
Technological progress and regress in pre-industrial times
DEFF Research Database (Denmark)
Aiyar, Shekhar; Dalgaard, Carl-Johan Lars; Moav, Omer
2008-01-01
This paper offers micro-foundations for the dynamic relationship between technology and population in the pre-industrial world, accounting for both technological progress and the hitherto neglected but common phenomenon of technological regress. A positive feedback between population and the adoption ... 'Inventions don't just get adopted once and forever; they have to be constantly practised and transmitted, or useful techniques may be forgotten.' (Jared Diamond, Ten Thousand Years of Solitude, 1993)
Woolhandler, Steffie; Himmelstein, David U
2016-01-01
U.S. employment-based health benefits are exempt from income and payroll taxes, an exemption that provided tax subsidies of $326.2 billion in 2015. Both liberal and conservative economists have denounced these subsidies as “regressive” and lauded a provision of the Affordable Care Act—the Cadillac Tax—that would curtail them. The claim that the subsidies are regressive rests on estimates showing that the affluent receive the largest subsidies in absolute dollars. But this claim ignores the standard definition of regressivity, which is based on the share of income paid by the wealthy versus the poor, rather than on dollar amounts. In this study, we calculate the value of tax subsidies in 2009 as a share of income for each income quintile and for the wealthiest Americans. In absolute dollars, tax subsidies were highest for families between the 80th and 95th percentiles of family income and lowest for the poorest 20%. However, as shares of income, subsidies were largest for the middle and fourth income quintiles and smallest for the wealthiest 0.5% of Americans. We conclude that the tax subsidy to employment-based insurance is neither markedly regressive nor progressive. The Cadillac Tax will disproportionately harm families with (2009) incomes between $38,550 and $100,000, while sparing the wealthy.
Correlation, regression, and cointegration of nonstationary economic time series
DEFF Research Database (Denmark)
Johansen, Søren
Yule (1926) introduced the concept of spurious or nonsense correlation and showed by simulation that, for some nonstationary processes, the empirical correlations seem not to converge in probability even if the processes are independent. This was later discussed by Granger and Newbold (1974), and Phillips (1986) found the limit distributions. We propose to distinguish between empirical and population correlation coefficients and show, in a bivariate autoregressive model for nonstationary variables, that the empirical correlation and regression coefficients do not converge to the relevant population values, due to the trending nature of the data. We conclude by giving a simple cointegration analysis of two interest rates. The analysis illustrates that much more insight can be gained about the dynamic behavior of the nonstationary variables than by simply calculating a correlation coefficient.
You Can’t Think and Hit at the Same Time: Neural Correlates of Baseball Pitch Classification
Directory of Open Access Journals (Sweden)
Jason Sherwin
2012-12-01
Hitting a baseball is often described as the most difficult thing to do in sports. A key aptitude of a good hitter is the ability to determine which pitch is coming. This rapid decision requires the batter to make a judgment in a fraction of a second based largely on the trajectory and spin of the ball. When does this decision occur relative to the ball’s trajectory, and is it possible to identify neural correlates that represent how the decision evolves over a split second? Using single-trial analysis of electroencephalography (EEG), we address this question within the context of subjects discriminating three types of pitches (fastball, curveball, slider) based on pitch trajectories. We find clear neural signatures of pitch classification and, using signal detection theory, we identify the times of discrimination on a trial-to-trial basis. Based on these neural signatures we estimate neural discrimination distributions as a function of the distance the ball is from the plate. We find all three pitches yield unique distributions, namely the timing of the discriminating neural signatures relative to the position of the ball in its trajectory. For instance, fastballs are discriminated at the earliest points in their trajectory, relative to the two other pitches, which is consistent with the need for some constant time to generate and execute the motor plan for the swing (or inhibition of the swing). We also find incorrect discrimination of a pitch (errors) yields neural sources in Brodmann Area 10 (BA 10), which has been implicated in prospective memory, recall and task difficulty. In summary, we show that single-trial analysis of EEG yields informative distributions of the relative point in a baseball’s trajectory when the batter makes a decision on which pitch is coming.
Overcoming Spurious Regression Using time-Varying Fourier ...
African Journals Online (AJOL)
Non-stationary time series data have been traditionally analyzed in the frequency domain by assuming constant amplitudes regardless of the timelag. A new approach called time-varying amplitude method (TVAM) is presented here. Oscillations are analyzed for changes in the magnitude of Fourier Coefficients which are ...
A generalized additive regression model for survival times
DEFF Research Database (Denmark)
Scheike, Thomas H.
2001-01-01
Keywords: Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models
Martingale Regressions for a Continuous Time Model of Exchange Rates
Guo, Zi-Yi
2017-01-01
One of the daunting problems in international finance is the weak explanatory power of existing theories of the nominal exchange rates, the so-called “foreign exchange rate determination puzzle”. We propose a continuous-time model to study the impact of order flow on foreign exchange rates. The model is estimated by a newly developed econometric tool based on a time-change sampling from calendar to volatility time. The estimation results indicate that the effect of order flow on exchange rate...
Conditional mode regression: Application to functional time series prediction
Dabo-Niang, Sophie; Laksaci, Ali
2008-01-01
We consider $\alpha$-mixing observations and deal with the estimation of the conditional mode of a scalar response variable $Y$ given a random variable $X$ taking values in a semi-metric space. We provide a convergence rate in the $L^p$ norm for the estimator. A useful and typical application to functional time series prediction is given.
Optimum short-time polynomial regression for signal analysis
Indian Academy of Sciences (India)
A Sreenivasa Murthy
the Proceedings of the European Signal Processing Conference (EUSIPCO) 2008. ... In a seminal paper, Savitzky and Golay [4] showed that short-time polynomial modeling is ... We next consider a linearly frequency-modulated chirp with an exponentially ... http://www.physionet.org/physiotools/matlab/ECGwaveGen/
A Polynomial Time Construction of a Hitting Set for Read-Once Branching Programs of Width 3
Czech Academy of Sciences Publication Activity Database
Šíma, Jiří; Žák, Stanislav
-, subm. 2015 (2018) ISSN 0022-0000 R&D Projects: GA ČR GBP202/12/G061; GA ČR GAP202/10/1333 Institutional support: RVO:67985807 Keywords : derandomization * Hitting Set * read-once branching programs * bounded width Subject RIV: IN - Informatics, Computer Science Impact factor: 1.678, year: 2016
Regression analysis for bivariate gap time with missing first gap time data.
Huang, Chia-Hui; Chen, Yi-Hau
2017-01-01
We consider ordered bivariate gap times when data on the first gap time are unobservable. This study is motivated by the HIV infection and AIDS study, where the initial HIV contraction time is unavailable but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and for the gap time between HIV and AIDS diagnoses. In addition, the association between the two gap times is of interest. Accordingly, in the data analysis we are faced with a two-fold complexity: data on the first gap time are completely missing, and the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap times under this complexity. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large-sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application to data from the HIV and AIDS study mentioned above is reported for illustration.
Ng, Kar Yong; Awang, Norhashidah
2018-01-06
Frequent haze occurrences in Malaysia have made the management of PM10 (particulate matter with aerodynamic diameter less than 10 μm) pollution a critical task. This requires knowledge of the factors associated with PM10 variation and good forecasts of PM10 concentrations. Hence, this paper demonstrates the prediction of 1-day-ahead daily average PM10 concentrations based on predictor variables including meteorological parameters and gaseous pollutants. Three different models were built: a multiple linear regression (MLR) model with lagged predictor variables (MLR1), an MLR model with lagged predictor variables and lagged PM10 concentrations (MLR2), and a regression with time series error (RTSE) model. The findings revealed that humidity, temperature, wind speed, wind direction, carbon monoxide and ozone were the main factors explaining the PM10 variation in Peninsular Malaysia. Comparison among the three models showed that the MLR2 model was on a par with the RTSE model in terms of forecasting accuracy, while the MLR1 model was the worst.
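The MLR2 idea can be sketched as regressing next-day PM10 on lagged predictors plus lagged PM10 itself. The generating process, the two meteorological stand-ins (wind, temperature) and all coefficients below are invented for illustration, not estimated from the Malaysian data.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
wind = rng.standard_normal(n)
temp = rng.standard_normal(n)
pm10 = np.empty(n)
pm10[0] = 50.0
for t in range(1, n):                     # persistent synthetic pollutant series
    pm10[t] = 30.0 + 0.6 * pm10[t - 1] - 3.0 * wind[t - 1] \
              + 2.0 * temp[t - 1] + 5.0 * rng.standard_normal()

# design matrix: lagged predictors and lagged PM10 -> next-day PM10
X = np.column_stack([np.ones(n - 1), pm10[:-1], wind[:-1], temp[:-1]])
y = pm10[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ beta                            # 1-day-ahead fitted values
```

Dropping the `pm10[:-1]` column gives the weaker MLR1 variant, which discards the persistence information that makes MLR2 competitive with the time-series-error model.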
Hahn, Andrew D; Rowe, Daniel B
2012-02-01
As more evidence is presented suggesting that the phase, as well as the magnitude, of functional MRI (fMRI) time series may contain important information and that there are theoretical drawbacks to modeling functional response in the magnitude alone, removing noise in the phase is becoming more important. Previous studies have shown that retrospective correction of noise from physiologic sources can remove significant phase variance and that dynamic main magnetic field correction and regression of estimated motion parameters also remove significant phase fluctuations. In this work, we investigate the performance of physiologic noise regression in a framework along with correction for dynamic main field fluctuations and motion regression. Our findings suggest that including physiologic regressors provides some benefit in terms of reduction in phase noise power, but it is small compared to the benefit of dynamic field corrections and use of estimated motion parameters as nuisance regressors. Additionally, we show that the use of all three techniques reduces phase variance substantially, removes undesirable spatial phase correlations and improves detection of the functional response in magnitude and phase. Copyright © 2011 Elsevier Inc. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Winkelmann, Tim, E-mail: tim.winkelmann@med.uni-heidelberg.de; Cee, Rainer; Haberer, Thomas; Naas, Bernd; Peters, Andreas; Schreiner, Jochen [Heidelberger Ionenstrahl-Therapie Centrum (HIT), D -69120 Heidelberg (Germany)
2014-02-15
The clinical operation at the Heidelberg Ion Beam Therapy Center (HIT) started in November 2009; since then more than 1600 patients have been treated. In a 24/7 operation scheme two 14.5 GHz electron cyclotron resonance ion sources are routinely used to produce protons and carbon ions. The modification of the low energy beam transport line and the integration of a third ion source into the therapy facility will be shown. In the last year we implemented a new extraction system at all three sources to enhance the lifetime of extraction parts and reduce preventive and corrective maintenance. The new four-electrode design provides electron suppression as well as lower beam emittance. Unwanted beam sputtering effects, which typically lead to contamination of the insulator ceramics and subsequent high-voltage breakdowns, are minimized by the beam guidance of the new extraction system. By this measure the service interval can be increased significantly. As a side effect, the beam emittance can be reduced, allowing a less challenging working point for the ion sources without reducing the effective beam performance. This paper also gives an outlook on further enhancements at the HIT ion source testbench.
Time course for tail regression during metamorphosis of the ascidian Ciona intestinalis.
Matsunobu, Shohei; Sasakura, Yasunori
2015-09-01
In most ascidians, the tadpole-like swimming larvae dramatically change their body-plans during metamorphosis and develop into sessile adults. The mechanisms of ascidian metamorphosis have been researched and debated for many years. Until now information on the detailed time course of the initiation and completion of each metamorphic event has not been described. One dramatic and important event in ascidian metamorphosis is tail regression, in which ascidian larvae lose their tails to adjust themselves to sessile life. In the present study, we measured the time associated with tail regression in the ascidian Ciona intestinalis. Larvae are thought to acquire competency for each metamorphic event in certain developmental periods. We show that the timing with which the competence for tail regression is acquired is determined by the time since hatching, and this timing is not affected by the timing of post-hatching events such as adhesion. Because larvae need to adhere to substrates with their papillae to induce tail regression, we measured the duration for which larvae need to remain adhered in order to initiate tail regression and the time needed for the tail to regress. Larvae acquire the ability to adhere to substrates before they acquire tail regression competence. We found that when larvae adhered before they acquired tail regression competence, they were able to remember the experience of adhesion until they acquired the ability to undergo tail regression. The time course of the events associated with tail regression provides a valuable reference, upon which the cellular and molecular mechanisms of ascidian metamorphosis can be elucidated. Copyright © 2015 Elsevier Inc. All rights reserved.
Tightness of M-estimators for multiple linear regression in time series
DEFF Research Database (Denmark)
Johansen, Søren; Nielsen, Bent
We show tightness of a general M-estimator for multiple linear regression in time series. The positive criterion function for the M-estimator is assumed lower semi-continuous and sufficiently large for large argument: Particular cases are the Huber-skip and quantile regression. Tightness requires...
Schryver, T. de; Eisinga, R.
2010-01-01
The key question in research on dismissals of head coaches in sports clubs is not whether they should happen but when they will happen. This paper applies piecewise linear regression to advance our understanding of the timing of head coach dismissals. Essentially, the regression sacrifices degrees
Fuzzy Linear Regression for the Time Series Data which is Fuzzified with SMRGT Method
Directory of Open Access Journals (Sweden)
Seçil YALAZ
2016-10-01
Our work on regression and classification provides a new contribution to the analysis of time series, which have been used in many areas for years. When convergence cannot be obtained with the methods used to correct autocorrelation in time series regression, the analysis either fails or one is forced to change the model's degree, which may not be desirable in every situation. In our study, recommended for these situations, time series data were fuzzified using the simple membership function and fuzzy rule generation technique (SMRGT), and an equation for forecasting was created by applying the fuzzy least squares regression (FLSR) method, a simple linear regression method, to these data. Although SMRGT has been successful in determining flow discharge in open channels and can be used confidently for flow discharge modeling in open canals, as well as in pipe flow with some modifications, there is no evidence that the technique is successful in fuzzy linear regression modeling. Therefore, to address the lack of such a model, a new hybrid model is described in this study. Finally, to demonstrate our method's efficiency, classical linear regression for time series data and linear regression for fuzzy time series data were applied to two different data sets, and the performance of the two approaches was compared using different measures.
Directory of Open Access Journals (Sweden)
Hsin-Lun Wu
Although procedure time analyses are important for operating room management, it is not easy to extract useful information from clinical procedure time data. A novel approach was proposed to analyze procedure time during anesthetic induction. A two-step regression analysis was performed to explore influential factors of anesthetic induction time (AIT). Linear regression with stepwise model selection was used to select significant correlates of AIT, and quantile regression was then employed to illustrate the dynamic relationships between AIT and the selected variables at distinct quantiles. A total of 1,060 patients were analyzed. First- and second-year residents (R1-R2) required longer AIT than third- and fourth-year residents and attending anesthesiologists (p = 0.006). Factors prolonging AIT included American Society of Anesthesiologists physical status ≥ III; arterial, central venous and epidural catheterization; and use of bronchoscopy. Presence of the surgeon before induction decreased AIT (p < 0.001). Type of surgery also had a significant influence on AIT. Quantile regression satisfactorily estimated the extra time needed to complete induction for each influential factor at distinct quantiles. Our analysis of AIT demonstrated the benefit of quantile regression analysis in providing a more comprehensive view of the relationships between procedure time and related factors. This novel two-step regression approach has potential applications to procedure time analysis in operating room management.
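The second step of the two-step approach can be sketched as quantile regression at two quantiles of a heteroscedastic outcome, showing how a factor's effect differs between fast and slow cases. The pinball-loss subgradient fit below is a minimal stand-in for a proper quantile-regression solver, and the data are illustrative, not the clinical data.

```python
import numpy as np

def quantile_fit(X, y, q, lr=0.05, steps=3000):
    """Fit a linear conditional quantile by subgradient descent on the pinball loss."""
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        r = y - X @ beta
        grad = -X.T @ np.where(r > 0, q, q - 1.0) / len(y)
        beta -= lr * grad
    return beta

# heteroscedastic toy outcome: the spread grows with x, so the factor's
# effect is larger at the upper quantile than at the lower one
rng = np.random.default_rng(5)
n = 1000
x = rng.uniform(0.0, 1.0, n)
y = 5.0 + 4.0 * x + (1.0 + 2.0 * x) * rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
b_lo = quantile_fit(X, y, 0.1)   # 10th-percentile fit
b_hi = quantile_fit(X, y, 0.9)   # 90th-percentile fit
```

Comparing `b_lo[1]` and `b_hi[1]` mirrors the paper's use of quantile regression: the same covariate can add far more time to the slowest inductions than to the fastest ones.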
Krishnan, M.; Bhowmik, B.; Hazra, B.; Pakrashi, V.
2018-02-01
In this paper, a novel baseline-free approach for continuous online damage detection of multi-degree-of-freedom vibrating structures using Recursive Principal Component Analysis (RPCA) in conjunction with Time-Varying Auto-Regressive (TVAR) modeling is proposed. In this method, the acceleration data are used to obtain recursive proper orthogonal components online using the rank-one perturbation method, followed by TVAR modeling of the first transformed response, to detect the change in the dynamic behavior of the vibrating system from its pristine state to contiguous linear/non-linear states that indicate damage. Most of the works available in the literature deal with algorithms that require windowing of the gathered data owing to their data-driven nature, which renders them ineffective for online implementation. A mathematically consistent recursive technique in a rigorous theoretical framework for structural damage detection is missing, which motivates the development of the present framework, which is amenable to online implementation and can be utilized alongside suitable experimental and numerical investigations. The RPCA algorithm iterates the eigenvector and eigenvalue estimates for the sample covariance matrices and the new data point at each successive time instant, using the rank-one perturbation method. TVAR modeling on the principal component explaining maximum variance is utilized, and damage is identified by tracking the TVAR coefficients. This eliminates the need for offline post-processing and facilitates online damage detection, especially when applied to streaming data, without requiring any baseline data. Numerical simulations performed on a 5-dof nonlinear system under white noise excitation and El Centro (the 1940 Imperial Valley earthquake) excitation, for different damage scenarios, demonstrate the robustness of the proposed algorithm. The method is further validated on results obtained from case studies involving
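A compact sketch of the pipeline on synthetic 2-channel data: exponentially forgetting covariance updates, projection onto the leading principal component, and recursive tracking of TVAR(2) coefficients, with a mid-stream frequency shift standing in for damage. The paper uses rank-one eigen-perturbation updates; the full 2x2 eigendecomposition below is a simpler stand-in, and all signal parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2000
freq = np.where(np.arange(n) < n // 2, 0.10, 0.15)  # "damage": frequency shift
phase = np.cumsum(2 * np.pi * freq)
acc = np.outer(np.sin(phase), [1.0, 0.5]) + 0.05 * rng.standard_normal((n, 2))

lam = 0.99                          # forgetting factor
C = np.eye(2)                       # recursive covariance estimate
ar = np.zeros(2)                    # TVAR(2) coefficients of the leading PC
P = 100.0 * np.eye(2)
pc, coeffs = [0.0, 0.0], []
for t in range(n):
    C = lam * C + (1 - lam) * np.outer(acc[t], acc[t])
    v = np.linalg.eigh(C)[1][:, -1]
    if v[0] < 0:                    # keep eigenvector sign consistent
        v = -v
    y = v @ acc[t]                  # projection on the leading component
    phi = np.array([pc[-1], pc[-2]])
    k = P @ phi / (lam + phi @ P @ phi)
    ar = ar + k * (y - phi @ ar)            # RLS update of TVAR coefficients
    P = (P - np.outer(k, phi @ P)) / lam
    pc.append(y)
    coeffs.append(ar.copy())
coeffs = np.array(coeffs)
# for a sinusoid the first TVAR coefficient is about 2*cos(2*pi*f),
# so a frequency rise shows up as a drop in that tracked coefficient
```

No baseline record is needed: the change in the tracked coefficients is itself the damage indicator, which is the point of the baseline-free formulation.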
Directory of Open Access Journals (Sweden)
Jun Bi
2018-04-01
Battery electric vehicles (BEVs) reduce energy consumption and air pollution compared with conventional vehicles. However, the limited driving range and potentially long charging time of BEVs create new problems. Accurate charging time prediction for BEVs helps drivers determine travel plans and alleviates their range anxiety during trips. This study proposed a combined model for charging time prediction based on regression and time-series methods, using actual data from BEVs operating in Beijing, China. After data analysis, a regression model considering the charged amount was established for charging time prediction. Furthermore, a time-series method was adopted to calibrate the regression model, which significantly improved the fitting accuracy of the model. The parameters of the model were determined using the actual data. Verification results confirmed the accuracy of the model and showed that the model errors were small. The proposed model can accurately depict the charging time characteristics of BEVs in Beijing.
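The combined idea can be sketched as a regression of charging time on charged amount, followed by an AR(1) model on the regression residuals that calibrates the one-step-ahead prediction. All coefficients and the synthetic data are invented for the sketch, not estimated from the Beijing fleet data.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 400
charged = rng.uniform(5.0, 40.0, n)          # kWh charged per session
u = np.zeros(n)
for t in range(1, n):                        # autocorrelated model error
    u[t] = 0.7 * u[t - 1] + rng.standard_normal()
minutes = 10.0 + 2.5 * charged + u           # "observed" charging time

# step 1: regression of charging time on the charged amount
A = np.column_stack([np.ones(n), charged])
beta, *_ = np.linalg.lstsq(A, minutes, rcond=None)
resid = minutes - A @ beta

# step 2: AR(1) time-series calibration of the residuals
rho = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
pred = A @ beta
pred[1:] += rho * resid[:-1]                 # one-step-ahead correction
```

Because the residuals are persistent, carrying a fraction `rho` of the last residual forward shrinks the prediction error relative to the plain regression.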
Tippett, Elizabeth C; Chen, Brian K
2015-12-01
Attorneys sponsor television advertisements that include repeated warnings about adverse drug events to solicit consumers for lawsuits against drug manufacturers. The relationship between such advertising, safety actions by the US Food and Drug Administration (FDA), and healthcare use is unknown. To investigate the relationship between attorney advertising, FDA actions, and prescription drug claims. The study examined total users per month and prescription rates for seven drugs with substantial attorney advertising volume and FDA or other safety interventions during 2009. Segmented regression analysis was used to detect pre-intervention trends, post-intervention level changes, and changes in post-intervention trends relative to the pre-intervention trends in the use of these seven drugs, using advertising volume, media hits, and the number of Medicare enrollees as covariates. Data for these variables were obtained from the Center for Medicare and Medicaid Services, Kantar Media, and LexisNexis. Several types of safety actions were associated with reductions in drug users and/or prescription rates, particularly for fentanyl, varenicline, and paroxetine. In most cases, attorney advertising volume rose in conjunction with major safety actions. Attorney advertising volume was positively correlated with prescription rates in five of seven drugs, likely because advertising volume began rising before safety actions, when prescription rates were still increasing. On the other hand, attorney advertising had mixed associations with the number of users per month. Regulatory and safety actions likely reduced the number of users and/or prescription rates for some drugs. Attorneys may have strategically chosen to begin advertising adverse drug events prior to major safety actions, but we found little evidence that attorney advertising reduced drug use. Further research is needed to better understand how consumers and physicians respond to attorney advertising.
Phase Space Prediction of Chaotic Time Series with Nu-Support Vector Machine Regression
International Nuclear Information System (INIS)
Ye Meiying; Wang Xiaodong
2005-01-01
A new class of support vector machine, the nu-support vector machine, is discussed, which can handle both classification and regression. We focus on nu-support vector machine regression and use it for phase space prediction of chaotic time series. The effectiveness of the method is demonstrated by applying it to the Henon map. This study also compares the nu-support vector machine with back propagation (BP) networks in order to better evaluate the performance of the proposed method. The experimental results show that nu-support vector machine regression obtains a lower root mean squared error than the BP networks and provides accurate chaotic time series prediction. These results can be attributed to the fact that the nu-support vector machine implements the structural risk minimization principle, which leads to better generalization than the BP networks.
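A sketch of the experiment with scikit-learn's `NuSVR`: generate the Henon map, build a delay embedding, and predict x[t] from (x[t-1], x[t-2]). The hyperparameters (nu, C, gamma) and the train/test split are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.svm import NuSVR

# generate the Henon map (a = 1.4, b = 0.3)
x = np.empty(2000)
x[0] = x[1] = 0.1
for t in range(2, 2000):
    x[t] = 1.0 - 1.4 * x[t - 1] ** 2 + 0.3 * x[t - 2]

# delay embedding: predict x[t] from (x[t-1], x[t-2])
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]
model = NuSVR(nu=0.5, C=10.0, gamma=1.0).fit(X[:1500], y[:1500])
rmse = np.sqrt(np.mean((model.predict(X[1500:]) - y[1500:]) ** 2))
```

Because the Henon update is a smooth function of the two-lag state, a kernel regressor trained on the embedded pairs recovers the one-step map with a small out-of-sample RMSE.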
Time series modeling by a regression approach based on a latent process.
Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice
2009-01-01
Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method, performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
Cruz, Antonio M; Barr, Cameron; Puñales-Pozo, Elsa
2008-01-01
This research's main goals were to build a predictor for estimating the values of a turnaround time (TAT) indicator and to use a numerical clustering technique for finding possible causes of undesirable TAT values. The following stages were used: domain understanding, data characterisation and sample reduction, and insight characterisation. Multiple linear regression and clustering techniques were used to build the TAT indicator predictor and to improve corrective maintenance task efficiency in a clinical engineering department (CED). The indicator being studied was turnaround time (TAT). Multiple linear regression was used for building a predictive TAT value model. The variables contributing to such a model were clinical engineering department response time (CE(rt), 0.415 positive coefficient), stock service response time (Stock(rt), 0.734 positive coefficient), priority level (0.21 positive coefficient) and service time (0.06 positive coefficient). The regression process showed heavy reliance on Stock(rt), CE(rt) and priority, in that order. Clustering techniques revealed the main causes of high TAT values. This examination has provided a means for analysing current technical service quality and effectiveness. In doing so, it has demonstrated a process for identifying areas and methods of improvement and a model against which to analyse these methods' effectiveness.
Time-based cluster and hit finding for the STS detector in the CBM experiment at FAIR
Energy Technology Data Exchange (ETDEWEB)
Kozlov, Grigory [Goethe University, Frankfurt am Main (Germany); Frankfurt Institute for Advanced Studies, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Joint Institute for Nuclear Research, Dubna (Russian Federation); Kisel, Ivan [Goethe University, Frankfurt am Main (Germany); Frankfurt Institute for Advanced Studies, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Collaboration: CBM-Collaboration
2016-07-01
The goal of the future CBM experiment at FAIR is to explore the QCD phase diagram in the region of high baryon densities using high-energy nucleus-nucleus collisions. An important feature of the experiment is real-time reconstruction and physics analysis, which will allow important events to be selected immediately after the collision and increase the quality of the data. In this case, the data are supplied for processing in the form of time slices containing a large number of collisions. Preprocessing of the time-based results requires special algorithms that take into account not only the coordinates, but also the time of flight of each particle. A clustering algorithm for the STS detector has been designed and integrated into the CBMROOT framework. It enables data processing with high efficiency for time slices of any length at interaction rates of 10^7 collisions per second and above. The algorithm is fast and can operate in event-based as well as time-based mode.
A fast online hit verification method for the single ion hit system at GSI
International Nuclear Information System (INIS)
Du, G.; Fischer, B.; Barberet, P.; Heiss, M.
2006-01-01
For a single ion hit facility built to irradiate specific targets inside biological cells, it is necessary to prove that the ions hit the selected targets reliably because the ion hits usually cannot be seen. That ability is traditionally tested either indirectly by aiming at pre-etched tracks in a nuclear track detector or directly by making the ion tracks inside cells visible using a stain coupled to special proteins produced in response to ion hits. However, both methods are time consuming and hits can be verified only after the experiment. This means that targeting errors in the experiment cannot be corrected during the experiment. Therefore, we have developed a fast online hit verification method that measures the targeting accuracy electronically with a spatial resolution of ±1 μm before cell irradiation takes place. (authors)
Nedley Depression Hit Hypothesis
Nedley, Neil; Ramirez, Francisco E.
2014-01-01
Depression is often diagnosed using the Diagnostic and Statistical Manual of Mental Disorders Fifth Edition (DSM-5) criteria. We propose how certain lifestyle choices and non-modifiable factors can predict the development of depression. We identified 10 cause categories (hits or "blows" to the brain) and theorize that four or more active hits could trigger a depression episode. Methods. A sample of 4271 participants from our community-based program (70% female; ages 17-94 years) was assessed ...
Replica analysis of overfitting in regression models for time-to-event data
Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.
2017-09-01
Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), the literature offers only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.
Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling.
Edelman, Eric R; van Kuijk, Sander M J; Hamaekers, Ankie E W; de Korte, Marcel J M; van Merode, Godefridus G; Buhre, Wolfgang F F A
2017-01-01
For efficient utilization of operating rooms (ORs), accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT) per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT) and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA) physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT). We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT). TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity related benefits.
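The fixed-ratio baseline (TPT ≈ 1.33 × eSCT) and a simple regression alternative can be contrasted on synthetic data; the case counts, coefficients, and noise levels below are illustrative assumptions, not values from the Dutch benchmarking database.

```python
import numpy as np

# Compare the fixed-ratio model (TPT = 1.33 * eSCT) with a linear regression
# of TPT on eSCT, on synthetic surgery records.
rng = np.random.default_rng(1)
n = 500
esct = rng.uniform(30, 240, n)                 # surgeon-controlled time (min)
act = 20 + 0.25 * esct + rng.normal(0, 5, n)   # anesthesia-controlled time (synthetic)
tpt = esct + act                               # total procedure time

pred_fixed = 1.33 * esct                       # fixed-ratio prediction

X = np.column_stack([np.ones(n), esct])        # regression prediction
beta, *_ = np.linalg.lstsq(X, tpt, rcond=None)
pred_reg = X @ beta

def rmse(p):
    return np.sqrt(np.mean((p - tpt) ** 2))

print(rmse(pred_fixed), rmse(pred_reg))
```

Because the regression estimates an intercept and slope from the data rather than assuming a single multiplier, it absorbs any systematic offset in ACT that the fixed ratio cannot.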
Stagewise pseudo-value regression for time-varying effects on the cumulative incidence
DEFF Research Database (Denmark)
Zöller, Daniela; Schmidtmann, Irene; Weinmann, Arndt
2016-01-01
In a competing risks setting, the cumulative incidence of an event of interest describes the absolute risk for this event as a function of time. For regression analysis, one can either choose to model all competing events by separate cause-specific hazard models or directly model the association...... for time-varying effects. This is implemented by coupling variable selection between the grid times, but determining estimates separately. The effect estimates are regularized to also allow for model fitting with a low to moderate number of observations. This technique is illustrated in an application...
Constructive Technology Assessment for HIT development
DEFF Research Database (Denmark)
Høstgaard, Anna Marie Balling; Bertelsen, Pernille; Petersen, Lone Stub
2013-01-01
Experience and time have shown a need for new evaluation methods for evaluating Health Information Technology (HIT), as summative evaluation methods fail to accommodate the rapid and constant changes in HIT over time and to involve end-users, which has been recognized as an important success facto...... during all the phases in the process. Thereby a number of problems were prevented from occurring later on. Thus, the CTA method and its framework are useful for evaluators and project-management in order to facilitate and support successful HIT development....
Wilson, Barry T.; Knight, Joseph F.; McRoberts, Ronald E.
2018-03-01
Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several methods have previously been developed for use with finer temporal resolution imagery (e.g. AVHRR and MODIS), including image compositing and harmonic regression using Fourier series. The manuscript presents a study, using Minnesota, USA during the years 2009-2013 as the study area and timeframe. The study examined the relative predictive power of land cover models, in particular those related to tree cover, using predictor variables based solely on composite imagery versus those using estimated harmonic regression coefficients. The study used two common non-parametric modeling approaches (i.e. k-nearest neighbors and random forests) for fitting classification and regression models of multiple attributes measured on USFS Forest Inventory and Analysis plots using all available Landsat imagery for the study area and timeframe. The estimated Fourier coefficients developed by harmonic regression of tasseled cap transformation time series data were shown to be correlated with land cover, including tree cover. Regression models using estimated Fourier coefficients as predictor variables showed a two- to threefold increase in explained variance for a small set of continuous response variables, relative to comparable models using monthly image composites. Similarly, the overall accuracies of classification models using the estimated Fourier coefficients were approximately 10-20 percentage points higher than the models using the image composites, with corresponding individual class accuracies between six and 45 percentage points higher.
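A first-order harmonic regression of the kind used to derive the Fourier-coefficient predictors can be sketched as an ordinary least-squares fit at irregular acquisition times; the signal below is synthetic, not Landsat tasseled cap data.

```python
import numpy as np

# First-order harmonic regression: intercept + annual cosine/sine terms,
# fitted by least squares at irregular observation times.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 3, 120))            # acquisition times in years
y = (0.40 + 0.25 * np.cos(2 * np.pi * t)
     + 0.10 * np.sin(2 * np.pi * t)
     + rng.normal(0, 0.02, t.size))            # synthetic index series

X = np.column_stack([np.ones_like(t), np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # close to [0.40, 0.25, 0.10]
```

Unlike compositing, this fit uses every cloud-free observation regardless of when it was acquired, and the estimated coefficients summarize the mean level and seasonal amplitude/phase that then serve as model predictors.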
Longitudinal beta regression models for analyzing health-related quality of life scores over time
Directory of Open Access Journals (Sweden)
Hunger Matthias
2012-09-01
Full Text Available Abstract Background Health-related quality of life (HRQL) has become an increasingly important outcome parameter in clinical trials and epidemiological research. HRQL scores are typically bounded at both ends of the scale and often highly skewed. Several regression techniques have been proposed to model such data in cross-sectional studies, however, methods applicable in longitudinal research are less well researched. This study examined the use of beta regression models for analyzing longitudinal HRQL data using two empirical examples with distributional features typically encountered in practice. Methods We used SF-6D utility data from a German older age cohort study and stroke-specific HRQL data from a randomized controlled trial. We described the conceptual differences between mixed and marginal beta regression models and compared both models to the commonly used linear mixed model in terms of overall fit and predictive accuracy. Results At any measurement time, the beta distribution fitted the SF-6D utility data and stroke-specific HRQL data better than the normal distribution. The mixed beta model showed better likelihood-based fit statistics than the linear mixed model and respected the boundedness of the outcome variable. However, it tended to underestimate the true mean at the upper part of the distribution. Adjusted group means from marginal beta model and linear mixed model were nearly identical but differences could be observed with respect to standard errors. Conclusions Understanding the conceptual differences between mixed and marginal beta regression models is important for their proper use in the analysis of longitudinal HRQL data. Beta regression fits the typical distribution of HRQL data better than linear mixed models, however, if focus is on estimating group mean scores rather than making individual predictions, the two methods might not differ substantially.
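A minimal beta regression with a logit mean link can be fitted by direct maximum likelihood; this is a generic cross-sectional sketch on synthetic data, not the mixed or marginal longitudinal models evaluated in the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

# Beta regression with logit mean link: y ~ Beta(mu*phi, (1-mu)*phi),
# mu = expit(X @ b). Fitted by direct maximum likelihood.
rng = np.random.default_rng(3)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
mu_true = expit(0.5 + 1.0 * x)
phi_true = 30.0
y = rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true)

def negloglik(params):
    b, phi = params[:2], np.exp(params[2])   # phi kept positive via log scale
    mu = expit(X @ b)
    a, c = mu * phi, (1 - mu) * phi
    return -np.sum(gammaln(phi) - gammaln(a) - gammaln(c)
                   + (a - 1) * np.log(y) + (c - 1) * np.log1p(-y))

fit = minimize(negloglik, x0=np.array([0.0, 0.0, np.log(10.0)]), method="BFGS")
print(fit.x[:2])  # close to the true coefficients [0.5, 1.0]
```

The Beta(mu·phi, (1−mu)·phi) parameterisation keeps predictions inside (0, 1), which is the boundedness property the abstract highlights as the advantage over linear mixed models.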
Online Support Vector Regression with Varying Parameters for Time-Dependent Data
International Nuclear Information System (INIS)
Omitaomu, Olufemi A.; Jeong, Myong K.; Badiru, Adedeji B.
2011-01-01
Support vector regression (SVR) is a machine learning technique that continues to receive interest in several domains, including manufacturing, engineering, and medicine. In order to extend its application to problems in which datasets arrive constantly and in which batch processing of the datasets is infeasible or expensive, an accurate online support vector regression (AOSVR) technique was proposed. The AOSVR technique efficiently updates a trained SVR function whenever a sample is added to or removed from the training set, without retraining on the entire training data. However, the AOSVR technique assumes that the new samples and the training samples are of the same characteristics; hence, the same SVR parameter values are used for training and prediction. This assumption is not applicable to data samples that are inherently noisy and non-stationary, such as sensor data. As a result, we propose Accurate On-line Support Vector Regression with Varying Parameters (AOSVR-VP), which uses varying SVR parameters rather than fixed ones and hence accounts for the variability that may exist in the samples. To accomplish this objective, we also propose a generalized weight function to automatically update the weights of SVR parameters in on-line monitoring applications. The proposed function allows for lower and upper bounds for SVR parameters. We tested our proposed approach and compared results with the conventional AOSVR approach using two benchmark time series datasets and sensor data from a nuclear power plant. The results show that using varying SVR parameters is more applicable to time-dependent data.
Lee, Eunjee; Zhu, Hongtu; Kong, Dehan; Wang, Yalin; Giovanello, Kelly Sullivan; Ibrahim, Joseph G
2015-12-01
The aim of this paper is to develop a Bayesian functional linear Cox regression model (BFLCRM) with both functional and scalar covariates. This new development is motivated by establishing the likelihood of conversion to Alzheimer's disease (AD) in 346 patients with mild cognitive impairment (MCI) enrolled in the Alzheimer's Disease Neuroimaging Initiative 1 (ADNI-1) and the early markers of conversion. These 346 MCI patients were followed over 48 months, with 161 MCI participants progressing to AD at 48 months. The functional linear Cox regression model was used to establish that functional covariates including hippocampus surface morphology and scalar covariates including brain MRI volumes, cognitive performance (ADAS-Cog), and APOE status can accurately predict time to onset of AD. Posterior computation proceeds via an efficient Markov chain Monte Carlo algorithm. A simulation study is performed to evaluate the finite sample performance of BFLCRM.
Visser, H.; Molenaar, J.
1995-05-01
The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding with the explanatory variables may be time dependent, and this assumption can be validated within the model. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal, induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of
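The stochastic-trend component can be illustrated with a minimal scalar Kalman filter for a local-level model; this is a simplified sketch on synthetic data, not the full trend-regression model with ARIMA trend and explanatory variables.

```python
import numpy as np

# Scalar Kalman filter for a local-level (stochastic trend) model:
#   level[t] = level[t-1] + w,   y[t] = level[t] + v.
rng = np.random.default_rng(7)
n = 300
level = np.cumsum(rng.normal(0, 0.1, n))       # true stochastic trend
y = level + rng.normal(0, 1.0, n)              # noisy observations

q, r = 0.1 ** 2, 1.0 ** 2                      # state / observation variances
m, p = 0.0, 10.0                               # initial state mean and variance
est = np.empty(n)
for i in range(n):
    p += q                                     # time update (predict)
    k = p / (p + r)                            # Kalman gain
    m += k * (y[i] - m)                        # measurement update
    p *= 1 - k
    est[i] = m

rmse_filt = np.sqrt(np.mean((est - level) ** 2))
rmse_raw = np.sqrt(np.mean((y - level) ** 2))
print(rmse_filt, rmse_raw)
```

The filtered level tracks the underlying trend much more closely than the raw observations, which is what makes the filter suitable for separating a stochastic trend from natural variability.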
Liu, Shelley H; Bobb, Jennifer F; Lee, Kyu Ha; Gennings, Chris; Claus Henn, Birgit; Bellinger, David; Austin, Christine; Schnaas, Lourdes; Tellez-Rojo, Martha M; Hu, Howard; Wright, Robert O; Arora, Manish; Coull, Brent A
2018-07-01
The impact of neurotoxic chemical mixtures on children's health is a critical public health concern. It is well known that during early life, toxic exposures may impact cognitive function during critical time intervals of increased vulnerability, known as windows of susceptibility. Knowledge on time windows of susceptibility can help inform treatment and prevention strategies, as chemical mixtures may affect a developmental process that is operating at a specific life phase. There are several statistical challenges in estimating the health effects of time-varying exposures to multi-pollutant mixtures, such as: multi-collinearity among the exposures both within time points and across time points, and complex exposure-response relationships. To address these concerns, we develop a flexible statistical method, called lagged kernel machine regression (LKMR). LKMR identifies critical exposure windows of chemical mixtures, and accounts for complex non-linear and non-additive effects of the mixture at any given exposure window. Specifically, LKMR estimates how the effects of a mixture of exposures change with the exposure time window using a Bayesian formulation of a grouped, fused lasso penalty within a kernel machine regression (KMR) framework. A simulation study demonstrates the performance of LKMR under realistic exposure-response scenarios, and demonstrates large gains over approaches that consider each time window separately, particularly when serial correlation among the time-varying exposures is high. Furthermore, LKMR demonstrates gains over another approach that inputs all time-specific chemical concentrations together into a single KMR. We apply LKMR to estimate associations between neurodevelopment and metal mixtures in Early Life Exposures in Mexico and Neurotoxicology, a prospective cohort study of child health in Mexico City.
International Nuclear Information System (INIS)
Belforte, S.; Dell'Orso, M.; Donati, S.
1996-01-01
The Hit Buffer is part of the Silicon Vertex Tracker, a trigger processor dedicated to the reconstruction of particle trajectories in the Silicon Vertex Detector and the Central Tracking Chamber of the Collider Detector at Fermilab. The Hit Buffer is a high speed data-traffic node, where thousands of words are received in arbitrary order and simultaneously organized in an internal structured data base, to be later promptly retrieved and delivered in response to specific requests. The Hit Buffer is capable of processing data at a rate of 25 MHz, thanks to the use of special fast devices like Cache-Tag RAMs and high performance Erasable Programmable Logic Devices from the XILINX XC7300 family
Directory of Open Access Journals (Sweden)
Guan Lian
2018-01-01
Full Text Available Accurate prediction of taxi-out time is a significant precondition for improving the operationality of the departure process at an airport, as well as reducing the long taxi-out time, congestion, and excessive emission of greenhouse gases. Unfortunately, several of the traditional methods of predicting taxi-out time perform unsatisfactorily at congested airports. This paper describes and tests three of those conventional methods, which include the Generalized Linear Model, Softmax Regression Model, and Artificial Neural Network method, and two improved Support Vector Regression (SVR) approaches based on swarm intelligence algorithm optimization, which include Particle Swarm Optimization (PSO) and the Firefly Algorithm. In order to improve the global searching ability of the Firefly Algorithm, an adaptive step factor and Lévy flight are implemented simultaneously when updating the location function. Six factors are analysed, of which delay is identified as one significant factor in congested airports. Through a series of specific dynamic analyses, a case study of Beijing International Airport (PEK) is tested with historical data. The performance measures show that the proposed two SVR approaches, especially the Improved Firefly Algorithm (IFA) optimization-based SVR method, not only achieve the best modelling measures and accuracy rates compared with the representative forecast models, but also can achieve a better predictive performance when dealing with abnormal taxi-out time states.
Time series regression-based pairs trading in the Korean equities market
Kim, Saejoon; Heo, Jun
2017-07-01
Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define the rule which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to ones obtained by previous approaches on large capitalisation stocks in the Korean equities market.
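The fixed threshold-based rule that the paper takes as its point of departure can be sketched concretely: regress one price on its pair to obtain a hedge ratio, then open trades when the spread's z-score crosses fixed bounds. The price series and thresholds below are synthetic assumptions.

```python
import numpy as np

# Regress price A on price B to get a hedge ratio; trade when the spread's
# z-score crosses fixed thresholds (the baseline rule for pairs trading).
rng = np.random.default_rng(4)
n = 500
common = np.cumsum(rng.normal(0, 1, n))        # shared random-walk factor
a = 50 + common + rng.normal(0, 0.5, n)        # price of stock A
b = 30 + 0.8 * common + rng.normal(0, 0.5, n)  # price of its pair, stock B

X = np.column_stack([np.ones(n), b])
beta, *_ = np.linalg.lstsq(X, a, rcond=None)   # hedge ratio from regression
spread = a - X @ beta
z = (spread - spread.mean()) / spread.std()

open_short = z > 2.0   # spread rich: short A, long B
open_long = z < -2.0   # spread cheap: long A, short B
print(open_short.sum(), open_long.sum())
```

Replacing the fixed ±2 bounds with triggers derived from a time series regression of the spread is, in essence, the modification the abstract investigates.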
Predictive densities for day-ahead electricity prices using time-adaptive quantile regression
DEFF Research Database (Denmark)
Jónsson, Tryggvi; Pinson, Pierre; Madsen, Henrik
2014-01-01
A large part of the decision-making problems actors of the power system are facing on a daily basis requires scenarios for day-ahead electricity market prices. These scenarios are most likely to be generated based on marginal predictive densities for such prices, then enhanced with a temporal...... dependence structure. A semi-parametric methodology for generating such densities is presented; it includes (i) a time-adaptive quantile regression model for the 5%–95% quantiles; and (ii) a description of the distribution tails with exponential distributions. The forecasting skill of the proposed model...
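The quantile-regression building block can be sketched by minimising the pinball (check) loss directly; this static fit on synthetic data omits the time-adaptive updating and exponential tail modelling of the proposed methodology.

```python
import numpy as np
from scipy.optimize import minimize

# Linear quantile regression via the pinball (check) loss for the 5% and 95%
# quantiles; data are synthetic with heteroskedastic noise.
rng = np.random.default_rng(5)
n = 1000
x = rng.uniform(0, 1, n)
y = 10 + 5 * x + rng.normal(0, 1 + x)          # spread grows with x

def pinball(beta, q):
    resid = y - (beta[0] + beta[1] * x)
    return np.mean(np.maximum(q * resid, (q - 1) * resid))

q95 = minimize(lambda b: pinball(b, 0.95), x0=[10.0, 5.0], method="Nelder-Mead").x
q05 = minimize(lambda b: pinball(b, 0.05), x0=[10.0, 5.0], method="Nelder-Mead").x

cover = np.mean((y >= q05[0] + q05[1] * x) & (y <= q95[0] + q95[1] * x))
print(cover)  # empirical coverage of the 5%-95% band, about 0.9
```

A pair of such fits per quantile level, refreshed as new prices arrive, gives the marginal predictive density skeleton that the temporal dependence structure is then layered on.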
Failure and reliability prediction by support vector machines regression of time series data
International Nuclear Information System (INIS)
Chagas Moura, Marcio das; Zio, Enrico; Lins, Isis Didier; Droguett, Enrique
2011-01-01
Support Vector Machines (SVMs) are kernel-based learning methods, which have been successfully adopted for regression problems. However, their use in reliability applications has not been widely explored. In this paper, a comparative analysis is presented in order to evaluate the SVM effectiveness in forecasting time-to-failure and reliability of engineered components based on time series data. The performance on literature case studies of SVM regression is measured against other advanced learning methods such as the Radial Basis Function, the traditional MultiLayer Perceptron model, Box-Jenkins autoregressive-integrated-moving average and the Infinite Impulse Response Locally Recurrent Neural Networks. The comparison shows that in the analyzed cases, SVM outperforms or is comparable to other techniques. - Highlights: → Realistic modeling of reliability demands complex mathematical formulations. → SVM is proper when the relation input/output is unknown or very costly to be obtained. → Results indicate the potential of SVM for reliability time series prediction. → Reliability estimates support the establishment of adequate maintenance strategies.
Real-time prediction of respiratory motion based on local regression methods
International Nuclear Information System (INIS)
Ruan, D; Fessler, J A; Balter, J M
2007-01-01
Recent developments in modulation techniques enable conformal delivery of radiation doses to small, localized target volumes. One of the challenges in using these techniques is real-time tracking and predicting target motion, which is necessary to accommodate system latencies. For image-guided-radiotherapy systems, it is also desirable to minimize sampling rates to reduce imaging dose. This study focuses on predicting respiratory motion, which can significantly affect lung tumours. Predicting respiratory motion in real-time is challenging, due to the complexity of breathing patterns and the many sources of variability. We propose a prediction method based on local regression. There are three major ingredients of this approach: (1) forming an augmented state space to capture system dynamics, (2) local regression in the augmented space to train the predictor from previous observation data using semi-periodicity of respiratory motion, (3) local weighting adjustment to incorporate fading temporal correlations. To evaluate prediction accuracy, we computed the root mean square error between predicted tumor motion and its observed location for ten patients. For comparison, we also investigated commonly used predictive methods, namely linear prediction, neural networks and Kalman filtering to the same data. The proposed method reduced the prediction error for all imaging rates and latency lengths, particularly for long prediction lengths
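Ingredients (1) and (2) can be sketched as delay embedding plus distance-weighted local linear regression; this is a simplified illustration on a synthetic quasi-periodic trace, not the authors' implementation, and the embedding dimension, horizon, and weighting are assumptions.

```python
import numpy as np

# Delay-embed a quasi-periodic trace into an augmented state space, then
# predict "horizon" steps ahead with distance-weighted local linear regression.
rng = np.random.default_rng(6)
t = np.arange(0, 60, 0.2)
sig = np.sin(2 * np.pi * t / 4.0) + 0.05 * rng.normal(size=t.size)

d, horizon = 3, 2                        # embedding dimension, steps ahead
m = len(sig) - d - horizon + 1           # number of historical states
states = np.column_stack([sig[i:m + i] for i in range(d)])
targets = sig[d + horizon - 1:]          # value "horizon" steps after each state

def predict(query, k=20):
    """Weighted least-squares fit over the k nearest historical states."""
    dist = np.linalg.norm(states - query, axis=1)
    idx = np.argsort(dist)[:k]
    w = 1.0 / (dist[idx] + 1e-6)          # closer states get larger weight
    Xl = np.column_stack([np.ones(k), states[idx]])
    beta = np.linalg.lstsq(Xl * w[:, None], targets[idx] * w, rcond=None)[0]
    return np.concatenate([[1.0], query]) @ beta

print(predict(sig[-d:]))                  # forecast from the current state
```

Because breathing is only semi-periodic, the weighting step matters: states from similar phases of earlier cycles dominate the fit, while dissimilar history is effectively ignored.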
Prediction of hourly PM2.5 using a space-time support vector regression model
Yang, Wentao; Deng, Min; Xu, Feng; Wang, Hang
2018-05-01
Real-time air quality prediction has been an active field of research in atmospheric environmental science. The existing methods of machine learning are widely used to predict pollutant concentrations because of their enhanced ability to handle complex non-linear relationships. However, because pollutant concentration data, as typical geospatial data, also exhibit spatial heterogeneity and spatial dependence, they may violate the assumptions of independent and identically distributed random variables in most of the machine learning methods. As a result, a space-time support vector regression model is proposed to predict hourly PM2.5 concentrations. First, to address spatial heterogeneity, spatial clustering is executed to divide the study area into several homogeneous or quasi-homogeneous subareas. To handle spatial dependence, a Gauss vector weight function is then developed to determine spatial autocorrelation variables as part of the input features. Finally, a local support vector regression model with spatial autocorrelation variables is established for each subarea. Experimental data on PM2.5 concentrations in Beijing are used to verify whether the results of the proposed model are superior to those of other methods.
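The weighting idea (concurrent readings at nearby stations entering the feature set as spatial-autocorrelation variables) can be illustrated as follows; the bandwidth, coordinates, and readings are made-up values, and this Gaussian distance weight is only a stand-in for the paper's Gauss vector weight function:

```python
import numpy as np

def gauss_weighted_neighbors(coords, values, target_idx, bandwidth=10.0):
    """Gaussian-distance-weighted average of neighbouring stations'
    concurrent readings, used as a spatial-autocorrelation input feature."""
    d = np.linalg.norm(coords - coords[target_idx], axis=1)
    w = np.exp(-(d ** 2) / (2 * bandwidth ** 2))
    w[target_idx] = 0.0                      # exclude the station itself
    return float(np.sum(w * values) / np.sum(w))

# hypothetical station coordinates (km) and concurrent PM2.5 readings (ug/m3)
coords = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 8.0], [20.0, 20.0]])
pm25   = np.array([80.0, 90.0, 70.0, 30.0])
feature = gauss_weighted_neighbors(coords, pm25, target_idx=0)
```

Closer stations dominate the weighted average, so the distant low reading barely shifts the feature; such features would then be appended to the input vector of the local support vector regression in each subarea.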
Time-trend of melanoma screening practice by primary care physicians: a meta-regression analysis.
Valachis, Antonis; Mauri, Davide; Karampoiki, Vassiliki; Polyzos, Nikolaos P; Cortinovis, Ivan; Koukourakis, Georgios; Zacharias, Georgios; Xilomenos, Apostolos; Tsappi, Maria; Casazza, Giovanni
2009-01-01
To assess whether the proportion of primary care physicians implementing full body skin examination (FBSE) to screen for melanoma changed over time. Meta-regression analyses of available data. MEDLINE, ISI, Cochrane Central Register of Controlled Trials. Fifteen studies surveying 10,336 physicians were included in the analyses. Overall, 15%-82% of them reported performing FBSE to screen for melanoma. The proportion of physicians using FBSE screening tended to decrease by 1.72% per year (P =0.086). Corresponding annual changes in European, North American, and Australian settings were -0.68% (P =0.494), -2.02% (P =0.044), and +2.59% (P =0.010), respectively. Changes were not influenced by national guidelines. Considering the increasing incidence of melanoma and other skin malignancies, as well as their relative potential consequences, the FBSE implementation time-trend we retrieved should be considered a worrisome phenomenon.
International Nuclear Information System (INIS)
Bogdan, Mircea; Frisch, Henry; Heintz, Mary; Paramonov, Alexander; Sanders, Harold; Chappa, Steve; DeMaat, Robert; Klein, Rod; Miao, Ting; Wilson, Peter; Phillips, Thomas J.
2005-01-01
We describe a field-programmable gate array (FPGA) based, 96-channel, time-to-digital converter (TDC) and trigger logic board intended for use with the Central Outer Tracker (COT) [T. Affolder et al., Nucl. Instr. and Meth. A 526 (2004) 249] in the CDF Experiment [The CDF-II detector is described in the CDF Technical Design Report (TDR), FERMILAB-Pub-96/390-E. The TDC described here is intended as a further upgrade beyond that described in the TDR] at the Fermilab Tevatron. The COT system is digitized and read out by 315 TDC cards, each serving 96 wires of the chamber. The TDC is physically configured as a 9U VME card. The functionality is almost entirely programmed in firmware in two Altera Stratix FPGAs. The special capabilities of this device are the availability of 840MHz LVDS inputs, multiple phase-locked clock modules, and abundant memory. The TDC system operates with an input resolution of 1.2ns, a minimum input pulse width of 4.8ns and a minimum separation of 4.8ns between pulses. Each input can accept up to 7 hits per collision. The time-to-digital conversion is done by first sampling each of the 96 inputs in 1.2-ns bins and filling a circular memory; the memory addresses of logical transitions (edges) in the input data are then translated into the time of arrival and width of the COT pulses. Memory pipelines with a depth of 5.5μs allow deadtime-less operation in the first-level trigger; the data are multiple-buffered to diminish deadtime in the second-level trigger. The complete process of edge-detection and filling of buffers for readout takes 12μs. The TDC VME interface allows a 64-bit Chain Block Transfer of multiple boards in a crate with transfer rates up to 47Mbytes/s. The TDC module also produces prompt trigger data every Tevatron crossing via a deadtimeless fast logic path that can be easily reprogrammed. The trigger bits are clocked onto the P3 VME backplane connector with a 22-ns clock for transmission to the trigger. The full TDC design and
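The edge-detection step described above (sampling each input in 1.2-ns bins, then translating logical transitions into arrival time and width) can be modelled in software roughly as follows; this is a simplified sketch, not the actual Altera firmware logic:

```python
BIN_NS = 1.2  # sampling bin width from the abstract

def edges_to_hits(samples, bin_ns=BIN_NS):
    """Translate a sequence of 0/1 samples (one per 1.2 ns bin) into
    (arrival_time_ns, width_ns) pairs by detecting rising/falling edges."""
    hits, start, prev = [], None, 0
    for i, s in enumerate(samples):
        if s and not prev:        # rising edge: pulse begins
            start = i
        elif prev and not s:      # falling edge: pulse ends
            hits.append((start * bin_ns, (i - start) * bin_ns))
            start = None
        prev = s
    return hits

# a 4-bin pulse starting at bin 3 and a 5-bin pulse starting at bin 10
samples = [0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 0]
hits = edges_to_hits(samples)
# hits ≈ [(3.6 ns, 4.8 ns), (12.0 ns, 6.0 ns)]
```

Note that the shortest representable pulse here is 4 bins, i.e. 4.8 ns, consistent with the minimum pulse width quoted in the abstract.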
Febrian Umbara, Rian; Tarwidi, Dede; Budi Setiawan, Erwin
2018-03-01
The paper discusses the prediction of the Jakarta Composite Index (JCI) in the Indonesia Stock Exchange. The study is based on JCI historical data for 1286 days to predict the value of JCI one day ahead. This paper proposes a two-stage prediction: the first stage uses Fuzzy Time Series (FTS) to predict the values of ten technical indicators, and the second stage uses Support Vector Regression (SVR) to predict the value of JCI one day ahead, resulting in a hybrid FTS-SVR prediction model. The performance of this combined prediction model is compared with the performance of the single-stage prediction model using SVR only. Ten technical indicators are used as input for each model.
Duda, Piotr; Jaworski, Maciej; Rutkowski, Leszek
2018-03-01
One of the greatest challenges in data mining is related to processing and analysis of massive data streams. Contrary to traditional static data mining problems, data streams require that each element is processed only once, the amount of allocated memory is constant and the models incorporate changes of the investigated streams. A vast majority of available methods have been developed for data stream classification and only a few of them have attempted to solve regression problems, using various heuristic approaches. In this paper, we develop mathematically justified regression models working in a time-varying environment. More specifically, we study incremental versions of generalized regression neural networks, called IGRNNs, and we prove their tracking properties - weak (in probability) and strong (with probability one) convergence assuming various concept drift scenarios. First, we present the IGRNNs, based on the Parzen kernels, for modeling stationary systems under nonstationary noise. Next, we extend our approach to modeling time-varying systems under nonstationary noise. We present several types of concept drifts to be handled by our approach in such a way that weak and strong convergence holds under certain conditions. Finally, in a series of simulations, we compare our method with commonly used heuristic approaches, based on forgetting mechanisms or sliding windows, to deal with concept drift. We then apply our concept in a real-life scenario, solving the problem of currency exchange rate prediction.
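A minimal sketch of the underlying estimator (a Parzen-kernel Nadaraya-Watson regression updated incrementally, here with a simple forgetting factor standing in for the paper's concept-drift machinery) might look like this; the bandwidth and forgetting values are illustrative:

```python
import numpy as np

class IncrementalGRNN:
    """Toy incremental generalized regression neural network: a Parzen-kernel
    (Nadaraya-Watson) estimator whose stored samples fade over time."""
    def __init__(self, bandwidth=0.05, forget=0.995):
        self.h, self.forget = bandwidth, forget
        self.xs, self.ys, self.ws = [], [], []

    def update(self, x, y):
        self.ws = [w * self.forget for w in self.ws]   # fade old samples
        self.xs.append(x); self.ys.append(y); self.ws.append(1.0)

    def predict(self, x):
        xs, ys, ws = map(np.array, (self.xs, self.ys, self.ws))
        k = ws * np.exp(-0.5 * ((xs - x) / self.h) ** 2)   # Gaussian Parzen kernel
        return float(np.sum(k * ys) / np.sum(k))

model = IncrementalGRNN()
rng = np.random.default_rng(0)
for _ in range(500):                       # stream of noisy (x, y) pairs
    x = rng.uniform(0, 1)
    model.update(x, np.sin(2 * np.pi * x) + 0.05 * rng.normal())
est = model.predict(0.25)                  # true regression value is sin(pi/2) = 1
```

The paper's IGRNNs are recursive estimators with proved convergence guarantees; this sketch only conveys the kernel-regression core and the idea of downweighting stale stream elements.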
75 FR 21629 - HIT Standards Committee's Workgroup Meetings; Notice of Meetings
2010-04-26
... Technology; HIT Standards Committee's Workgroup Meetings; Notice of Meetings AGENCY: Office of the National... only. Name of Committees: HIT Standards Committee's Workgroups: Clinical Operations Vocabulary... developed by the HIT Policy Committee. Date and Time: The HIT Standards Committee Workgroups will hold the...
76 FR 46297 - HIT Standards Committee's Workgroup Meetings; Notice of Meetings
2011-08-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Standards Committee's... developed by the HIT Policy Committee. Date and Time: The HIT Standards Committee Workgroups will hold the...
Directory of Open Access Journals (Sweden)
Irene Fernández Monsalve
2014-08-01
During language comprehension, semantic contextual information is used to generate expectations about upcoming items. This has been commonly studied through the N400 event-related potential (ERP), as a measure of facilitated lexical retrieval. However, the associative relationships in multi-word expressions (MWE) may enable the generation of a categorical expectation, leading to lexical retrieval before target word onset. Processing of the target word would thus reflect a target-identification mechanism, possibly indexed by a P3 ERP component. However, given their time overlap (200-500 ms post-stimulus onset), differentiating between N400/P3 ERP responses (averaged over multiple linguistically variable trials) is problematic. In the present study, we analyzed EEG data from a previous experiment, which compared ERP responses to highly expected words that were placed either in a MWE or a regular non-fixed compositional context, and to low-predictability controls. We focused on oscillatory dynamics and regression analyses, in order to dissociate between the two contexts by modeling the electrophysiological response as a function of item-level parameters. A significant interaction between word position and condition was found in the regression model for power in a theta range (~7-9 Hz), providing evidence for the presence of qualitative differences between conditions. Power levels within this band were lower for MWE than compositional contexts when the target word appeared later on in the sentence, confirming that in the former lexical retrieval would have taken place before word onset. On the other hand, gamma-power (~50-70 Hz) was also modulated by predictability of the item in all conditions, which is interpreted as an index of a similar `matching' sub-step for both types of contexts, binding an expected representation and the external input.
Directory of Open Access Journals (Sweden)
Giuseppe Perinetti
2016-01-01
The knowledge of the associations between the timing of skeletal maturation and craniofacial growth is of primary importance when planning a functional treatment for most of the skeletal malocclusions. This cross-sectional study was thus aimed at evaluating whether sagittal and vertical craniofacial growth has an association with the timing of circumpubertal skeletal maturation. A total of 320 subjects (160 females and 160 males) were included in the study (mean age, 12.3±1.7 years; range, 7.6–16.7 years). These subjects were equally distributed in the circumpubertal cervical vertebral maturation (CVM) stages 2 to 5. Each CVM stage group also had an equal number of females and males. Multiple regression models were run for each CVM stage group to assess the significance of the association of cephalometric parameters (ANB, SN/MP, and NSBa angles) with age of attainment of the corresponding CVM stage (in months). Significant associations were seen only for stage 3, where the SN/MP angle was negatively associated with age (β coefficient, −0.7). These results show that hyperdivergent and hypodivergent subjects may have an anticipated and delayed attainment of the pubertal CVM stage 3, respectively. However, such association remains of little entity and it would become clinically relevant only in extreme cases.
Non-linear auto-regressive models for cross-frequency coupling in neural time series
Tallot, Lucille; Grabot, Laetitia; Doyère, Valérie; Grenier, Yves; Gramfort, Alexandre
2017-01-01
We address the issue of reliably detecting and quantifying cross-frequency coupling (CFC) in neural time series. Based on non-linear auto-regressive models, the proposed method provides a generative and parametric model of the time-varying spectral content of the signals. As this method models the entire spectrum simultaneously, it avoids the pitfalls related to incorrect filtering or the use of the Hilbert transform on wide-band signals. As the model is probabilistic, it also provides a score of the model “goodness of fit” via the likelihood, enabling easy and legitimate model selection and parameter comparison; this data-driven feature is unique to our model-based approach. Using three datasets obtained with invasive neurophysiological recordings in humans and rodents, we demonstrate that these models are able to replicate previous results obtained with other metrics, but also reveal new insights such as the influence of the amplitude of the slow oscillation. Using simulations, we demonstrate that our parametric method can reveal neural couplings with shorter signals than non-parametric methods. We also show how the likelihood can be used to find optimal filtering parameters, suggesting new properties on the spectrum of the driving signal, but also to estimate the optimal delay between the coupled signals, enabling a directionality estimation in the coupling. PMID:29227989
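A simplified instance of such a model is a driven autoregressive fit, where the AR coefficients are allowed to vary with a slow driver; the model order, polynomial degree, and simulated signals below are assumptions for illustration, and the paper's likelihood-based model selection is replaced by a plain residual comparison:

```python
import numpy as np

def fit_driven_ar(y, x, p=2, deg=1):
    """Least-squares fit of a driven AR model:
    y[t] ≈ sum over i=1..p, k=0..deg of a_ik * x[t]**k * y[t-i],
    i.e. AR coefficients that are polynomials of a slow driver x."""
    n = len(y)
    cols = [(x[p:n] ** k) * y[p - i : n - i]
            for i in range(1, p + 1) for k in range(deg + 1)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y[p:n], rcond=None)
    resid = y[p:n] - A @ coef
    return coef, float(np.mean(resid ** 2))

rng = np.random.default_rng(1)
n = 2000
x = np.sin(2 * np.pi * np.arange(n) / 200)     # slow driving oscillation
y = np.zeros(n)
for t in range(2, n):                          # fast oscillation whose dynamics
    a1 = 1.6 + 0.2 * x[t]                      # are modulated by the driver
    y[t] = a1 * y[t - 1] - 0.9 * y[t - 2] + 0.1 * rng.normal()

coef_driven, mse_driven = fit_driven_ar(y, x, p=2, deg=1)
coef_plain,  mse_plain  = fit_driven_ar(y, x, p=2, deg=0)
```

Because the deg=0 model is nested in the deg=1 model, a markedly lower residual for the driven fit is evidence of coupling between the slow driver and the fast oscillation's dynamics, which is the intuition behind the CFC detection described above.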
A joint logistic regression and covariate-adjusted continuous-time Markov chain model.
Rubin, Maria Laura; Chan, Wenyaw; Yamal, Jose-Miguel; Robertson, Claudia Sue
2017-12-10
The use of longitudinal measurements to predict a categorical outcome is an increasingly common goal in research studies. Joint models are commonly used to describe two or more models simultaneously by considering the correlated nature of their outcomes and the random error present in the longitudinal measurements. However, there is limited research on joint models with longitudinal predictors and categorical cross-sectional outcomes. Perhaps the most challenging task is how to model the longitudinal predictor process such that it represents the true biological mechanism that dictates the association with the categorical response. We propose a joint logistic regression and Markov chain model to describe a binary cross-sectional response, where the unobserved transition rates of a two-state continuous-time Markov chain are included as covariates. We use the method of maximum likelihood to estimate the parameters of our model. In a simulation study, coverage probabilities of about 95%, standard deviations close to standard errors, and low biases for the parameter values show that our estimation method is adequate. We apply the proposed joint model to a dataset of patients with traumatic brain injury to describe and predict a 6-month outcome based on physiological data collected post-injury and admission characteristics. Our analysis indicates that the information provided by physiological changes over time may help improve prediction of long-term functional status of these severely ill subjects. Copyright © 2017 John Wiley & Sons, Ltd.
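For reference, the transition probabilities of the two-state continuous-time Markov chain underlying such a model have a standard closed form, P(t) = exp(Qt); the rates below are arbitrary illustrative values, not estimates from the study:

```python
import math

def two_state_ctmc_P(lam, mu, t):
    """Transition probability matrix P(t) = exp(Qt) for a two-state CTMC
    with rate lam for 0 -> 1 and rate mu for 1 -> 0, via the closed form
    p00(t) = (mu + lam * e^{-(lam+mu)t}) / (lam + mu)."""
    s = lam + mu
    e = math.exp(-s * t)
    p00 = (mu + lam * e) / s
    p11 = (lam + mu * e) / s
    return [[p00, 1.0 - p00], [1.0 - p11, p11]]

P = two_state_ctmc_P(lam=0.5, mu=1.5, t=2.0)
```

Each row of P(t) sums to one, and as t grows the rows converge to the stationary distribution (mu, lam)/(lam + mu); in the joint model above, the unobserved rates playing the role of lam and mu enter the logistic regression as covariates.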
Predicting the "graduate on time (GOT)" of PhD students using binary logistics regression model
Shariff, S. Sarifah Radiah; Rodzi, Nur Atiqah Mohd; Rahman, Kahartini Abdul; Zahari, Siti Meriam; Deni, Sayang Mohd
2016-10-01
The Malaysian government has recently set a new goal of producing 60,000 Malaysian PhD holders by the year 2023. As Malaysia's largest institution of higher learning in terms of size and population, offering more than 500 academic programmes in a conducive and vibrant environment, UiTM has taken several initiatives to fill the gap. Strategies to increase the number of PhD graduates are a challenging process. On many occasions, many have already identified that the struggle to reach the target is even more daunting, and that implementation is far too idealistic. Progress has been further slowed as the attrition rate increases. This study aims to apply the proposed models, which incorporate several factors, in predicting the number of PhD students that will complete their PhD studies on time. A Binary Logistic Regression model is proposed and used on the data set to determine this number. The results show that only 6.8% of the 2014 PhD students are predicted to graduate on time, and the results are compared with the actual number for validation purposes.
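A from-scratch sketch of binary logistic regression of this kind might look as follows; the predictors, data, and coefficients are entirely hypothetical, since the study's actual factors are not listed in this abstract:

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, iters=3000):
    """Binary logistic regression fitted by gradient descent:
    P(graduate on time) = sigmoid(b0 + b . x)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])      # prepend intercept column
    beta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))       # predicted probabilities
        beta -= lr * Xb.T @ (p - y) / len(y)       # average log-loss gradient
    return beta

# hypothetical features: [full-time enrolment flag, progress-report score]
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(300, 2))
true_logit = -2.0 + 1.5 * X[:, 0] + 2.5 * X[:, 1]  # invented "true" model
y = (rng.uniform(size=300) < 1 / (1 + np.exp(-true_logit))).astype(float)

beta = fit_logistic(X, y)
# predicted on-time probability for a new student with features (1.0, 0.8)
p_new = 1 / (1 + np.exp(-(beta[0] + beta[1] * 1.0 + beta[2] * 0.8)))
```

Summing such predicted probabilities (or thresholding them) over a cohort yields the kind of predicted graduate-on-time count reported in the study.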
Rosso, Luigi; Riatti, Riccardo
2016-01-01
The knowledge of the associations between the timing of skeletal maturation and craniofacial growth is of primary importance when planning a functional treatment for most of the skeletal malocclusions. This cross-sectional study was thus aimed at evaluating whether sagittal and vertical craniofacial growth has an association with the timing of circumpubertal skeletal maturation. A total of 320 subjects (160 females and 160 males) were included in the study (mean age, 12.3 ± 1.7 years; range, 7.6–16.7 years). These subjects were equally distributed in the circumpubertal cervical vertebral maturation (CVM) stages 2 to 5. Each CVM stage group also had equal number of females and males. Multiple regression models were run for each CVM stage group to assess the significance of the association of cephalometric parameters (ANB, SN/MP, and NSBa angles) with age of attainment of the corresponding CVM stage (in months). Significant associations were seen only for stage 3, where the SN/MP angle was negatively associated with age (β coefficient, −0.7). These results show that hyperdivergent and hypodivergent subjects may have an anticipated and delayed attainment of the pubertal CVM stage 3, respectively. However, such association remains of little entity and it would become clinically relevant only in extreme cases. PMID:27995136
Near Real-Time Dust Aerosol Detection with Support Vector Machines for Regression
Rivas-Perea, P.; Rivas-Perea, P. E.; Cota-Ruiz, J.; Aragon Franco, R. A.
2015-12-01
Remote sensing instruments operating in the near-infrared spectrum usually provide the necessary information for further dust aerosol spectral analysis using statistical or machine learning algorithms. Such algorithms have proven to be effective in analyzing very specific case studies or dust events. However, very few make the analysis open to the public on a regular basis, fewer are designed specifically to operate in near real-time at higher resolutions, and almost none give global daily coverage. In this research we investigated a large-scale approach to a machine learning algorithm called "support vector regression". The algorithm uses four near-infrared spectral bands from the NASA MODIS instrument: B20 (3.66-3.84μm), B29 (8.40-8.70μm), B31 (10.78-11.28μm), and B32 (11.77-12.27μm). The algorithm is presented with ground truth from more than 30 distinct reported dust events, from different geographical regions, at different seasons, both over land and sea cover, in the presence of clouds and clear sky, and in the presence of fires. The purpose of our algorithm is to learn to distinguish the dust aerosol spectral signature from other spectral signatures, providing as output an estimate of the probability of a data point being consistent with dust aerosol signatures. During modeling with ground truth, our algorithm achieved more than 90% accuracy, and the current live performance of the algorithm is remarkable. Moreover, our algorithm is currently operating in near real-time using NASA's Land, Atmosphere Near real-time Capability for EOS (LANCE) servers, providing a high-resolution global overview at 64, 32, 16, 8, 4, 2, and 1 km. The near real-time analysis of our algorithm is now available to the general public at http://dust.reev.us and archives of the results starting from 2012 are available upon request.
Statistical properties and pre-hit dynamics of price limit hits in the Chinese stock markets.
Wan, Yu-Lei; Xie, Wen-Jie; Gu, Gao-Feng; Jiang, Zhi-Qiang; Chen, Wei; Xiong, Xiong; Zhang, Wei; Zhou, Wei-Xing
2015-01-01
Price limit trading rules are adopted in some stock markets (especially emerging markets) trying to cool off traders' short-term trading mania on individual stocks and increase market efficiency. Under such a microstructure, stocks may hit their up-limits and down-limits from time to time. However, the behaviors of price limit hits are not well studied partially due to the fact that main stock markets such as the US markets and most European markets do not set price limits. Here, we perform detailed analyses of the high-frequency data of all A-share common stocks traded on the Shanghai Stock Exchange and the Shenzhen Stock Exchange from 2000 to 2011 to investigate the statistical properties of price limit hits and the dynamical evolution of several important financial variables before stock price hits its limits. We compare the properties of up-limit hits and down-limit hits. We also divide the whole period into three bullish periods and three bearish periods to unveil possible differences during bullish and bearish market states. To uncover the impacts of stock capitalization on price limit hits, we partition all stocks into six portfolios according to their capitalizations on different trading days. We find that the price limit trading rule has a cooling-off effect (as opposed to the magnet effect), indicating that the rule takes effect in the Chinese stock markets. We find that price continuation is much more likely to occur than price reversal on the next trading day after a limit-hitting day, especially for down-limit hits, which has potential practical values for market practitioners.
A regressive methodology for estimating missing data in rainfall daily time series
Barca, E.; Passarella, G.
2009-04-01
The "presence" of gaps in environmental data time series represents a very common but extremely critical problem, since it can produce biased results (Rubin, 1976). Missing data plague almost all surveys. The problem is how to deal with missing data once it has been deemed impossible to recover the actual missing values. Apart from the amount of missing data, another issue which plays an important role in the choice of any recovery approach is the evaluation of the "missingness" mechanism. When missingness is conditioned on some other variable observed in the data set (Schafer, 1997), the mechanism is called MAR (Missing At Random). Otherwise, when the missingness mechanism depends on the actual value of the missing data, it is called NMAR (Not Missing At Random). This last mechanism is the most difficult to model. In the last decade, interest arose in the estimation of missing data by using regression (single imputation). More recently, multiple imputation has also become available, which returns a distribution of estimated values (Scheffer, 2002). In this paper an automatic methodology for estimating missing data is presented. In practice, given a gauging station affected by missing data (the target station), the methodology checks the randomness of the missing data and classifies the "similarity" between the target station and the other gauging stations spread over the study area. Among the different methods useful for defining the degree of similarity, whose effectiveness strongly depends on the data distribution, the Spearman correlation coefficient was chosen. Once the similarity matrix has been defined, a suitable nonparametric, univariate, regressive method was applied in order to estimate missing data in the target station: the Theil method (Theil, 1950). Even though the methodology proved to be rather reliable, the missing data estimation can be improved by a generalization. A first possible improvement consists in extending the univariate technique to
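The Theil method at the core of this approach (slope = median of all pairwise slopes, intercept = median residual) is easy to sketch; the station readings below are made-up values:

```python
import numpy as np

def theil_fit(x, y):
    """Theil's nonparametric regression: the slope is the median of all
    pairwise slopes, the intercept the median of y - slope * x."""
    n = len(x)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(n) for j in range(i + 1, n) if x[j] != x[i]]
    slope = float(np.median(slopes))
    intercept = float(np.median(y - slope * x))
    return slope, intercept

# daily rainfall (mm) at a "similar" station (x) vs the target station (y)
x = np.array([0.0, 2.0, 5.0, 7.0, 10.0, 12.0, 20.0])
y = np.array([0.5, 2.4, 5.3, 7.6, 10.4, 12.7, 19.9])
slope, intercept = theil_fit(x, y)
# impute the target value on a gap day when the similar station recorded 15 mm
estimate = slope * 15.0 + intercept
```

Being median-based, the fit is robust to the occasional outlying rain gauge reading, which is one reason a nonparametric estimator suits this imputation task.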
Hirsch, Robert M.; Moyer, Douglas; Archfield, Stacey A.
2010-01-01
A new approach to the analysis of long-term surface water-quality data is proposed and implemented. The goal of this approach is to increase the amount of information that is extracted from the types of rich water-quality datasets that now exist. The method is formulated to allow for maximum flexibility in representations of the long-term trend, seasonal components, and discharge-related components of the behavior of the water-quality variable of interest. It is designed to provide internally consistent estimates of the actual history of concentrations and fluxes as well as histories that eliminate the influence of year-to-year variations in streamflow. The method employs the use of weighted regressions of concentrations on time, discharge, and season. Finally, the method is designed to be useful as a diagnostic tool regarding the kinds of changes that are taking place in the watershed related to point sources, groundwater sources, and surface-water nonpoint sources. The method is applied to datasets for the nine large tributaries of Chesapeake Bay from 1978 to 2008. The results show a wide range of patterns of change in total phosphorus and in dissolved nitrate plus nitrite. These results should prove useful in further examination of the causes of changes, or lack of changes, and may help inform decisions about future actions to reduce nutrient enrichment in the Chesapeake Bay and its watershed.
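The weighted-regression core of this method can be sketched as a tricube-weighted least squares of ln(concentration) on time, discharge, and season, evaluated locally at a point of interest; the window half-widths and synthetic data below are illustrative assumptions, not the paper's calibrated choices:

```python
import numpy as np

def wrtds_estimate(t, lnq, lnc, t0, lnq0, half_widths=(7.0, 2.0, 0.5)):
    """Tricube-weighted regression of ln(concentration) on time, ln(discharge)
    and season, evaluated at (t0, lnq0).  t is in decimal years."""
    def tricube(d, h):
        d = np.abs(d) / h
        return np.where(d < 1, (1 - d ** 3) ** 3, 0.0)

    def season_dist(a, b):                     # circular distance in years
        d = np.abs(a - b) % 1.0
        return np.minimum(d, 1.0 - d)

    w = (tricube(t - t0, half_widths[0])       # closeness in time,
         * tricube(lnq - lnq0, half_widths[1]) # in discharge,
         * tricube(season_dist(t, t0), half_widths[2]))  # and in season
    X = np.column_stack([np.ones_like(t), t, lnq,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], sw * lnc, rcond=None)
    x0 = np.array([1.0, t0, lnq0, np.sin(2 * np.pi * t0), np.cos(2 * np.pi * t0)])
    return float(x0 @ beta)

# synthetic record: slow downtrend, discharge dependence, seasonal cycle
rng = np.random.default_rng(3)
t = np.sort(rng.uniform(1978, 2008, 600))      # sample dates, decimal years
lnq = rng.normal(0, 1, 600)                    # log discharge
lnc = (0.5 - 0.01 * (t - 1978) + 0.3 * lnq
       + 0.2 * np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=600))
est = wrtds_estimate(t, lnq, lnc, t0=2000.0, lnq0=0.0)
```

Repeating the local fit over a grid of times and discharges is what produces the internally consistent concentration and flux histories described above, including the flow-normalized versions that remove year-to-year streamflow variation.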
Time series linear regression of half-hourly radon levels in a residence
International Nuclear Information System (INIS)
Hull, D.A.
1990-01-01
This paper uses time series linear regression modelling to assess the impact of temperature and pressure differences on the radon measured in the basement and in the basement drain of a research house in the Princeton area of New Jersey. The models examine half-hour averages of several climate and house parameters for several periods of up to 11 days. The drain radon concentrations follow a strong diurnal pattern that shifts 12 hours in phase between the summer and the fall seasons. This shift can be linked both to the change in temperature differences between seasons and to an experiment which involved sealing the connection between the drain and the basement. We have found that both the basement and the drain radon concentrations are correlated to basement-outdoor and soil-outdoor temperature differences (the coefficient of determination varies between 0.6 and 0.8). The statistical models for the summer periods clearly describe a physical system where the basement drain pumps radon in during the night and sucks radon out during the day.
Time series regression and ARIMAX for forecasting currency flow at Bank Indonesia in Sulawesi region
Suharsono, Agus; Suhartono, Masyitha, Aulia; Anuravega, Arum
2015-12-01
The purpose of the study is to forecast the outflow and inflow of currency at the Indonesian central bank, Bank Indonesia (BI), in the Sulawesi Region. The currency outflow and inflow data tend to have a trend pattern which is influenced by calendar variation effects. Therefore, this research focuses on applying forecasting methods that can handle calendar variation effects, i.e. Time Series Regression (TSR) and ARIMAX models, and comparing their forecast accuracy with an ARIMA model. The best model is selected based on the lowest Root Mean Square Error (RMSE) on the out-sample dataset. The results show that ARIMA is the best model for forecasting the currency outflow and inflow at South Sulawesi, whereas the best model for forecasting the currency outflow at Central Sulawesi and Southeast Sulawesi, and for forecasting the currency inflow at South Sulawesi and North Sulawesi, is TSR. Additionally, ARIMAX is the best model for forecasting the currency outflow at North Sulawesi. Hence, the results show that more complex models do not necessarily yield more accurate forecasts than simpler ones.
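A bare-bones time series regression with trend, monthly dummies, and a moving-holiday (calendar variation) dummy can be sketched as follows; the data, the size of the Eid effect, and the dummy placement are invented for illustration:

```python
import numpy as np

def tsr_forecast(y, month, eid, month_next, eid_next):
    """Fit y on [linear trend, 12 month dummies, moving-holiday dummy]
    by least squares and return the one-step-ahead forecast."""
    n = len(y)
    t = np.arange(n)
    D = np.zeros((n, 12))
    D[np.arange(n), month - 1] = 1.0               # monthly seasonal dummies
    X = np.column_stack([t, D, eid])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    x_next = np.concatenate([[float(n)], np.eye(12)[month_next - 1], [eid_next]])
    return float(x_next @ beta)

# five years of monthly "outflow": trend + seasonality + a holiday spike
# that falls in a different month in some years (the calendar variation)
month = np.tile(np.arange(1, 13), 5)
eid = np.zeros(60)
eid[[8, 20, 31, 43, 55]] = 1.0                     # moving holiday months
rng = np.random.default_rng(4)
y = (100 + 0.5 * np.arange(60) + 10 * np.sin(2 * np.pi * month / 12)
     + 25 * eid + rng.normal(0, 1, 60))

forecast = tsr_forecast(y, month, eid, month_next=1, eid_next=0.0)
```

Because the holiday does not always fall in the same month, the holiday dummy is not collinear with the monthly dummies, which is precisely why a plain seasonal model underfits such series and a calendar-variation regressor helps.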
Hu, L; Zhang, Z G; Mouraux, A; Iannetti, G D
2015-05-01
Transient sensory, motor or cognitive events elicit not only phase-locked event-related potentials (ERPs) in the ongoing electroencephalogram (EEG), but also induce non-phase-locked modulations of ongoing EEG oscillations. These modulations can be detected when single-trial waveforms are analysed in the time-frequency domain, and consist of stimulus-induced decreases (event-related desynchronization, ERD) or increases (event-related synchronization, ERS) of synchrony in the activity of the underlying neuronal populations. ERD and ERS reflect changes in the parameters that control oscillations in neuronal networks and, depending on the frequency at which they occur, represent neuronal mechanisms involved in cortical activation, inhibition and binding. ERD and ERS are commonly estimated by averaging the time-frequency decomposition of single trials. However, their trial-to-trial variability, which can reflect physiologically important information, is lost by across-trial averaging. Here, we aim to (1) develop novel approaches to explore single-trial parameters (including latency, frequency and magnitude) of ERP/ERD/ERS; (2) disclose the relationship between estimated single-trial parameters and other experimental factors (e.g., perceived intensity). We found that (1) stimulus-elicited ERP/ERD/ERS can be correctly separated using principal component analysis (PCA) decomposition with Varimax rotation on the single-trial time-frequency distributions; (2) time-frequency multiple linear regression with dispersion term (TF-MLRd) enhances the signal-to-noise ratio of ERP/ERD/ERS in single trials, and provides an unbiased estimation of their latency, frequency, and magnitude at the single-trial level; (3) these estimates can be meaningfully correlated with each other and with other experimental factors at the single-trial level (e.g., perceived stimulus intensity and ERP magnitude). The methods described in this article allow exploring fully non-phase-locked stimulus-induced cortical
High energy ion hit technique to local area using microbeam
Energy Technology Data Exchange (ETDEWEB)
Tanaka, Ryuichi; Kamiya, Tomihiro; Suda, Tamotsu; Sakai, Takuro; Hirao, Toshio; Kobayashi, Yasuhiko; Watanabe, Hiroshi [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment
1997-03-01
A single energetic ion hit technique has been developed as an application of the ion microbeam technique, in order to study the effect of local damage or injury to materials and living organisms. The overall performance is basically defined by those of the separate constituent techniques: microbeam formation, microbeam positioning, single ion detection, detection signal processing, hit timing control, and hit verification. Recent progress in the development of these techniques at the JAERI-TIARA facility is reviewed. (author)
Prahutama, Alan; Suparti; Wahyu Utami, Tiani
2018-03-01
Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression is very strict in its assumptions, whereas a nonparametric regression model does not require assumptions about the model form. Time series data are observations of a variable recorded over time, so if time series data are to be modeled by regression, the response and predictor variables must first be determined. In time series, the response variable is the value at time t (yt), while the predictor variables are the significant lags. One developing approach in nonparametric regression modeling is the Fourier series approach. One of its advantages is the ability to handle data having a trigonometric pattern. Modeling with a Fourier series requires the parameter K, whose value can be determined by the Generalized Cross Validation (GCV) method. In modeling inflation for the transportation, communication and financial services sectors, the Fourier series approach yields an optimal K of 120 parameters with an R-square of 99%, whereas modeling by multiple linear regression yields an R-square of 90%.
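The K-selection step can be sketched as follows: build a Fourier design matrix, fit by least squares, and score each candidate K with GCV. This is a minimal illustration on synthetic data, not the authors' implementation; the trend term and frequencies are illustrative choices.

```python
import numpy as np

def fourier_design(t, K):
    # Design matrix: intercept, linear trend, and K sine/cosine pairs.
    cols = [np.ones_like(t), t]
    for k in range(1, K + 1):
        cols.append(np.cos(k * t))
        cols.append(np.sin(k * t))
    return np.column_stack(cols)

def gcv_score(y, X):
    # GCV(K) = (RSS/n) / (1 - tr(H)/n)^2, with hat matrix H = X X^+.
    n = len(y)
    H = X @ np.linalg.pinv(X)
    resid = y - H @ y
    return (resid @ resid / n) / (1.0 - np.trace(H) / n) ** 2

t = np.linspace(0, 2 * np.pi, 200)
rng = np.random.default_rng(0)
y = np.sin(3 * t) + 0.3 * np.cos(5 * t) + 0.1 * rng.standard_normal(200)

scores = {K: gcv_score(y, fourier_design(t, K)) for K in range(1, 11)}
K_opt = min(scores, key=scores.get)   # K minimizing GCV
```

Here the signal contains frequencies 3 and 5, so GCV should not select K below 5: smaller K leaves the cos(5t) term unfit and the residual penalty dominates.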
Computational Physics' Greatest Hits
Bug, Amy
2011-03-01
The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: The physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo Methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.
Suhartono; Lee, Muhammad Hisyam; Prastyo, Dedy Dwi
2015-12-01
The aim of this research is to develop a calendar variation model for forecasting retail sales data with the Eid ul-Fitr effect. The proposed model is based on two methods, namely two-level ARIMAX and regression methods, built by using ARIMAX for the first level and regression for the second level. Monthly men's jeans and women's trousers sales in a retail company for the period January 2002 to September 2009 are used as a case study. In general, the two-level calendar variation model yields two components: the first reconstructs the sales pattern that has already occurred, and the second forecasts the increase in sales due to Eid ul-Fitr affecting sales in the same and the previous months. The results show that the proposed two-level calendar variation model based on ARIMAX and regression methods yields better forecasts than the seasonal ARIMA model and neural networks.
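A minimal sketch of the two-level idea on synthetic monthly data: level one regresses sales on calendar-variation terms, level two fits a simple AR(1) on the residuals as a stand-in for the ARIMAX component. The dummy placement, coefficients, and data are all illustrative assumptions, not the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 96                                     # 8 years of monthly sales
month = np.arange(n) % 12
eid = (month == 10).astype(float)          # hypothetical Eid ul-Fitr month dummy
sales = (100.0 + 30.0 * eid
         + 5.0 * np.sin(2 * np.pi * month / 12)
         + rng.standard_normal(n))

# Level 1: regression on calendar-variation terms reconstructs the pattern.
X1 = np.column_stack([np.ones(n), eid,
                      np.sin(2 * np.pi * month / 12),
                      np.cos(2 * np.pi * month / 12)])
b1, *_ = np.linalg.lstsq(X1, sales, rcond=None)
resid = sales - X1 @ b1

# Level 2: an AR(1) fit on the residuals stands in for the ARIMAX component.
phi, *_ = np.linalg.lstsq(resid[:-1, None], resid[1:], rcond=None)
```

With the calendar effect captured at level one, the level-two coefficient should be small here because the synthetic residuals are white noise.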
The analysis of nonstationary time series using regression, correlation and cointegration
DEFF Research Database (Denmark)
Johansen, Søren
2012-01-01
There are simple well-known conditions for the validity of regression and correlation as statistical tools. We analyse by examples the effect of nonstationarity on inference using these methods and compare them to model based inference using the cointegrated vector autoregressive model. Finally we analyse some monthly data from US on interest rates as an illustration of the methods.
The Analysis of Nonstationary Time Series Using Regression, Correlation and Cointegration
Directory of Open Access Journals (Sweden)
Søren Johansen
2012-06-01
Full Text Available There are simple well-known conditions for the validity of regression and correlation as statistical tools. We analyse by examples the effect of nonstationarity on inference using these methods and compare them to model based inference using the cointegrated vector autoregressive model. Finally we analyse some monthly data from US on interest rates as an illustration of the methods.
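The nonstationarity pitfall this paper analyses by example can be demonstrated in a few lines: regressing one random walk on an independent one often produces a large R-square ("spurious regression"), while the same regression in first differences does not. This is a minimal sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
# Two independent random walks: nonstationary, with no true relationship.
x = np.cumsum(rng.standard_normal(n))
y = np.cumsum(rng.standard_normal(n))

def r_squared(u, v):
    """R^2 from an OLS regression of v on a constant and u."""
    X = np.column_stack([np.ones(len(u)), u])
    beta, *_ = np.linalg.lstsq(X, v, rcond=None)
    resid = v - X @ beta
    return 1.0 - resid @ resid / np.sum((v - v.mean()) ** 2)

r2_levels = r_squared(x, y)                    # frequently spuriously large
r2_diffs = r_squared(np.diff(x), np.diff(y))   # near zero, as it should be
```

Differencing restores stationarity, so the R-square in differences reflects the true (absent) relationship.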
Post-hit dynamics of price limit hits in the Chinese stock markets
Wu, Ting; Wang, Yue; Li, Ming-Xia
2017-01-01
Price limit trading rules are useful to cool off traders' short-term trading mania on individual stocks. The price dynamics approaching the limit boards are known as the magnet effect. However, the price dynamics after opening price limit hits are not well investigated. Here, we provide a detailed analysis of the price dynamics after up-limit or down-limit hits open, based on all A-share stocks traded in the Chinese stock markets. A "W" shape is found in the expected return, which reveals a high probability of a continued price limit hit on the following day. We also find that price dynamics after opening limit hits depend on the market trend. The time span of continuously hitting the price limit is found to be an influencing factor on the expected profit after the limit hit opens. Our analysis provides a better understanding of the price dynamics around the limit boards and offers potential practical value for investors.
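A toy version of the post-hit computation might look like the following: find the days whose return hit the up-limit, then average the returns over the following days. The limit value, hit-detection rule, and data are illustrative; the paper's actual definitions differ in detail.

```python
import numpy as np

def post_hit_returns(returns, limit=0.10, horizon=3):
    """Average daily returns over the `horizon` days following each day
    whose return hit the up-limit (a crude proxy for a limit-hit event)."""
    hits = np.where(np.isclose(returns, limit))[0]
    paths = [returns[i + 1:i + 1 + horizon] for i in hits
             if i + horizon < len(returns)]
    return np.mean(paths, axis=0) if paths else np.array([])

# Toy series with two up-limit hits (at indices 1 and 5).
r = np.array([0.01, 0.10, 0.02, -0.01, 0.03, 0.10, 0.04, 0.00, 0.01])
profile = post_hit_returns(r, horizon=2)
```

Averaging such post-hit paths over many stocks, conditioned on market trend, is the kind of aggregation that reveals the "W" shape reported above.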
Müller, M. S.; Urban, S.; Jutzi, B.
2017-08-01
The number of unmanned aerial vehicles (UAVs) is increasing since low-cost airborne systems are available for a wide range of users. The outdoor navigation of such vehicles is mostly based on global navigation satellite system (GNSS) methods to obtain the vehicle's trajectory. The drawback of satellite-based navigation is failures caused by occlusions and multi-path interference. Besides this, local image-based solutions like Simultaneous Localization and Mapping (SLAM) and Visual Odometry (VO) can, for example, be used to support the GNSS solution by closing trajectory gaps, but are computationally expensive. However, if the trajectory estimation is interrupted or not available, a re-localization is mandatory. In this paper we provide a novel method for GNSS-free and fast image-based pose regression in a known area by utilizing a small convolutional neural network (CNN). With on-board processing in mind, we employ a lightweight CNN called SqueezeNet and use transfer learning to adapt the network to pose regression. Our experiments show promising results for GNSS-free and fast localization.
Almirall, Daniel; Griffin, Beth Ann; McCaffrey, Daniel F.; Ramchand, Rajeev; Yuen, Robert A.; Murphy, Susan A.
2014-01-01
This article considers the problem of examining time-varying causal effect moderation using observational, longitudinal data in which treatment, candidate moderators, and possible confounders are time varying. The structural nested mean model (SNMM) is used to specify the moderated time-varying causal effects of interest in a conditional mean model for a continuous response given time-varying treatments and moderators. We present an easy-to-use estimator of the SNMM that combines an existing regression-with-residuals (RR) approach with an inverse-probability-of-treatment weighting (IPTW) strategy. The RR approach has been shown to identify the moderated time-varying causal effects if the time-varying moderators are also the sole time-varying confounders. The proposed IPTW+RR approach provides estimators of the moderated time-varying causal effects in the SNMM in the presence of an additional, auxiliary set of known and measured time-varying confounders. We use a small simulation experiment to compare IPTW+RR versus the traditional regression approach and to compare small and large sample properties of asymptotic versus bootstrap estimators of the standard errors for the IPTW+RR approach. This article clarifies the distinction between time-varying moderators and time-varying confounders. We illustrate the methodology in a case study to assess if time-varying substance use moderates treatment effects on future substance use. PMID:23873437
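The IPTW ingredient of the proposed estimator can be sketched in a drastically simplified static setting with a known propensity score; the SNMM, the time-varying structure, and the RR step are omitted, and every quantity below is synthetic. Weighting by the inverse probability of the received treatment removes the confounding that biases the naive contrast.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000
conf = rng.standard_normal(n)                  # a measured confounder
p = 1.0 / (1.0 + np.exp(-conf))                # true propensity score
A = rng.binomial(1, p)                         # treatment depends on confounder
y = 1.0 * A + 1.0 * conf + 0.3 * rng.standard_normal(n)   # true effect = 1

# Naive contrast is biased: treated subjects tend to have higher `conf`.
naive = y[A == 1].mean() - y[A == 0].mean()

# Inverse-probability-of-treatment weights (propensity assumed known here).
w = A / p + (1 - A) / (1 - p)
ate = (np.average(y[A == 1], weights=w[A == 1])
       - np.average(y[A == 0], weights=w[A == 0]))
```

In practice the propensity must itself be estimated from the time-varying confounders, which is where the article's auxiliary confounder set enters.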
Reduction of multiple hits in atom probe tomography
International Nuclear Information System (INIS)
Thuvander, Mattias; Kvist, Anders; Johnson, Lars J.S.; Weidow, Jonathan; Andrén, Hans-Olof
2013-01-01
The accuracy of compositional measurements using atom probe tomography is often reduced because some ions are not recorded when several ions hit the detector in close proximity to each other and within a very short time span. In some cases, for example in the analysis of carbides, the multiple hits result in a preferential loss of certain elements, namely those that frequently field evaporate in bursts or as dissociating molecules. In this paper a method of reducing the effect of multiple hits is explored. A fine metal grid was mounted a few millimeters behind the local electrode, effectively functioning as a filter. This resulted in a decrease in the overall detection efficiency, from 37% to about 5%, but also in a decrease in the fraction of multiple hits. In an analysis of tungsten carbide the fraction of ions originating from multiple hits decreased from 46% to 10%. As a result, the measured carbon concentration increased from 48.2 at% to 49.8 at%, very close to the expected 50.0 at%. The characteristics of the multiple hits were compared for analyses with and without the grid filter. Highlights: APT experiments have been performed with a reduced number of multiple hits; the multiple hits were reduced by placing a grid behind the electrode; this resulted in an improved carbon measurement of WC.
Langella, Giuliano; Basile, Angelo; Bonfante, Antonello; Manna, Piero; Terribile, Fabio
2013-04-01
Digital soil mapping procedures are widely used to build two-dimensional continuous maps of several pedological attributes. Our work addressed a regression kriging (RK) technique and a bootstrapped artificial neural network approach in order to evaluate and compare (i) the accuracy of prediction, (ii) the suitability for inclusion in automatic engines (e.g. to constitute web processing services), and (iii) the time cost needed for calibrating models and for making predictions. Regression kriging is perhaps the most widely used geostatistical technique in the digital soil mapping literature. Here we applied EBLUP regression kriging, as it is deemed the most statistically sound RK flavor by pedometricians. An unusual multi-parametric and nonlinear machine learning approach was also employed, called BAGAP (Bootstrap aggregating Artificial neural networks with Genetic Algorithms and Principal component regression). BAGAP combines a selected set of weighted neural nets having specified characteristics to yield an ensemble response. The purpose of applying these two particular models is to ascertain whether, and by how much, a more cumbersome machine learning method is more promising in making accurate and precise predictions. Being aware of the difficulty of handling objects based on EBLUP-RK as well as BAGAP when they are embedded in environmental applications, we explore their suitability for being wrapped within Web Processing Services. Two further aspects are considered for an exhaustive evaluation and comparison: automaticity and computation time with and without high performance computing leverage.
Semiparametric regression analysis of failure time data with dependent interval censoring.
Chen, Chyong-Mei; Shen, Pao-Sheng
2017-09-20
Interval-censored failure-time data arise when subjects are examined or observed periodically such that the failure time of interest is not observed exactly but only known to be bracketed between two adjacent observation times. The commonly used approaches assume that the examination times and the failure time are independent or conditionally independent given covariates. In many practical applications, patients who are already in poor health or have a weak immune system before treatment usually tend to visit physicians more often after treatment than those with better health or immune systems. In this situation, the visiting rate is positively correlated with the risk of failure due to the health status, which results in dependent interval-censored data. While some measurable factors affecting health status such as age, gender, and physical symptoms can be included in the covariates, some health-related latent variables cannot be observed or measured. To deal with dependent interval censoring involving an unobserved latent variable, we characterize the visiting/examination process as a recurrent event process and propose a joint frailty model to account for the association between the failure time and the visiting process. A shared gamma frailty is incorporated into the Cox model and the proportional intensity model for the failure time and visiting process, respectively, in a multiplicative way. We propose a semiparametric maximum likelihood approach for estimating model parameters and show the asymptotic properties, including consistency and weak convergence. Extensive simulation studies are conducted and a data set of bladder cancer is analyzed for illustrative purposes. Copyright © 2017 John Wiley & Sons, Ltd.
76 FR 46298 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-08-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held virtually on August 17, 2011...
76 FR 55914 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-09-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... standards, implementation specifications, and certification criteria are needed. Date and Time: The HIT...
77 FR 16035 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-03-19
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on March 27, 2012, from 9 a.m...
2010-10-26
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards... Strategic Plan, and in accordance with policies developed by the HIT Policy Committee. Date and Time: The...
76 FR 79684 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-12-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on January 25, 2012, from 9 a...
76 FR 50734 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-08-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on September 28, 2011, from 9...
77 FR 2727 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-01-19
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on February 29, 2012, from 9...
77 FR 15760 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-03-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on April 18, 2012, from 9 a.m...
76 FR 46297 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-08-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... standards, implementation specifications, and certification criteria are needed. Date and Time: The HIT...
76 FR 14976 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-03-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on April 20, 2011, from 9 a.m...
76 FR 70455 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-11-14
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on December 14, 2011, from 9...
2011-01-25
... Technology; HIT Policy Committee's Workgroup Meetings; Notice of Meetings AGENCY: Office of the National... only. Name of Committees: HIT Policy Committee's Workgroups: Meaningful Use, Privacy & Security Tiger..., implementation specifications, and certification criteria are needed. Date and Time: The HIT Policy Committee...
76 FR 39109 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-07-05
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on July 20, 2011, from 9 a.m...
77 FR 73661 - HIT Standards Committee Advisory Meetings; Notice of Meetings
2012-12-11
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meetings; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: These meetings will be held on the following dates and...
76 FR 70454 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-11-14
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... certification criteria are needed. Date and Time: The HIT Policy Committee Workgroups will hold the following...
2010-09-17
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards... Strategic Plan, and in accordance with policies developed by the HIT Policy Committee. Date and Time: The...
76 FR 28782 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-05-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on June 22, 2011, from 9 a.m...
77 FR 27459 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-05-10
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on June 20, 2012, from 9 a.m...
76 FR 22399 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-04-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... standards, implementation specifications, and certification criteria are needed. Date and Time: The HIT...
76 FR 28784 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-05-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... standards, implementation specifications, and certification criteria are needed. Date and Time: The HIT...
75 FR 29762 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2010-05-27
... Technology HIT Policy Committee's Workgroup Meetings; Notice of Meetings AGENCY: Office of the National... only. Name of Committees: HIT Policy Committee's Workgroups: Meaningful Use, Privacy & Security Policy... specifications, and certification criteria are needed. Date and Time: The HIT Policy Committee Workgroups will...
77 FR 37408 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-06-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... be open to the public. Name of Committee: HIT Standards Committee. General Function of the Committee... with policies developed by the HIT Policy Committee. Date and Time: The meeting will be held on July 19...
77 FR 22787 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-04-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on May 24, 2012, from 9 a.m...
77 FR 65691 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-10-30
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on November 13, 2012, from 9...
77 FR 50690 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-08-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on September 19, 2012, from 9...
75 FR 21628 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2010-04-26
... Technology HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards... Strategic Plan, and in accordance with policies developed by the HIT Policy Committee. Date and Time: The...
76 FR 14974 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-03-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... standards, implementation specifications, and certification criteria are needed. Date and Time: The HIT...
76 FR 22396 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-04-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on May 18, 2011, from 9 a.m...
76 FR 55913 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-09-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: to provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held virtually on October 21, 2011...
76 FR 50735 - HIT Policy Committee's Workgroup Meetings; Notice of Meetings
2011-08-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Policy Committee's Workgroups... standards, implementation specifications, and certification criteria are needed. Date and Time: The HIT...
77 FR 60438 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-10-03
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on October 17, 2012, from 9 a...
2010-01-25
... Technology; HIT Policy Committee's Workgroup Meetings; Notice of Meetings AGENCY: Office of the National... only. Name of Committees: HIT Policy Committee's Workgroups: Meaningful Use, Privacy & Security Policy... specifications, and certification criteria are needed. Date and Time: The HIT Policy Committee Workgroups will...
77 FR 65690 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-10-30
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on December 19, 2012, from 9...
76 FR 9783 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2011-02-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on March 29, 2011, from 9 a.m...
77 FR 45353 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2012-07-31
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting; Notice of... public. Name of Committee: HIT Standards Committee. General Function of the Committee: To provide... developed by the HIT Policy Committee. Date and Time: The meeting will be held on August 15, 2012, from 9:00...
2010-07-20
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards... Strategic Plan, and in accordance with policies developed by the HIT Policy Committee. Date and Time: The...
Leroux, Romain; Chatellier, Ludovic; David, Laurent
2018-01-01
This article is devoted to the estimation of time-resolved particle image velocimetry (TR-PIV) flow fields using time-resolved point measurements of a voltage signal obtained by hot-film anemometry. A multiple linear regression model is first defined to map the TR-PIV flow fields onto the voltage signal. Due to the high temporal resolution of the signal acquired by the hot-film sensor, the estimates of the TR-PIV flow fields are obtained with a multiple linear regression method called orthonormalized partial least squares regression (OPLSR). Subsequently, this model is incorporated as the observation equation in an ensemble Kalman filter (EnKF) applied to a proper orthogonal decomposition reduced-order model to stabilize it while reducing the effects of the hot-film sensor noise. This method is assessed for the reconstruction of the flow around a NACA0012 airfoil at a Reynolds number of 1000 and an angle of attack of 20°. Comparisons with multi-time-delay modified linear stochastic estimation show that both the OPLSR and the EnKF combined with OPLSR are more accurate, as they produce a much lower relative estimation error and provide a faithful reconstruction of the time evolution of the velocity flow fields.
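A drastically simplified stand-in for the estimation step: ordinary least squares mapping a delay-embedded point-sensor signal onto POD temporal coefficients. OPLSR itself, the EnKF, and all data below are replaced by illustrative assumptions; this only shows the shape of the regression problem.

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, n_pod, n_delays = 500, 4, 8

# Hypothetical hot-film voltage signal (the single point sensor).
voltage = rng.standard_normal(n_t)
# Delay-embed the sensor signal to form the predictor matrix.
X = np.column_stack([voltage[n_delays - d:n_t - d] for d in range(n_delays)])
# Synthetic POD temporal coefficients, linearly driven by the sensor.
A_true = rng.standard_normal((n_delays, n_pod))
Y = X @ A_true + 0.01 * rng.standard_normal((n_t - n_delays, n_pod))

# Ordinary least squares as a simplified stand-in for OPLSR.
A_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
rel_err = np.linalg.norm(Y - X @ A_hat) / np.linalg.norm(Y)
```

Reconstructed flow fields would then be the POD modes weighted by the predicted coefficients; OPLSR additionally orthonormalizes the predictors to handle their strong temporal correlation.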
Using the mean approach in pooling cross-section and time series data for regression modelling
International Nuclear Information System (INIS)
Nuamah, N.N.N.N.
1989-12-01
The mean approach is one of the methods for pooling cross-section and time series data for mathematical-statistical modelling. Though a simple approach, its results are sometimes paradoxical in nature. However, researchers continue using it for its simplicity. This paper investigates the nature and source of such unwanted phenomena. (author). 7 refs
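The mean approach itself is simple to write down: average each cross-section unit over time, then regress the unit means. A sketch on synthetic panel data (all numbers illustrative) contrasts it with full pooling of the observations.

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 6, 12                                # cross-section units, time periods
x = rng.standard_normal((N, T))
y = 2.0 * x + rng.standard_normal((N, T))   # true slope = 2

# Mean approach: average each unit over time, then regress the N means.
x_bar, y_bar = x.mean(axis=1), y.mean(axis=1)
Xm = np.column_stack([np.ones(N), x_bar])
beta_means, *_ = np.linalg.lstsq(Xm, y_bar, rcond=None)

# Full pooling of all N*T observations, for comparison.
Xp = np.column_stack([np.ones(N * T), x.ravel()])
beta_pooled, *_ = np.linalg.lstsq(Xp, y.ravel(), rcond=None)
```

The means regression discards the within-unit variation and fits only N points, so its slope estimate is far noisier than the pooled one; this information loss is one source of the paradoxical results the paper investigates.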
Barry T. Wilson; Joseph F. Knight; Ronald E. McRoberts
2018-01-01
Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several...
Sun, Jianguo; Feng, Yanqin; Zhao, Hui
2015-01-01
Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both the regression parameters and the baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.
Regression analysis of longitudinal data with correlated censoring and observation times.
Li, Yang; He, Xin; Wang, Haiying; Sun, Jianguo
2016-07-01
Longitudinal data occur in many fields, such as medical follow-up studies that involve repeated measurements. For their analysis, most existing approaches assume that the observation or follow-up times are independent of the response process, either completely or given some covariates. In practice, it is apparent that this may not be true. In this paper, we present a joint analysis approach that allows for possible mutual correlations that can be characterized by time-dependent random effects. Estimating equations are developed for the parameter estimation and the resulting estimators are shown to be consistent and asymptotically normal. The finite sample performance of the proposed estimators is assessed through a simulation study and an illustrative example from a skin cancer study is provided.
Jaber, Abobaker M; Ismail, Mohd Tahir; Altaher, Alsaidi M
2014-01-01
This paper mainly forecasts the daily closing price of stock markets. We propose a two-stage technique that combines the empirical mode decomposition (EMD) with nonparametric methods of local linear quantile (LLQ) regression. We use the proposed technique, EMD-LLQ, to forecast two stock index time series. Detailed experiments are implemented for the proposed method, in which the EMD-LLQ, EMD, and Holt-Winter methods are compared. The proposed EMD-LLQ model is determined to be superior to the EMD and Holt-Winter methods in predicting the stock closing prices.
Mixture regression models for the gap time distributions and illness-death processes.
Huang, Chia-Hui
2018-01-27
The aim of this study is to provide an analysis of gap event times under the illness-death model, where some subjects experience "illness" before "death" and others experience only "death." Which event is more likely to occur first and how the duration of the "illness" influences the "death" event are of interest. Because the occurrence of the second event is subject to dependent censoring, it can lead to bias in the estimation of model parameters. In this work, we generalize the semiparametric mixture models for competing risks data to accommodate the subsequent event and use a copula function to model the dependent structure between the successive events. Under the proposed method, the survival function of the censoring time does not need to be estimated when developing the inference procedure. We incorporate the cause-specific hazard functions with the counting process approach and derive a consistent estimation using the nonparametric maximum likelihood method. Simulations are conducted to demonstrate the performance of the proposed analysis, and its application in a clinical study on chronic myeloid leukemia is reported to illustrate its utility.
International Nuclear Information System (INIS)
Guo, Yin; Nazarian, Ehsan; Ko, Jeonghan; Rajurkar, Kamlakar
2014-01-01
Highlights:
• Developed hourly-indexed ARX models for robust cooling-load forecasting.
• Proposed a two-stage weighted least-squares regression approach.
• Considered the effect of outliers as well as trends in cooling load and weather patterns.
• Included higher-order terms and day-type patterns in the forecasting models.
• Demonstrated better accuracy compared with some ARX and ANN models.
Abstract: This paper presents a robust hourly cooling-load forecasting method based on time-indexed autoregressive with exogenous inputs (ARX) models, in which the coefficients are estimated through a two-stage weighted least-squares regression. The prediction method combines two separate time-indexed ARX models to improve prediction accuracy of the cooling load over different forecasting periods. The two-stage weighted least-squares regression approach in this study is robust to outliers and suitable for fast and adaptive coefficient estimation. The proposed method is tested on a large-scale central cooling system in an academic institution. The numerical case studies show that the proposed prediction method performs better than some ANN and ARX forecasting models for the given test data set.
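The two-stage weighted least-squares idea can be sketched on synthetic data (a minimal illustration: an OLS pass, a robust residual scale, then Huber-style downweighting of outliers before refitting; the data, constants, and regression form are assumed for the example and this is not the authors' exact ARX estimator):

```python
import numpy as np

def two_stage_wls(y, X, c=1.345):
    """Stage 1: OLS; Stage 2: WLS with Huber-style weights built from a
    robust (MAD-based) residual scale. Illustrative sketch only."""
    # Stage 1: ordinary least squares
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_ols
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust sigma
    # Stage 2: downweight observations with large standardized residuals
    u = np.abs(resid) / max(scale, 1e-12)
    w = np.where(u <= c, 1.0, c / u)
    sw = np.sqrt(w)
    beta_wls, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    return beta_wls

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(scale=0.1, size=n)
y[:5] += 10.0                       # inject a few gross outliers
X = np.column_stack([np.ones(n), x])
beta = two_stage_wls(y, X)
```

The second stage keeps the speed of least squares while limiting the pull that a handful of bad readings (e.g. sensor spikes in a cooling-load series) exert on the coefficients.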
Wang, Peijie; Zhao, Hui; Sun, Jianguo
2016-12-01
Interval-censored failure time data occur in many fields such as demography, economics, medical research, and reliability, and many inference procedures for them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For the problem, an estimated sieve maximum-likelihood approach is proposed for data arising from the proportional hazards frailty model, and for estimation, a two-step procedure is presented. In addition, the asymptotic properties of the proposed estimators of the regression parameters are established, and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study. © 2016, The International Biometric Society.
A fast DSP-based calorimeter hit scanning system
International Nuclear Information System (INIS)
Sekikawa, S.; Arai, I.; Suzuki, A.; Watanabe, A.; Marlow, D.R.; Mindas, C.R.; Wixted, R.L.
1997-01-01
A custom made digital signal processor (DSP) based system has been developed to scan calorimeter hits read by a 32-channel FASTBUS waveform recorder board. The scanner system identifies hit calorimeter elements by surveying their discriminated outputs. This information is used to generate a list of addresses, which guides the read-out process. The system is described and measurements of the scan times are given. (orig.)
International Nuclear Information System (INIS)
Liu, Jie
2015-01-01
This Ph.D. work is motivated by the possibility of monitoring the conditions of components of energy systems for their extended and safe use, under proper practice of operation and adequate policies of maintenance. The aim is to develop a Support Vector Regression (SVR)-based framework for predicting time series data under stationary/nonstationary environmental and operational conditions. Single SVR and SVR-based ensemble approaches are developed to tackle the prediction problem based on both small and large datasets. Strategies are proposed for adaptively updating the single SVR and SVR-based ensemble models in the presence of pattern drifts. Comparisons with other online learning approaches for kernel-based modelling are provided with reference to time series data from a critical component in Nuclear Power Plants (NPPs) provided by Electricite de France (EDF). The results show that the proposed approaches achieve comparable prediction results, considering the Mean Squared Error (MSE) and Mean Relative Error (MRE), in much less computation time. Furthermore, by analyzing the geometrical meaning of the Feature Vector Selection (FVS) method proposed in the literature, a novel geometrically interpretable kernel method, named Reduced Rank Kernel Ridge Regression-II (RRKRR-II), is proposed to describe the linear relations between a predicted value and the predicted values of the Feature Vectors (FVs) selected by FVS. Comparisons with several kernel methods on a number of public datasets demonstrate the good prediction accuracy of RRKRR-II and the ease of tuning its hyper-parameters. (author)
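A single-SVR time-series predictor of the kind described can be sketched with scikit-learn, using a lag-embedded window of past values as features (an illustrative sketch on a synthetic series; the window length and SVR hyper-parameters are assumed, and none of the EDF data or the ensemble/adaptive-updating strategies are reproduced):

```python
import numpy as np
from sklearn.svm import SVR

def make_lagged(series, n_lags):
    """Embed a univariate series into (lag-window, next-value) pairs."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Synthetic stand-in for a monitored-component signal: noisy sinusoid
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t) + 0.05 * np.random.default_rng(1).normal(size=t.size)

X, y = make_lagged(series, n_lags=10)
split = 300                                   # train on the past, test on the future
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:split], y[:split])
pred = model.predict(X[split:])
mse = float(np.mean((pred - y[split:]) ** 2))
```

In an online setting, the same model would be refit (or its support-vector set updated) as new windows arrive, which is where the adaptive-updating strategies of the thesis come in.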
2008-01-01
The first time that single particle effects from cosmic rays on electronics were observed was in 1991, when one of the instruments aboard an ESA satellite broke down after only five days in space. On 5 July, the TS-LEA group will have completed the installation of monitors that will help to reduce similar dangerous effects on LHC electronics.
Spady, Richard; Stouli, Sami
2012-01-01
We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution f...
Jagutzki, O; Mergel, V; Schmidt-Böcking, H; Spielberger, L; Spillmann, U; Ullmann-Pfleger, K
2002-01-01
New applications for single particle and photon detection in many fields require both large area imaging performance and precise time information on each detected particle. Moreover, a very high data acquisition rate is desirable for most applications and eventually the detection and imaging of more than one particle arriving within a microsecond is required. Commercial CCD systems lack the timing information whereas other electronic microchannel plate (MCP) read-out schemes usually suffer from a low acquisition rate and complicated and sometimes costly read-out electronics. We have designed and tested a complete imaging system consisting of an MCP position readout with helical wire delay-lines, single-unit amplifier box and PC-controlled time-to-digital converter (TDC) readout. The system is very flexible and can detect and analyse position and timing information at single particle rates beyond 1 MHz. Alternatively, multi-hit events can be collected and analysed at about 20 kHz rate. We discuss the advantage...
Tang, Yongqiang
2015-01-01
A sample size formula is derived for negative binomial regression for the analysis of recurrent events, in which subjects can have unequal follow-up time. We obtain sharp lower and upper bounds on the required size, which are easy to compute. The upper bound is generally only slightly larger than the required size, and hence can be used to approximate the sample size. The lower and upper size bounds can be decomposed into two terms. The first term relies on the mean number of events in each group, and the second term depends on two factors that measure, respectively, the extent of between-subject variability in event rates and follow-up time. Simulation studies are conducted to assess the performance of the proposed method. An application of our formulae to a multiple sclerosis trial is provided.
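The roles of the two terms in the size bounds (mean events per group, and between-subject variability plus follow-up) can be illustrated with a Monte-Carlo power check for a candidate sample size. This is a crude sketch, not the paper's closed-form formula: the rates, dispersion, follow-up distribution, and the simple Wald test on the log rate ratio are all assumptions of the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulated_power(n, rate_c=0.8, rate_t=0.6, disp=0.5, n_sim=300):
    """Empirical power for comparing recurrent-event rates between two groups
    of n subjects, with unequal follow-up and gamma frailty (so the counts
    are negative binomial). Illustrative sketch only."""
    rejections = 0
    for _ in range(n_sim):
        T = rng.uniform(0.5, 2.0, size=2 * n)           # unequal follow-up times
        frail = rng.gamma(1 / disp, disp, size=2 * n)   # between-subject variability
        rate = np.r_[np.full(n, rate_c), np.full(n, rate_t)]
        y = rng.poisson(rate * frail * T)               # NB counts via gamma mixing
        e0, e1 = y[:n].sum(), y[n:].sum()
        if e0 == 0 or e1 == 0:
            continue
        r0, r1 = e0 / T[:n].sum(), e1 / T[n:].sum()
        # variance approx: events term + dispersion term, echoing the two-term bound
        se = np.sqrt(1 / e0 + 1 / e1 + 2 * disp / n)
        if abs(np.log(r1 / r0)) / se > 1.96:
            rejections += 1
    return rejections / n_sim

power_small = simulated_power(50)
power_large = simulated_power(400)
```

Increasing n shrinks both variance terms, so power rises with the sample size, mirroring how the formula's two components drive the required size.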
Verbs in the lexicon: Why is hitting easier than breaking?
McKoon, Gail; Love, Jessica
2011-11-01
Adult speakers use verbs in syntactically appropriate ways. For example, they know implicitly that the boy hit at the fence is acceptable but the boy broke at the fence is not. We suggest that this knowledge is lexically encoded in semantic decompositions. The decomposition for break verbs (e.g. crack, smash) is hypothesized to be more complex than that for hit verbs (e.g. kick, kiss). Specifically, the decomposition of a break verb denotes that "an entity changes state as the result of some external force" whereas the decomposition for a hit verb denotes only that "an entity potentially comes in contact with another entity." In this article, verbs of the two types were compared in a lexical decision experiment - Experiment 1 - and they were compared in sentence comprehension experiments with transitive sentences (e.g. the car hit the bicycle and the car broke the bicycle) - Experiments 2 and 3. In Experiment 1, processing times were shorter for the hit than the break verbs and in Experiments 2 and 3, processing times were shorter for the hit sentences than the break sentences, results that are in accord with the complexities of the postulated semantic decompositions.
Energy Technology Data Exchange (ETDEWEB)
KEENEN,MARTHA JANE; NUSBAUM,ANNA W.
2000-05-18
Very few of us get to start clean: getting a new organization, new space, and hiring new people for a new information management program. In over 20 years in some aspect of this profession, the author has never faced that particular challenge. By far the majority of information management opportunities involve taking over from someone else. Sometimes, a predecessor has gone on to better things on his/her initiative; that is not always the case. Sometimes the group is one you were a part of yesterday. If the function functions, time moves on and changes may be needed to accommodate new technology, additional and/or changed tasks, and alterations in corporate missions. If the function does not, it is a good bet that you were hired or promoted as an agent of change. Each of these situations poses challenges. This presentation is about that first few months and first year in a new assignment. In other words, you have the job, now what?
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Approval of the State Medicaid HIT plan, the HIT PAPD and update, the HIT IAPD and update, and the annual HIT IAPD. 495.344 Section 495.344 Public... Requirements Specific to the Medicaid Program § 495.344 Approval of the State Medicaid HIT plan, the HIT PAPD...
Novel HIT antibody detection method using Sonoclot® coagulation analyzer.
Wanaka, Keiko; Asada, Reiko; Miyashita, Kumiko; Kaneko, Makoto; Endo, Hirokazu; Yatomi, Yutaka
2015-01-01
Since heparin-induced thrombocytopenia (HIT), caused by the generation of antibodies against platelet factor 4 (PF4)/heparin complexes (HIT antibodies), may induce serious complications due to thrombosis, a prompt diagnosis is desirable. Functional tests with platelet activation to detect HIT antibodies are useful for the diagnosis of HIT, in particular the (14)C-serotonin release assay (SRA). However, they are complicated and so can be performed only in limited laboratories. We tested whether a blood coagulation test using the Sonoclot® analyzer can serve to detect HIT antibodies. A murine monoclonal antibody (HIT-MoAb) against PF4/heparin complexes was used as an alternative to human HIT antibodies. To the mixture of HIT-MoAb and heparin (0.5 U/mL, final), whole blood obtained from a healthy volunteer was added, and then the activated clotting time (ACT), clot rate (CR), and area under the curve (AUC) were measured with the Sonoclot® analyzer for 30 minutes. The HIT-MoAb (30 to 100 μg/mL, final) concentration-dependently suppressed the anticoagulant activity (prolongation of ACT and decrease of CR and AUC) of heparin. The suppression of the anticoagulant effect of heparin by HIT-MoAb was demonstrated by measurements using the Sonoclot® analyzer. This method may provide a new tool for the screening of HIT antibodies. Copyright © 2014 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Felix, A.C.
1988-01-01
In this study, mathematical models of the central circulation, containing as undetermined parameters both chamber volumes and crosstalk coefficients relating region-of-interest count rates to activity not only in the corresponding chamber but also in overlapping and contiguous anatomical chambers, were used to identify contaminating crosstalk contributions to the various time-activity curves of interest. The identification of these crosstalks was essential for the creation of decontaminated region-of-interest time-activity curves which could be used for further model analysis. The decontaminated curves represent what the region-of-interest time-activity curves would look like in the absence of crosstalks. An optimal sampling routine was added to the nonlinear regression least-squares fit program so that the region-of-interest time-activity curves could be analyzed to determine which data points contributed most toward decreasing the standard error of each parameter. A biplane model was investigated for use in analyzing radionuclide angiocardiographic first-pass data.
Draper, Jan; Beretta, Ruth; Kenward, Linda; McDonagh, Lin; Messenger, Julie; Rounce, Jill
2014-10-01
This study explored the impact of The Open University's (OU) preregistration nursing programme on students' employability, career progression and its contribution to developing the nursing workforce across the United Kingdom. Designed for healthcare support workers who are sponsored by their employers, the programme is the only part-time supported open/distance learning programme in the UK leading to registration as a nurse. The international literature reveals that relatively little is known about the impact of previous experience as a healthcare support worker on the experience of transition, employability skills and career progression. To identify alumni and employer views of the perceived impact of the programme on employability, career progression and workforce development. A qualitative design using telephone interviews which were digitally recorded, and transcribed verbatim prior to content analysis to identify recurrent themes. Three geographical areas across the UK. Alumni (n=17) and employers (n=7). Inclusion criterion for alumni was a minimum of two years' post-qualifying experience. Inclusion criteria for employers were those that had responsibility for sponsoring students on the programme and employing them as newly qualified nurses. Four overarching themes were identified: transition, expectations, learning for and in practice, and flexibility. Alumni and employers were of the view that the programme equipped them well to meet the competencies and expectations of being a newly qualified nurse. It provided employers with a flexible route to growing their own workforce and alumni the opportunity to achieve their ambition of becoming a qualified nurse when other more conventional routes would not have been open to them. Some of them had already demonstrated career progression. Generalising results requires caution due to the small, self-selecting sample but findings suggest that a widening participation model of pre-registration nurse education for
77 FR 32639 - HIT Standards Committee and HIT Policy Committee; Call for Nominations
2012-06-01
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee and HIT Policy Committee; Call for... Health Information Technology Policy Committee (HITPC). Name of Committees: HIT Standards Committee and HIT Policy Committee. General Function of the Committees: The HITSC is charged to provide...
Directory of Open Access Journals (Sweden)
Yubo Wang
2017-06-01
Full Text Available It is often difficult to analyze biological signals because of their nonlinear and non-stationary characteristics. This necessitates the usage of time-frequency decomposition methods for analyzing the subtle changes in these signals that are often connected to an underlying phenomenon. This paper presents a new approach to analyze the time-varying characteristics of such signals by employing a simple truncated Fourier series model, namely the band-limited multiple Fourier linear combiner (BMFLC). In contrast to the earlier designs, we first identified the sparsity imposed on the signal model in order to reformulate the model to a sparse linear regression model. The coefficients of the proposed model are then estimated by a convex optimization algorithm. The performance of the proposed method was analyzed with benchmark test signals. An energy ratio metric is employed to quantify the spectral performance, and results show that the proposed method, Sparse-BMFLC, has a high mean energy ratio (0.9976) and outperforms existing methods such as the short-time Fourier transform (STFT), continuous wavelet transform (CWT) and BMFLC Kalman Smoother. Furthermore, the proposed method provides an overall 6.22% improvement in reconstruction error.
Wang, Yubo; Veluvolu, Kalyana C
2017-06-14
It is often difficult to analyze biological signals because of their nonlinear and non-stationary characteristics. This necessitates the usage of time-frequency decomposition methods for analyzing the subtle changes in these signals that are often connected to an underlying phenomenon. This paper presents a new approach to analyze the time-varying characteristics of such signals by employing a simple truncated Fourier series model, namely the band-limited multiple Fourier linear combiner (BMFLC). In contrast to the earlier designs, we first identified the sparsity imposed on the signal model in order to reformulate the model to a sparse linear regression model. The coefficients of the proposed model are then estimated by a convex optimization algorithm. The performance of the proposed method was analyzed with benchmark test signals. An energy ratio metric is employed to quantify the spectral performance, and results show that the proposed method, Sparse-BMFLC, has a high mean energy ratio (0.9976) and outperforms existing methods such as the short-time Fourier transform (STFT), continuous wavelet transform (CWT) and BMFLC Kalman Smoother. Furthermore, the proposed method provides an overall 6.22% improvement in reconstruction error.
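The sparsity-reformulation step can be sketched as an L1-penalized regression on a truncated Fourier (band-limited) design, in the spirit of Sparse-BMFLC. This is an illustrative sketch: the frequency grid, penalty, and test signal are assumed, and scikit-learn's Lasso stands in for the paper's convex solver.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Band-limited Fourier design: sine and cosine columns on a dense frequency grid
fs = 100.0
t = np.arange(0.0, 4.0, 1.0 / fs)
freqs = np.arange(1.0, 20.0, 0.25)              # candidate band (assumed)
rng = np.random.default_rng(7)
signal = (1.2 * np.sin(2 * np.pi * 5.0 * t)
          + 0.8 * np.cos(2 * np.pi * 11.0 * t)
          + 0.05 * rng.normal(size=t.size))

X = np.hstack([np.sin(2 * np.pi * np.outer(t, freqs)),
               np.cos(2 * np.pi * np.outer(t, freqs))])

# The L1 penalty enforces the sparsity assumed on the Fourier coefficients
model = Lasso(alpha=0.01, max_iter=10000).fit(X, signal)

# Per-frequency magnitude: max over the sine/cosine coefficient pair
coef = np.abs(model.coef_).reshape(2, freqs.size).max(axis=0)
active = freqs[coef > 0.1]                       # recovered active band
```

With a well-separated signal the fit leaves only the truly active frequencies (here 5 Hz and 11 Hz) with non-negligible coefficients, which is the sparse spectral estimate the abstract describes.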
Noise Reduction and Gap Filling of fAPAR Time Series Using an Adapted Local Regression Filter
Directory of Open Access Journals (Sweden)
Álvaro Moreno
2014-08-01
Full Text Available Time series of remotely sensed data are an important source of information for understanding land cover dynamics. In particular, the fraction of absorbed photosynthetic active radiation (fAPAR) is a key variable in the assessment of vegetation primary production over time. However, the fAPAR series derived from polar orbit satellites are not continuous and consistent in space and time. Filtering methods are thus required to fill in gaps and produce high-quality time series. This study proposes an adapted (iteratively reweighted) local regression filter (LOESS) and performs a benchmarking intercomparison with four popular and generally applicable smoothing methods: Double Logistic (DLOG), smoothing spline (SSP), Interpolation for Data Reconstruction (IDR) and adaptive Savitzky-Golay (ASG). This paper evaluates the main advantages and drawbacks of the considered techniques. The results have shown that ASG and the adapted LOESS perform better in recovering fAPAR time series over multiple controlled noisy scenarios. Both methods can robustly reconstruct the fAPAR trajectories, reducing the noise by up to 80% in the worst simulation scenario, which might be attributed to the quality control (QC) MODIS information incorporated into these filtering algorithms, their flexibility and adaptation to the upper envelope. The adapted LOESS is particularly resistant to outliers. This method clearly outperforms the other considered methods in dealing with the high presence of gaps and noise in satellite data records. The low RMSE and biases obtained with the LOESS method (|rMBE| < 8%; rRMSE < 20%) reveal an optimal reconstruction even in the most extreme situations with long seasonal gaps. An example of application of the LOESS method to fill in invalid values in real MODIS images presenting persistent cloud and snow coverage is also shown. The LOESS approach is recommended in most remote sensing applications, such as gap-filling, cloud-replacement, and observing temporal
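The adapted-LOESS idea (local linear fits with tricube weights, iteratively reweighted to resist outliers, evaluated at every time step so that gaps are filled) can be sketched as follows. The span, iteration count, and bisquare reweighting are assumed simplifications; the paper's QC-weighting and upper-envelope adaptation are not reproduced.

```python
import numpy as np

def robust_loess_fill(t, y, span=0.3, n_iter=3):
    """Iteratively reweighted local linear regression that smooths a noisy
    series and fills NaN gaps. Simplified LOESS sketch."""
    obs = ~np.isnan(y)
    t_obs, y_obs = t[obs], y[obs]
    n_obs = t_obs.size
    k = max(int(span * n_obs), 2)            # points in each local window
    delta = np.ones(n_obs)                   # robustness weights
    fitted = np.empty_like(t, dtype=float)
    for _ in range(n_iter):
        for j, t0 in enumerate(t):           # evaluate everywhere, incl. gaps
            d = np.abs(t_obs - t0)
            h = np.sort(d)[k - 1]
            w = np.clip(1.0 - (d / h) ** 3, 0.0, None) ** 3 * delta  # tricube
            X = np.column_stack([np.ones(n_obs), t_obs - t0])
            WX = w[:, None] * X
            beta = np.linalg.solve(X.T @ WX + 1e-9 * np.eye(2), WX.T @ y_obs)
            fitted[j] = beta[0]              # local linear fit at t0
        resid = y_obs - fitted[obs]
        s = np.median(np.abs(resid)) + 1e-12
        delta = np.clip(1.0 - (resid / (6.0 * s)) ** 2, 0.0, None) ** 2  # bisquare
    return fitted

rng = np.random.default_rng(11)
t = np.linspace(0.0, 1.0, 120)
truth = np.sin(2 * np.pi * t)
y = truth + 0.05 * rng.normal(size=t.size)
y[10] += 1.0            # an outlier the robust weights should suppress
y[45:60] = np.nan       # a gap (e.g. cloud cover) to fill
filled = robust_loess_fill(t, y)
```

Because the local fit is evaluated at every time step from the observed points only, the NaN stretch is interpolated by the same smoother that denoises the rest of the series.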
Nistal-Nuño, Beatriz
2017-09-01
In Chile, a new law introduced in March 2012 decreased the legal blood alcohol concentration (BAC) limit for driving while impaired from 1 to 0.8 g/l and the legal BAC limit for driving under the influence of alcohol from 0.5 to 0.3 g/l. The goal is to assess the impact of this new law on mortality and morbidity outcomes in Chile. A review of national databases in Chile was conducted from January 2003 to December 2014. Segmented regression analysis of interrupted time series was used for analyzing the data. In a series of multivariable linear regression models, the change in intercept and slope in the monthly incidence rate of traffic deaths and injuries related to alcohol per 100,000 inhabitants was estimated from pre-intervention to post-intervention, while controlling for secular changes. In nested regression models, potential confounding seasonal effects were accounted for. All analyses were performed at a two-sided significance level of 0.05. Immediate level drops in all the monthly rates were observed after the law from the end of the pre-law period in the majority of models and in all the de-seasonalized models, although statistical significance was reached only in the model for injuries related to alcohol. After the law, the estimated monthly rate dropped abruptly by -0.869 for injuries related to alcohol and by -0.859 when adjusting for seasonality (P < 0.001). Regarding the post-law long-term trends, a steeper decreasing trend after the law was evident in the models for deaths related to alcohol, although these differences were not statistically significant. Strong evidence of a reduction in traffic injuries related to alcohol was found following the law in Chile. Although insufficient evidence was found of a statistically significant effect for the beneficial effects seen on deaths and overall injuries, potential clinically important effects cannot be ruled out. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd.
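The segmented-regression design used in interrupted time-series studies like this one can be sketched with a synthetic monthly series. The level-change and slope-change coefficients below are the quantities of interest; all numbers are made up for illustration, not the Chilean data.

```python
import numpy as np

# Synthetic monthly rate: a secular trend, then an immediate level drop and
# a steeper slope after an intervention at month 36
rng = np.random.default_rng(3)
months = np.arange(72)                       # 6 years of monthly observations
law = (months >= 36).astype(float)           # indicator: law in force
time_after = np.where(months >= 36, months - 36, 0.0)
true = 5.0 + 0.02 * months - 0.9 * law - 0.03 * time_after
y = true + rng.normal(scale=0.1, size=months.size)

# Interrupted time-series design matrix:
# intercept, pre-existing trend, level change at the law, change in slope after
X = np.column_stack([np.ones_like(months, dtype=float), months, law, time_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, slope_change = beta[2], beta[3]
```

The fitted `level_change` captures the abrupt drop at the intervention and `slope_change` the altered long-term trend, which are exactly the two effects the abstract reports (a real analysis would add seasonal terms and autocorrelation-robust errors).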
International Nuclear Information System (INIS)
High, F. William; Stubbs, Christopher W.; Rest, Armin; Stalder, Brian; Challis, Peter
2009-01-01
We present stellar locus regression (SLR), a method of directly adjusting the instrumental broadband optical colors of stars to bring them into accord with a universal stellar color-color locus, producing accurately calibrated colors for both stars and galaxies. This is achieved without first establishing individual zero points for each passband, and can be performed in real-time at the telescope. We demonstrate how SLR naturally makes one wholesale correction for differences in instrumental response, for atmospheric transparency, for atmospheric extinction, and for Galactic extinction. We perform an example SLR treatment of Sloan Digital Sky Survey data over a wide range of Galactic dust values and independently recover the direction and magnitude of the canonical Galactic reddening vector with 14-18 mmag rms uncertainties. We then isolate the effect of atmospheric extinction, showing that SLR accounts for this and returns precise colors over a wide range of air mass, with 5-14 mmag rms residuals. We demonstrate that SLR-corrected colors are sufficiently accurate to allow photometric redshift estimates for galaxy clusters (using red sequence galaxies) with an uncertainty σ(z)/(1 + z) = 0.6% per cluster for redshifts 0.09 < z < 0.25. Finally, we identify our objects in the 2MASS all-sky catalog, and produce i-band zero points typically accurate to 18 mmag using only SLR. We offer open-source access to our IDL routines, validated and verified for the implementation of this technique, at http://stellar-locus-regression.googlecode.com.
Sebastian, Tunny; Jeyaseelan, Visalakshi; Jeyaseelan, Lakshmanan; Anandan, Shalini; George, Sebastian; Bangdiwala, Shrikant I
2018-01-01
Hidden Markov models are stochastic models in which the observations are assumed to follow a mixture distribution, but the parameters of the components are governed by a Markov chain which is unobservable. The issues related to the estimation of Poisson-hidden Markov models, in which the observations come from a mixture of Poisson distributions and the parameters of the component Poisson distributions are governed by an m-state Markov chain with an unknown transition probability matrix, are explained here. These methods were applied to data on Vibrio cholerae counts reported every month over an 11-year span at Christian Medical College, Vellore, India. Using the Viterbi algorithm, the best estimate of the state sequence was obtained, and hence the transition probability matrix. The mean passage times between the states were estimated. The 95% confidence interval for the mean passage time was estimated via Monte Carlo simulation. The three hidden states of the estimated Markov chain are labelled as 'Low', 'Moderate' and 'High', with mean counts of 1.4, 6.6 and 20.2 and estimated average durations of stay of 3, 3 and 4 months, respectively. Environmental risk factors were studied using Markov ordinal logistic regression analysis. No significant association was found between disease severity levels and climate components.
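The Viterbi decoding step for a Poisson-hidden Markov model can be sketched directly as log-space dynamic programming. The three state means echo the 'Low'/'Moderate'/'High' labels, but the transition matrix, initial distribution, and counts here are assumed toy values, not the fitted cholera model.

```python
import numpy as np
from scipy.stats import poisson

def viterbi_poisson(y, pi0, A, lambdas):
    """Most likely hidden state sequence for a Poisson-HMM (log-space Viterbi)."""
    y = np.asarray(y)
    n, m = len(y), len(lambdas)
    logB = poisson.logpmf(y[:, None], lambdas)   # emission log-probabilities
    delta = np.log(pi0) + logB[0]
    back = np.zeros((n, m), dtype=int)
    for t in range(1, n):
        scores = delta[:, None] + np.log(A)      # best predecessor per state
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[t]
    path = np.empty(n, dtype=int)
    path[-1] = delta.argmax()
    for t in range(n - 2, -1, -1):               # backtrack
        path[t] = back[t + 1, path[t + 1]]
    return path

lambdas = np.array([1.4, 6.6, 20.2])             # 'Low', 'Moderate', 'High' means
A = np.array([[0.70, 0.25, 0.05],                # assumed transition matrix
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])
pi0 = np.array([0.5, 0.3, 0.2])                  # assumed initial distribution
counts = [1, 0, 2, 7, 6, 8, 22, 19, 21, 5, 1, 0]  # toy monthly counts
states = viterbi_poisson(counts, pi0, A, lambdas)
```

The decoded path segments the count series into epidemic-severity regimes, from which transition probabilities and mean passage times can then be tabulated.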
Directory of Open Access Journals (Sweden)
Pape Sarah A
2009-02-01
Full Text Available Abstract Background Laser-Doppler imaging (LDI of cutaneous blood flow is beginning to be used by burn surgeons to predict the healing time of burn wounds; predicted healing time is used to determine wound treatment as either dressings or surgery. In this paper, we do a statistical analysis of the performance of the technique. Methods We used data from a study carried out by five burn centers: LDI was done once between days 2 to 5 post burn, and healing was assessed at both 14 days and 21 days post burn. Random-effects ordinal logistic regression and other models such as the continuation ratio model were used to model healing-time as a function of the LDI data, and of demographic and wound history variables. Statistical methods were also used to study the false-color palette, which enables the laser-Doppler imager to be used by clinicians as a decision-support tool. Results Overall performance is that diagnoses are over 90% correct. Related questions addressed were what was the best blood flow summary statistic and whether, given the blood flow measurements, demographic and observational variables had any additional predictive power (age, sex, race, % total body surface area burned (%TBSA, site and cause of burn, day of LDI scan, burn center. It was found that mean laser-Doppler flux over a wound area was the best statistic, and that, given the same mean flux, women recover slightly more slowly than men. Further, the likely degradation in predictive performance on moving to a patient group with larger %TBSA than those in the data sample was studied, and shown to be small. Conclusion Modeling healing time is a complex statistical problem, with random effects due to multiple burn areas per individual, and censoring caused by patients missing hospital visits and undergoing surgery. This analysis applies state-of-the art statistical methods such as the bootstrap and permutation tests to a medical problem of topical interest. New medical findings are
Setiawan, Suhartono, Ahmad, Imam Safawi; Rahmawati, Noorgam Ika
2015-12-01
Bank Indonesia (BI), as the central bank of the Republic of Indonesia, has a single overarching objective: to establish and maintain rupiah stability. This objective can be achieved by monitoring the traffic of inflow and outflow of money currency. Inflow and outflow are related to the stock and distribution of money currency around Indonesian territory and affect economic activities. Economic activity in Indonesia, as a predominantly Moslem country, is closely related to the Islamic calendar (lunar calendar), which differs from the Gregorian calendar. This research aims to forecast the inflow and outflow of money currency for the Representative Office (RO) of BI Semarang, Central Java region. The results of the analysis show that the characteristics of inflow and outflow of money currency are influenced by calendar-variation effects, namely the day of Eid al-Fitr (a Moslem holiday), as well as seasonal patterns. In addition, the particular week in which Eid al-Fitr falls also affects the increase of inflow and outflow of money currency. The best model based on the smallest Root Mean Square Error (RMSE) for the inflow data is an ARIMA model, while the best model for predicting the outflow data at the RO of BI Semarang is an ARIMAX model or Time Series Regression, both of which reduce to the same model. The forecasts for 2015 show an increase in inflow of money currency in August, and an increase in outflow of money currency in July.
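The calendar-variation idea (a regressor marking the Gregorian month containing Eid al-Fitr, which drifts because the lunar year is about 11 days shorter) can be sketched as a simple time-series regression. Everything here is synthetic: the trend, seasonal form, effect sizes, and the drift rule are assumptions of the example, not BI data.

```python
import numpy as np

rng = np.random.default_rng(5)
n_years, n_months = 6, 72
t = np.arange(n_months)

# Assumed drift: Eid moves earlier through the Gregorian year over time
eid_month = [(8 - int(11 * yr / 30)) % 12 for yr in range(n_years)]
eid = np.zeros(n_months)
for yr, m in enumerate(eid_month):
    eid[12 * yr + m] = 1.0                      # calendar-variation dummy

season = np.sin(2 * np.pi * t / 12)             # simple seasonal term
y = 10.0 + 0.05 * t + 1.5 * season + 4.0 * eid + rng.normal(scale=0.3, size=n_months)

# Time-series regression: trend + seasonality + Eid calendar effect
X = np.column_stack([np.ones(n_months), t, season, eid])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
eid_effect = beta[3]
```

Because the dummy moves with the lunar calendar rather than repeating at a fixed month, it captures the Eid spike that a purely seasonal (ARIMA-style) model would smear across months, which is the rationale for the ARIMAX/time-series-regression specification.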
Shen, Nicole T; Maw, Anna; Tmanova, Lyubov L; Pino, Alejandro; Ancy, Kayley; Crawford, Carl V; Simon, Matthew S; Evans, Arthur T
2017-06-01
Systematic reviews have provided evidence for the efficacy of probiotics in preventing Clostridium difficile infection (CDI), but guidelines do not recommend probiotic use for prevention of CDI. We performed an updated systematic review to help guide clinical practice. We searched MEDLINE, EMBASE, International Journal of Probiotics and Prebiotics, and The Cochrane Library databases for randomized controlled trials evaluating use of probiotics and CDI in hospitalized adults taking antibiotics. Two reviewers independently extracted data and assessed risk of bias and overall quality of the evidence. Primary and secondary outcomes were incidence of CDI and adverse events, respectively. Secondary analyses examined the effects of probiotic species, dose, timing, formulation, duration, and study quality. We analyzed data from 19 published studies, comprising 6261 subjects. The incidence of CDI in the probiotic cohort, 1.6% (54 of 3277), was lower than that of controls, 3.9% (115 of 2984) (P < .001). The relative risk of CDI in probiotic users was 0.42 (95% confidence interval, 0.30-0.57; I² = 0.0%). Meta-regression analysis demonstrated that probiotics were significantly more effective if given closer to the first antibiotic dose, with a decrement in efficacy for every day of delay in starting probiotics (P = .04); probiotics given within 2 days of antibiotic initiation produced a greater reduction of risk for CDI (relative risk, 0.32; 95% confidence interval, 0.22-0.48; I² = 0%) than later administration (relative risk, 0.70; 95% confidence interval, 0.40-1.23; I² = 0%) (P = .02). There was no increased risk for adverse events among patients given probiotics. The overall quality of the evidence was high. In a systematic review with meta-regression analysis, we found evidence that administration of probiotics closer to the first dose of antibiotic reduces the risk of CDI by >50% in hospitalized adults. Future research should focus on optimal probiotic dose, species, and formulation. Systematic
Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.
2017-01-01
Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...
HIT-6 and MIDAS as measures of headache disability in a headache referral population.
Sauro, Khara M; Rose, Marianne S; Becker, Werner J; Christie, Suzanne N; Giammarco, Rose; Mackie, Gordon F; Eloff, Arnoldas G; Gawel, Marek J
2010-03-01
The objective of this study was to compare the headache impact test (HIT-6) and the migraine disability assessment scale (MIDAS) as clinical measures of headache-related disability. The degree of headache-related disability is an important factor in treatment planning. Many quality of life and headache disability measures exist, but it is unclear which of the available disability measures is the most helpful in planning and measuring headache management. We compared HIT-6 and MIDAS scores from 798 patients from the Canadian Headache Outpatient Registry and Database (CHORD). Correlation and regression analyses were used to examine the relationships between the HIT-6 and MIDAS total scores, headache frequency and intensity, and Beck Depression Inventory (BDI-II) scores. A positive correlation was found between HIT-6 and MIDAS scores (r = 0.52). The BDI-II scores correlated equally with the HIT-6 and the MIDAS (r = 0.42). There was a non-monotonic relationship between headache frequency and the MIDAS, and a non-linear monotonic relationship between headache frequency and the HIT-6 (r = 0.24). The correlation was higher between the intensity and the HIT-6 scores (r = 0.46) than the MIDAS (r = 0.26) scores. Seventy-nine percent of patients fell into the most severe HIT-6 disability category, compared with the 57% of patients that fell into the most severe MIDAS disability category. Significantly more patients were placed in a more severe category with the HIT-6 than with the MIDAS (McNemar chi-square = 191 on 6 d.f.). The HIT-6 and MIDAS appear to measure headache-related disability in a similar fashion. However, some important differences may exist. Headache intensity appears to influence the HIT-6 score more than the MIDAS, whereas the MIDAS was influenced more by headache frequency. Using the HIT-6 and MIDAS together may give a more accurate assessment of a patient's headache-related disability.
International Nuclear Information System (INIS)
Bukhari, W; Hong, S-M
2015-01-01
Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR + , implements a gating function without pre-specifying a particular region of the patient’s breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR + algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR + implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR + in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR + . The experimental results show that the EKF-GPR + algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR + reduces the patient-wise RMS error to 37%, 39% and 42
Bukhari, W.; Hong, S.-M.
2015-01-01
Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR+, implements a gating function without pre-specifying a particular region of the patient’s breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR+ algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR+ implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR+ in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR+. The experimental results show that the EKF-GPR+ algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR+ reduces the patient-wise RMS error to 37%, 39% and 42% in
Bukhari, W; Hong, S-M
2015-01-07
Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR(+), implements a gating function without pre-specifying a particular region of the patient's breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR(+) algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR(+) implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR(+) in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR(+). The experimental results show that the EKF-GPR(+) algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR(+) reduces the patient-wise RMS error to 37%, 39% and
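The three records above describe gating the treatment beam on the Gaussian process regressor's predictive variance. A minimal numpy-only sketch of that one idea follows; the sinusoidal "breathing" trace, kernel length, and gating threshold are all illustrative stand-ins, and the paper's LCM-EKF cascade is not reproduced.

```python
import numpy as np

# Variance-based beam gating (sketch): a GP regressor gives a predictive
# variance per upcoming sample; the beam is paused where variance is high.
def rbf(a, b, length=1.0, sigma=1.0):
    """Squared-exponential kernel matrix between 1-d sample vectors a, b."""
    d = a[:, None] - b[None, :]
    return sigma**2 * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)
t_train = np.linspace(0, 10, 60)
y_train = np.sin(t_train) + 0.05 * rng.standard_normal(60)  # noisy "breathing" trace
t_test = np.linspace(0, 14, 80)                             # extends past training data

noise = 0.05**2
K = rbf(t_train, t_train) + noise * np.eye(60)
Ks = rbf(t_test, t_train)

mean = Ks @ np.linalg.solve(K, y_train)                     # posterior mean
var = rbf(t_test, t_test).diagonal() - np.einsum(
    "ij,ij->i", Ks, np.linalg.solve(K, Ks.T).T)             # posterior variance

gate_off = np.sqrt(var) > 0.2   # pause the beam where prediction is uncertain
print("beam paused for", int(gate_off.sum()), "of 80 samples")
```

As in the paper, no separate detection mechanism is needed: the gating decision is a simple threshold on the predictive standard deviation, and it naturally switches the beam off where the regressor extrapolates beyond its training data.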
Pattern recognition with vector hits
International Nuclear Information System (INIS)
Frühwirth, R
2012-01-01
Trackers at the future high-luminosity LHC, designed to have triggering capability, will feature layers of stacked modules with a small stack separation. This will allow the reconstruction of track stubs or vector hits with position and direction information, but lacking precise curvature information. This opens up new possibilities for track finding, online and offline. Two track finding methods, the Kalman filter and the convergent Hough transform, are studied in this context. Results from a simplified fast simulation are presented. It is shown that the performance of the methods depends to a large extent on the size of the stack separation. We conclude that the detector design and the choice of the track finding algorithm(s) are strongly coupled and should proceed conjointly.
Markov chains with quasitoeplitz transition matrix: first zero hitting
Directory of Open Access Journals (Sweden)
Alexander M. Dukhovny
1989-01-01
This paper continues the investigation of Markov Chains with a quasitoeplitz transition matrix. Generating functions of first zero hitting probabilities and mean times are found by the solution of special Riemann boundary value problems on the unit circle. Duality is discussed.
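The abstract above solves the infinite quasi-Toeplitz case analytically via Riemann boundary value problems. For intuition, the same first-hitting quantity can be computed for a finite chain by solving a small linear system directly; the birth-death chain below is a generic illustration, not the paper's model.

```python
import numpy as np

# First-hitting probabilities for a birth-death chain on {0, 1, ..., N}
# with absorbing barriers at 0 and N. The probability h[i] of hitting 0
# first satisfies h[i] = p_down*h[i-1] + p_up*h[i+1], h[0] = 1, h[N] = 0.
def hit_zero_prob(N, p_up):
    p_down = 1.0 - p_up
    A = np.zeros((N + 1, N + 1))
    b = np.zeros(N + 1)
    A[0, 0] = 1.0; b[0] = 1.0          # absorbed at 0
    A[N, N] = 1.0; b[N] = 0.0          # absorbed at N
    for i in range(1, N):
        A[i, i] = 1.0
        A[i, i - 1] = -p_down
        A[i, i + 1] = -p_up
    return np.linalg.solve(A, b)

h = hit_zero_prob(10, 0.5)
print(round(h[3], 3))  # symmetric walk: h[i] = 1 - i/N, so 0.7
```

For the symmetric walk this reproduces the classical gambler's-ruin answer h[i] = 1 - i/N; the generating-function machinery of the paper is what replaces this linear solve when the state space is infinite.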
Kühn, Michael; Schöne, Tim
2017-04-01
Water management tools are essential to ensure the conservation of natural resources. The geothermal hot water reservoir below the village of Waiwera, on the North Island of New Zealand, has been used commercially since 1863. The continuous production of 50 °C hot geothermal water to supply hotels and spas has had a negative impact on the reservoir. Until 1969 the warm water flow was artesian from all wells drilled; due to overproduction, the water nowadays needs to be pumped. Further, in the years 1975 to 1976 the warm water seeps on the beach of Waiwera ran dry. In order to protect the reservoir and the historical and tourist site, a water management plan was deployed in the early 1980s. The "Auckland Council" established guidelines to enable a sustainable management of the resource [1]. The management plan demands that the water level in the official and appropriate observation well of the council average 0.5 m above sea level throughout the year. Almost four decades of data (from 1978 until today) are now available [2]. For sustainable water management, it is necessary to be able to forecast the water level as a function of the production rates in the production wells. The best predictions are provided by a multivariate regression model of the water level and production rate time series, which takes into account the production rates of individual wells. It is based on the inversely proportional relationship between the independent variable (production rate) and the dependent variable (measured water level). In production scenarios, a maximum total production rate of approx. 1,100 m3/day is determined in order to comply with the guidelines of the "Auckland Council". [1] Kühn M., Stöfen H. (2005) A reactive flow model of the geothermal reservoir Waiwera, New Zealand. Hydrogeology Journal 13, 606-626, doi: 10.1007/s10040-004-0377-6 [2] Kühn M., Altmannsberger C. (2016) Assessment of data driven and process based water management tools for
2012-04-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...
2011-05-04
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...
2013-05-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations AGENCY: Office of the National Coordinator for Health Information... 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy...
Zhang, Chi; Reichgelt, Han; Rutherfoord, Rebecca H.; Wang, Andy Ju An
2014-01-01
Health Information Technology (HIT) professionals are in increasing demand as healthcare providers need help in the adoption and meaningful use of Electronic Health Record (EHR) systems while the HIT industry needs workforce skilled in HIT and EHR development. To respond to this increasing demand, the School of Computing and Software Engineering…
Statistics of hits to bone cell nuclei
International Nuclear Information System (INIS)
Kruglikov, I.L.; Polig, E.; Jee, W.S.S.
1993-01-01
The statistics of hits to the nuclei of bone cells irradiated from alpha sources labeling bone tissue is described. It is shown that the law of remodeling of a bone structural unit (BSU), which describes the distribution of quiescence periods of this unit, affects the statistics of hits. If the irradiation of bone cells occurs during the whole cell cycle, the mean number of hits is independent of the law of remodeling. In this case the variance of hits has the minimum value for constant quiescence periods of BSUs (deterministic remodeling) and the maximum value for exponentially distributed quiescence periods (random remodeling). For the first generation of bone cells, i.e. for the cells which existed at the moment of the uptake of the nuclide, the mean number of hits depends on the law of remodeling. For random remodeling the mean number is equal to the mean value for the complete remodeling cycle. For deterministic remodeling the mean is only half this value. For the first generation of bone cells, changing the law of remodeling from random to deterministic increases the probability of no hits to the nuclei of bone cells. For the same mean value of hits, the difference does not exceed 13.3% of the total number of cells. For the subsequent generations of bone cells, such a change of the law of remodeling decreases the probability of no hits by up to 20.4% of the total number of cells. (orig.)
Ali, M Sanni; Groenwold, Rolf H H; Belitser, Svetlana V; Souverein, Patrick C; Martín, Elisa; Gatto, Nicolle M; Huerta, Consuelo; Gardarsdottir, Helga; Roes, Kit C B; Hoes, Arno W; de Boer, Antonius; Klungel, Olaf H
2016-01-01
BACKGROUND: Observational studies including time-varying treatments are prone to confounding. We compared time-varying Cox regression analysis, propensity score (PS) methods, and marginal structural models (MSMs) in a study of antidepressant [selective serotonin reuptake inhibitors (SSRIs)] use and
Jeandron, Aurélie; Saidi, Jaime Mufitini; Kapama, Alois; Burhole, Manu; Birembano, Freddy; Vandevelde, Thierry; Gasparrini, Antonio; Armstrong, Ben; Cairncross, Sandy; Ensink, Jeroen H. J.
2015-01-01
Background The eastern provinces of the Democratic Republic of the Congo have been identified as endemic areas for cholera transmission, and despite continuous control efforts, they continue to experience regular cholera outbreaks that occasionally spread to the rest of the country. In a region where access to improved water sources is particularly poor, the question of which improvements in water access should be prioritized to address cholera transmission remains unresolved. This study aimed at investigating the temporal association between water supply interruptions and Cholera Treatment Centre (CTC) admissions in a medium-sized town. Methods and Findings Time-series patterns of daily incidence of suspected cholera cases admitted to the Cholera Treatment Centre in Uvira in South Kivu Province between 2009 and 2014 were examined in relation to the daily variations in volume of water supplied by the town water treatment plant. Quasi-Poisson regression and distributed lag nonlinear models up to 12 d were used, adjusting for daily precipitation rates, day of the week, and seasonal variations. A total of 5,745 patients over 5 y of age with acute watery diarrhoea symptoms were admitted to the CTC over the study period of 1,946 d. Following a day without tap water supply, the suspected cholera incidence rate increased on average by 155% over the next 12 d, corresponding to a rate ratio of 2.55 (95% CI: 1.54–4.24), compared to the incidence experienced after a day with optimal production (defined as the 95th percentile, 4,794 m3). Suspected cholera cases attributable to a suboptimal tap water supply reached 23.2% of total admissions (95% CI 11.4%–33.2%). Although generally reporting fewer admissions to the CTC, neighbourhoods with a higher consumption of tap water were more affected by water supply interruptions, with a rate ratio of 3.71 (95% CI: 1.91–7.20) and an attributable fraction of cases of 31.4% (95% CI: 17.3%–42.5%). The analysis did not suggest any
Jeandron, Aurélie; Saidi, Jaime Mufitini; Kapama, Alois; Burhole, Manu; Birembano, Freddy; Vandevelde, Thierry; Gasparrini, Antonio; Armstrong, Ben; Cairncross, Sandy; Ensink, Jeroen H J
2015-10-01
The eastern provinces of the Democratic Republic of the Congo have been identified as endemic areas for cholera transmission, and despite continuous control efforts, they continue to experience regular cholera outbreaks that occasionally spread to the rest of the country. In a region where access to improved water sources is particularly poor, the question of which improvements in water access should be prioritized to address cholera transmission remains unresolved. This study aimed at investigating the temporal association between water supply interruptions and Cholera Treatment Centre (CTC) admissions in a medium-sized town. Time-series patterns of daily incidence of suspected cholera cases admitted to the Cholera Treatment Centre in Uvira in South Kivu Province between 2009 and 2014 were examined in relation to the daily variations in volume of water supplied by the town water treatment plant. Quasi-Poisson regression and distributed lag nonlinear models up to 12 d were used, adjusting for daily precipitation rates, day of the week, and seasonal variations. A total of 5,745 patients over 5 y of age with acute watery diarrhoea symptoms were admitted to the CTC over the study period of 1,946 d. Following a day without tap water supply, the suspected cholera incidence rate increased on average by 155% over the next 12 d, corresponding to a rate ratio of 2.55 (95% CI: 1.54-4.24), compared to the incidence experienced after a day with optimal production (defined as the 95th percentile, 4,794 m3). Suspected cholera cases attributable to a suboptimal tap water supply reached 23.2% of total admissions (95% CI 11.4%-33.2%). Although generally reporting fewer admissions to the CTC, neighbourhoods with a higher consumption of tap water were more affected by water supply interruptions, with a rate ratio of 3.71 (95% CI: 1.91-7.20) and an attributable fraction of cases of 31.4% (95% CI: 17.3%-42.5%). The analysis did not suggest any association between levels of residual
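The cholera study above fits Poisson-family regressions with lagged exposure terms. A minimal numpy-only sketch of that design follows, on synthetic data: the exposure (a day without tap water) is copied at lags 0 to 2 days and a plain Poisson model is fitted by iteratively reweighted least squares. The study's actual models are quasi-Poisson with nonlinear lag structures up to 12 days, so this is the skeleton of the method only.

```python
import numpy as np

# Distributed-lag Poisson regression (sketch) on synthetic admission counts.
rng = np.random.default_rng(2)
n = 1000
no_water = rng.binomial(1, 0.15, size=n).astype(float)  # 1 = day without tap water

def lag(x, k):
    """Shift the exposure series forward by k days (zero-padded)."""
    out = np.zeros_like(x)
    out[k:] = x[:-k] if k else x
    return out

X = np.column_stack([np.ones(n), lag(no_water, 0), lag(no_water, 1), lag(no_water, 2)])
true_beta = np.array([1.0, 0.3, 0.4, 0.2])              # log-rate scale
y = rng.poisson(np.exp(X @ true_beta))                  # daily admissions

beta = np.zeros(4)
for _ in range(25):                                     # Newton / IRLS iterations
    mu = np.exp(X @ beta)
    beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))

rr_cumulative = np.exp(beta[1:].sum())                  # rate ratio over all lags
print("cumulative rate ratio:", round(rr_cumulative, 2))
```

Summing the lag coefficients on the log scale and exponentiating gives the cumulative rate ratio over the lag window, which is the analogue of the study's 12-day rate ratio of 2.55.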
Matson, Johnny L.; Kozlowski, Alison M.
2010-01-01
Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…
Baresel, Björn; Bucher, Hugo; Bagherpour, Borhan; Brosse, Morgane; Guodun, Kuang; Schaltegger, Urs
2017-03-06
New high-resolution U-Pb dates indicate a duration of 89 ± 38 kyr for the Permian hiatus and of 14 ± 57 kyr for the overlying Triassic microbial limestone in shallow water settings of the Nanpanjiang Basin, South China. The age and duration of the hiatus coincide with the Permian-Triassic boundary (PTB) and the extinction interval in the Meishan Global Stratotype Section and Point, and strongly support a glacio-eustatic regression, which best explains the genesis of the worldwide hiatus straddling the PTB in shallow water records. In adjacent deep marine troughs, rates of sediment accumulation display a six-fold decrease across the PTB compatible with a drier and cooler climate as indicated by terrestrial plants. Our model of the Permian-Triassic boundary mass extinction (PTBME) hinges on the synchronicity of the hiatus with the onset of the Siberian Traps volcanism. This early eruptive phase released sulfur-rich volatiles into the stratosphere, thus simultaneously eliciting a short-lived ice age responsible for the global regression and a brief but intense acidification. Abrupt cooling, shrunk habitats on shelves and acidification may all have synergistically triggered the PTBME. Subsequently, the build-up of volcanic CO2 induced a transient cool climate whose early phase saw the deposition of the microbial limestone.
Directory of Open Access Journals (Sweden)
Zhi-Sai Ma
2017-01-01
Modal parameter estimation plays an important role in vibration-based damage detection and is worth more attention and investigation, as changes in modal parameters are usually being used as damage indicators. This paper focuses on the problem of output-only modal parameter recursive estimation of time-varying structures based upon parameterized representations of the time-dependent autoregressive moving average (TARMA). A kernel ridge regression functional series TARMA (FS-TARMA) recursive identification scheme is proposed and subsequently employed for the modal parameter estimation of a numerical three-degree-of-freedom time-varying structural system and a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudolinear regression FS-TARMA approach via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics in a recursive manner.
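Kernel ridge regression is the building block of the FS-TARMA scheme described above. A minimal batch-form example on toy data follows (the paper's version is recursive and embedded in a TARMA parametrization; the kernel length and ridge parameter here are illustrative).

```python
import numpy as np

# Batch kernel ridge regression with a squared-exponential kernel.
def krr_fit_predict(x_train, y_train, x_test, length=0.5, lam=1e-3):
    def k(a, b):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)
    K = k(x_train, x_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return k(x_test, x_train) @ alpha

x = np.linspace(0, 2 * np.pi, 50)
y = np.sin(3 * x) * np.exp(-0.2 * x)       # signal with "time-varying" amplitude
y_hat = krr_fit_predict(x, y, x)
print("max in-sample error:", round(float(np.abs(y_hat - y).max()), 3))
```

The ridge term `lam` is what keeps the kernel system well conditioned; in the recursive FS-TARMA setting the same regularized solve is updated sample by sample rather than refitted in batch.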
Olive, David J
2017-01-01
This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...
Singh, S.; Jaishi, H. P.; Tiwari, R. P.; Tiwari, R. C.
2017-07-01
This paper reports the analysis of soil radon data recorded in the seismic zone-V, located in the northeastern part of India (latitude 23.73N, longitude 92.73E). Continuous measurements of soil-gas emission along the Chite fault in Mizoram (India) were carried out, with replacement of the solid-state nuclear track detectors at weekly intervals. The present study was done for the period from March 2013 to May 2015 using LR-115 Type II detectors, manufactured by Kodak Pathe, France. In order to reduce the influence of meteorological parameters, statistical analysis tools such as multiple linear regression and artificial neural networks have been used. A decrease in radon concentration was recorded prior to some earthquakes that occurred during the observation period. Some false anomalies were also recorded, which may be attributed to ongoing crustal deformation that was not major enough to produce an earthquake.
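The radon study above uses multiple linear regression to strip meteorological influence before looking for precursory anomalies. The sketch below shows that workflow on synthetic weekly data: regress radon on weather covariates, then flag unusually low residuals. Variable names, coefficients, and the anomaly threshold are illustrative, not the study's.

```python
import numpy as np

# Remove meteorological influence by regression, then flag low residuals.
rng = np.random.default_rng(3)
n = 120                                      # weekly readings
temp = 20 + 8 * rng.standard_normal(n)
rain = rng.exponential(5.0, size=n)
pressure = 1010 + 5 * rng.standard_normal(n)

radon = 500 + 6 * temp - 4 * rain - 0.5 * pressure + 30 * rng.standard_normal(n)
radon[60] -= 250                             # injected "pre-seismic" drop

X = np.column_stack([np.ones(n), temp, rain, pressure])
coef, *_ = np.linalg.lstsq(X, radon, rcond=None)
residual = radon - X @ coef                  # weather-corrected radon signal

z = (residual - residual.mean()) / residual.std()
anomalies = np.flatnonzero(z < -2.5)         # unusually low residual radon
print("anomalous weeks:", anomalies)
```

The injected drop at week 60 survives the correction because it is uncorrelated with the weather covariates, which is exactly the property that makes residual-based anomaly screening useful here.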
Directory of Open Access Journals (Sweden)
N. Zahir
2015-12-01
Lake Urmia is one of the most important ecosystems of the country, and it is on the verge of disappearing. Many factors contribute to this crisis; among them, precipitation plays an important role. Precipitation takes many forms, one of which is snow. The snow on Sahand Mountain is one of the main and most important sources of Lake Urmia's water. Snow depth (SD) is a vital parameter for estimating the water balance in future years. In this regard, this study focuses on the SD parameter using the Special Sensor Microwave/Imager (SSM/I) instrument on board the Defense Meteorological Satellite Program (DMSP) F16 satellite. The usual statistical methods for retrieving SD include linear and non-linear ones, which use a least-squares procedure to estimate the SD model. Recently, kernel-based methods have been widely used for statistical modelling. Among these methods, support vector regression (SVR) achieves high performance on statistical modelling problems. Examination of the obtained data shows the existence of outliers, and a wavelet denoising method is applied to remove them. After the omission of the outliers, the optimum bands and parameters for SVR must be selected. To this end, feature selection methods have shown a direct effect on improving regression performance. We used a genetic algorithm (GA) to select suitable SSM/I bands for estimating the SD model. The results for the training and testing data on Sahand Mountain [R²_TEST = 0.9049 and RMSE = 6.9654] show the high performance of SVR.
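The snow-depth study above couples a genetic algorithm for band selection with an SVR model. The toy sketch below shows the GA band-selection loop only, with a plain least-squares fit standing in for SVR so the example stays dependency-free; the data, fitness penalty, and GA settings are all illustrative.

```python
import numpy as np

# GA-style feature (band) selection: evolve boolean masks over bands,
# scoring each mask by regression fit minus a penalty per selected band.
rng = np.random.default_rng(4)
n, n_bands = 200, 8
X = rng.standard_normal((n, n_bands))
snow_depth = 2.0 * X[:, 1] - 1.5 * X[:, 4] + 0.1 * rng.standard_normal(n)

def fitness(mask):
    if not mask.any():
        return -np.inf
    Xs = X[:, mask]
    coef, *_ = np.linalg.lstsq(Xs, snow_depth, rcond=None)
    resid = snow_depth - Xs @ coef
    return -resid.var() - 0.01 * mask.sum()      # penalize extra bands

pop = rng.random((30, n_bands)) < 0.5            # random initial masks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]      # elitism: keep the 10 fittest
    children = parents[rng.integers(10, size=(20,))].copy()
    flips = rng.random(children.shape) < 0.05    # mutation
    children[flips] = ~children[flips]
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected bands:", np.flatnonzero(best))
```

With only 8 synthetic "bands" the GA reliably rediscovers the two informative ones; in the study the same search runs over real SSM/I channels with SVR accuracy as the fitness.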
A C++ object-oriented toolkit for track finding with k-dimensional hits
International Nuclear Information System (INIS)
Uiterwijk, J.W.E.; Panman, J.; Vyver, B. van de
2006-01-01
A library is described for the recognition of tracks in a set of hits. The hits are assumed to be k-dimensional points (k-d), with k>=1, of which a subset can be grouped into tracks by using short-range correlations. A connection graph between the hits is created by sorting the hits first in k-d space using one of the developed, fast, k-space containers. The track-finding algorithm considers any connection between two hits as a possible track seed and grows these seeds into longer track segments using a modified depth-first search of the connection graph. All hit-acceptance decisions are called via abstract methods of an acceptance criterion class which isolates the library from the application's hit and track model. An application is tuned for a particular tracking environment by creating a concrete implementation for the hit and track acceptance calculations. The implementer is free to trade tracking time for acceptance complexity (influencing efficiency) depending on the requirements of the particular application. Results for simulated data show that the track finding is both efficient and fast even for high noise environments
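The toolkit abstract above describes growing track seeds through a hit-connection graph with a depth-first search, with all acceptance decisions delegated to a user-supplied criterion. A minimal 2-d stand-in for that idea follows (the real library works in k dimensions, uses fast spatial containers instead of brute-force distances, and is written in C++; the angular cutoff here is an illustrative acceptance criterion).

```python
import numpy as np

# Seed-and-grow track finding (sketch): hits closer than a cutoff form the
# connection graph; a DFS grows each two-hit seed into the longest
# locally-straight segment.
hits = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [3.0, 0.3],  # a track
                 [1.2, 3.0], [2.8, 1.7]])                          # noise hits

def neighbours(i, cutoff=1.5):
    d = np.linalg.norm(hits - hits[i], axis=1)
    return [j for j in range(len(hits)) if j != i and d[j] < cutoff]

def accept(track, j):
    """Acceptance criterion: the new hit roughly continues the direction."""
    if len(track) < 2:
        return True
    v_old = hits[track[-1]] - hits[track[-2]]
    v_new = hits[j] - hits[track[-1]]
    cos = v_old @ v_new / (np.linalg.norm(v_old) * np.linalg.norm(v_new))
    return cos > 0.95

def grow(track):
    best = track
    for j in neighbours(track[-1]):
        if j not in track and accept(track, j):
            cand = grow(track + [j])
            if len(cand) > len(best):
                best = cand
    return best

longest = max((grow([i, j]) for i in range(len(hits)) for j in neighbours(i)),
              key=len)
print("longest track:", longest)  # [0, 1, 2, 3]
```

As in the library, the `accept` callback is the tuning point: making it stricter costs efficiency but suppresses noise hits, which is the time/acceptance trade-off the abstract mentions.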
The validation of Huffaz Intelligence Test (HIT)
Rahim, Mohd Azrin Mohammad; Ahmad, Tahir; Awang, Siti Rahmah; Safar, Ajmain
2017-08-01
In general, a hafiz who can memorize the Quran has many specialties, especially with respect to academic performance. In this study, the theory of multiple intelligences introduced by Howard Gardner is embedded in a developed psychometric instrument, namely the Huffaz Intelligence Test (HIT). This paper presents the validation and reliability of the HIT for tahfiz students in Malaysian Islamic schools. A pilot study was conducted involving 87 huffaz who were randomly selected to answer the items in the HIT. The analysis methods used include Partial Least Squares (PLS) for reliability, convergent validity, and discriminant validity. The study validated nine intelligences. The findings also indicated that the composite reliabilities for the nine types of intelligences are greater than 0.8. Thus, the HIT is a valid and reliable instrument to measure multiple intelligences among huffaz.
Baresel, Bjoern; Bucher, Hugo; Bagherpour, Borhan; Brosse, Morgane; Guodun, Kuang; Schaltegger, Urs
2017-04-01
High-precision U-Pb dating of single-zircon crystals by chemical abrasion-isotope dilution-thermal ionization mass spectrometry (CA-ID-TIMS) is applied to volcanic beds that are intercalated in sedimentary sequences across the Permian-Triassic boundary (PTB). By assuming that the zircon crystallization age closely approximates that of the volcanic eruption and subsequent deposition, U-Pb zircon geochronology is the preferred approach for dating abiotic and biotic events, such as the formational PTB and the Permian-Triassic boundary mass extinction (PTBME). We will present new U-Pb zircon dates for a series of volcanic ash beds in shallow-marine Permian-Triassic sections in the Nanpanjiang Basin, South China. These high-resolution U-Pb dates indicate a duration of 90 ± 38 kyr for the Permian sedimentary hiatus and a duration of 13 ± 57 kyr for the overlying Triassic microbial limestone in the shallow water settings of the Nanpanjiang pull-apart basin. The age and duration of the hiatus coincide with the formational PTB and the extinction interval in the Meishan Global Stratotype Section and Point, thus strongly supporting a glacio-eustatic regression, which best explains the genesis of the worldwide hiatus straddling the PTB in shallow water records. In adjacent deep marine troughs, rates of sediment accumulation display a six-fold decrease across the PTB compatible with a drier and cooler climate during the Griesbachian as indicated by terrestrial plants. Our model of the PTBME hinges on the synchronicity of the hiatus with the onset of the Siberian Traps volcanism. This early eruptive phase likely released sulfur-rich volatiles into the stratosphere, thus simultaneously eliciting a short-lived ice age responsible for the global regression and a brief but intense acidification. Abrupt cooling, shrunk habitats on shelves and acidification may all have synergistically triggered the PTBME. Subsequently, the build-up of volcanic CO2 induced this transient cool
How I treat double-hit lymphoma.
Friedberg, Jonathan W
2017-08-03
The 2016 revision of the World Health Organization (WHO) classification for lymphoma has included a new category of lymphoma, separate from diffuse large B-cell lymphoma, termed high-grade B-cell lymphoma with translocations involving myc and bcl-2 or bcl-6. These lymphomas are referred to as double-hit lymphomas (or triple-hit lymphomas if all 3 rearrangements are present). It is important to differentiate these lymphomas from the larger group of double-expressor lymphomas, which have increased expression of MYC and BCL-2 and/or BCL-6 by immunohistochemistry, using variable cutoff percentages to define positivity. Patients with double-hit lymphomas have a poor prognosis when treated with standard chemoimmunotherapy and have an increased risk of central nervous system involvement and progression. Double-hit lymphomas may arise as a consequence of the transformation of an underlying indolent lymphoma. There are no published prospective trials in double-hit lymphoma; however, retrospective studies strongly suggest that aggressive induction regimens may confer a superior outcome. In this article, I review my approach to the evaluation and treatment of double-hit lymphoma, with an eye toward future clinical trials incorporating rational targeted agents into the therapeutic armamentarium. © 2017 by The American Society of Hematology.
Directory of Open Access Journals (Sweden)
Ping Jiang
2015-01-01
Wind speed/power has received increasing attention worldwide due to its renewable nature as well as environmental friendliness. With the global installed wind power capacity rapidly increasing, the wind industry is growing into a large-scale business. Reliable short-term wind speed forecasts play a practical and crucial role in wind energy conversion systems, such as the dynamic control of wind turbines and power system scheduling. In this paper, an intelligent hybrid model for short-term wind speed prediction is examined; the model is based on cross correlation (CC) analysis and a support vector regression (SVR) model coupled with brainstorm optimization (BSO) and cuckoo search (CS) algorithms, which are successfully utilized for parameter determination. The proposed hybrid models were used to forecast short-term wind speeds collected from four wind turbines located on a wind farm in China. The forecasting results demonstrate that the intelligent hybrid models outperform single models for short-term wind speed forecasting, which mainly results from the superiority of BSO and CS for parameter optimization.
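The cross-correlation step of such a hybrid model screens lagged values of the wind speed series and keeps only those that correlate strongly with the value to be forecast; the retained lags then become SVR inputs. A minimal sketch of that lag-selection step (the function names `pearson` and `select_lags`, the threshold, and the toy series are illustrative assumptions, not the paper's implementation):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def select_lags(series, max_lag, threshold=0.3):
    """Return the lags whose |correlation| with the current value exceeds threshold."""
    selected = []
    for lag in range(1, max_lag + 1):
        past, present = series[:-lag], series[lag:]
        if abs(pearson(past, present)) > threshold:
            selected.append(lag)
    return selected

# A strongly autocorrelated toy "wind speed" series: lag 1 should survive screening.
speeds = [5.0, 5.2, 5.1, 5.4, 5.6, 5.5, 5.8, 6.0, 5.9, 6.2, 6.1, 6.3]
print(select_lags(speeds, max_lag=3))
```

The selected lags would then index the input vector fed to the SVR, whose hyperparameters the paper tunes with BSO and CS.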
Hong, S-M; Bukhari, W
2014-07-07
The motion of thoracic and abdominal tumours induced by respiratory motion often exceeds 20 mm, and can significantly compromise dose conformality. Motion-adaptive radiotherapy aims to deliver a conformal dose distribution to the tumour with minimal normal tissue exposure by compensating for the tumour motion. This adaptive radiotherapy, however, requires the prediction of the tumour movement that can occur over the system latency period. In general, motion prediction approaches can be classified into two groups: model-based and model-free. Model-based approaches utilize a motion model in predicting respiratory motion. These approaches are computationally efficient and responsive to irregular changes in respiratory motion. Model-free approaches do not assume an explicit model of motion dynamics, and predict future positions by learning from previous observations. Artificial neural networks (ANNs) and support vector regression (SVR) are examples of model-free approaches. In this article, we present a prediction algorithm that combines a model-based and a model-free approach in a cascade structure. The algorithm, which we call EKF-SVR, first employs a model-based algorithm (named LCM-EKF) to predict the respiratory motion, and then uses a model-free SVR algorithm to estimate and correct the error of the LCM-EKF prediction. Extensive numerical experiments based on a large database of 304 respiratory motion traces are performed. The experimental results demonstrate that the EKF-SVR algorithm successfully reduces the prediction error of the LCM-EKF, and outperforms the model-free ANN and SVR algorithms in terms of prediction accuracy across lookahead lengths of 192, 384, and 576 ms.
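The cascade structure of EKF-SVR can be illustrated with deliberately simplified stand-ins: linear extrapolation in place of the model-based LCM-EKF stage, and a running mean of past prediction errors in place of the model-free SVR corrector. Everything below (function names, the toy signal) is an illustrative assumption, not the authors' algorithm:

```python
def model_based_predict(history, lookahead=1):
    """Linear extrapolation from the last two samples
    (a toy stand-in for the model-based LCM-EKF stage)."""
    slope = history[-1] - history[-2]
    return history[-1] + slope * lookahead

def cascade_predict(history, past_errors, lookahead=1):
    """Model-based prediction plus a model-free correction learned from
    recent prediction errors (a toy stand-in for the SVR stage)."""
    base = model_based_predict(history, lookahead)
    correction = sum(past_errors) / len(past_errors) if past_errors else 0.0
    return base + correction

# Track a linearly moving target; the corrector accumulates the residual errors.
signal = [0.0, 1.0, 2.0, 3.0, 4.0]
errors, preds = [], []
for t in range(2, len(signal) - 1):
    pred = cascade_predict(signal[:t + 1], errors)
    actual = signal[t + 1]
    preds.append(pred)
    errors.append(actual - pred)
print(preds)
```

The point of the cascade is that the second stage only has to model the *error* of the first, which is typically a smaller and smoother quantity than the raw respiratory trace.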
Directory of Open Access Journals (Sweden)
Marek Vochozka
2017-12-01
Purpose of the article: Palladium is presently used for producing electronics, industrial products or jewellery, as well as products in the medical field. Its value is raised especially by its unique physical and chemical characteristics. Predicting the value of such a metal is not an easy matter, given that prices may change significantly over time. Methodology/methods: To carry out the analysis, London Fix Price PM data was used, i.e. amounts reported in the afternoon for a period longer than 10 years. To process the data, Statistica software is used. Linear regression is carried out using a whole range of functions, and subsequently regression via neural structures is performed, where several distributional functions are used again. Subsequently, 1000 neural networks are generated, out of which the 5 showing the best characteristics are chosen. Scientific aim: The aim of the paper is to perform a regression analysis of the development of the palladium price on the New York Stock Exchange using neural structures and linear regression, then to compare the two methods and determine the more suitable one for a possible prediction of the future development of the palladium price on the New York Stock Exchange. Findings: Results are compared at the level of an expert perspective and the evaluator's (economist's) experience. Within the regression time lines, the curve obtained by the least squares method via negative-exponential smoothing comes closest to the development of the palladium price. Of the neural networks, all 5 chosen networks prove to be practically useful. Conclusions: Because it is not possible to predict extraordinary situations and their impact on the palladium price (at most in the short term, but certainly not over a long period of time), simplification and the creation of a relatively simple model is appropriate, and the result is useful.
Fragkaki, A G; Farmaki, E; Thomaidis, N; Tsantili-Kakoulidou, A; Angelis, Y S; Koupparis, M; Georgakopoulos, C
2012-09-21
The comparison among different modelling techniques, such as multiple linear regression, partial least squares and artificial neural networks, has been performed in order to construct and evaluate models for prediction of gas chromatographic relative retention times of trimethylsilylated anabolic androgenic steroids. A quantitative structure-retention relationship study using the multiple linear regression and partial least squares techniques has been conducted previously. In the present study, artificial neural network models were constructed and used for the prediction of relative retention times of anabolic androgenic steroids, and their efficiency is compared with that of the models derived from the multiple linear regression and partial least squares techniques. For overall ranking of the models, a novel procedure [Trends Anal. Chem. 29 (2010) 101-109] based on the sum of ranking differences was applied, which permits the best model to be selected. The suggested models are considered useful for the estimation of relative retention times of designer steroids for which no analytical data are available. Copyright © 2012 Elsevier B.V. All rights reserved.
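The sum-of-ranking-differences (SRD) procedure used for model ranking compares each model's ranking of the objects (here, steroids ordered by predicted retention time) against a reference ranking, and sums the absolute rank differences; the smallest SRD wins. A minimal sketch (the helper names and toy data are assumptions for illustration):

```python
def ranks(values):
    """Rank position (1 = smallest) of each value; ties broken by input order."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, idx in enumerate(order, start=1):
        r[idx] = pos
    return r

def srd(reference, model):
    """Sum of ranking differences between model predictions and the reference."""
    rr, rm = ranks(reference), ranks(model)
    return sum(abs(a - b) for a, b in zip(rr, rm))

# Reference: observed relative retention times; two candidate models' predictions.
observed = [1.2, 2.5, 3.1, 4.8, 5.0]
model_a  = [1.3, 2.4, 3.3, 4.7, 5.2]   # preserves the ordering of the reference
model_b  = [2.5, 1.2, 3.1, 5.0, 4.8]   # two adjacent pairs swapped
print(srd(observed, model_a), srd(observed, model_b))
```

In the cited procedure the SRD values are additionally validated against the distribution of SRDs for random rankings; that step is omitted here.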
NIMROD Simulations of the HIT-SI and HIT-SI3 Devices
Morgan, Kyle; Jarboe, Tom; Hossack, Aaron; Chandra, Rian; Everson, Chris
2017-10-01
The Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI) experiment uses a set of inductively driven helicity injectors to apply non-axisymmetric current drive on the edge of the plasma, driving an axisymmetric spheromak equilibrium in a central confinement volume. Significant improvements have been made to extended MHD modeling of HIT-SI, resolving earlier disagreement at high injector frequencies in HIT-SI and extending to the new upgraded HIT-SI3 device. Previous numerical studies of HIT-SI, using a zero-beta extended MHD model, focused on operations with a drive frequency of 14.5 kHz, and found reduced agreement with both the magnetic profile and current amplification at higher frequencies (30-70 kHz). HIT-SI3 has three helicity injectors which can impose different perturbation mode structures through different relative temporal phasings of the injectors. Simulations that allow for pressure gradients have been performed in the parameter regimes of both devices using the NIMROD code and show improved agreement with experimental results, most notably capturing the observed Shafranov shift due to increased beta at higher injector frequency f_inj in HIT-SI and the variety of toroidal perturbation spectra available in HIT-SI3. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences under Award Number DE-FG02-96ER54361.
Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo
2015-08-01
Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting of additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. The model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were both superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
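In a Fourier Poisson regression, seasonality enters the model through pairs of sine/cosine regressors at harmonics of the yearly period; letting their coefficients shrink over time is what allows a formal test of declining seasonality. A minimal sketch of building such a design row (the function names and the two-harmonic choice are illustrative assumptions; the Poisson likelihood fitting itself is omitted):

```python
import math

def fourier_terms(t, period, n_harmonics):
    """Fourier seasonal basis at time t: alternating sin/cos terms per harmonic."""
    row = []
    for k in range(1, n_harmonics + 1):
        row.append(math.sin(2 * math.pi * k * t / period))
        row.append(math.cos(2 * math.pi * k * t / period))
    return row

def design_row(t, period=12, n_harmonics=2):
    """One design-matrix row: intercept, linear trend, then seasonal terms."""
    return [1.0, float(t)] + fourier_terms(t, period, n_harmonics)

# Monthly counts (period 12): at t = 3 months the first sine term peaks.
row = design_row(3, period=12, n_harmonics=2)
print(len(row), row[2])
```

Regressing monthly suicide counts on rows like these with a Poisson likelihood, and interacting the seasonal terms with calendar time, gives the linear-decline and change-point tests described in the abstract.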
Erdoğan, Sinem B; Tong, Yunjie; Hocke, Lia M; Lindsey, Kimberly P; deB Frederick, Blaise
2016-01-01
Resting state functional connectivity analysis is a widely used method for mapping intrinsic functional organization of the brain. Global signal regression (GSR) is commonly employed for removing systemic global variance from resting state BOLD-fMRI data; however, recent studies have demonstrated that GSR may introduce spurious negative correlations within and between functional networks, calling into question the meaning of anticorrelations reported between some networks. In the present study, we propose that global signal from resting state fMRI is composed primarily of systemic low frequency oscillations (sLFOs) that propagate with cerebral blood circulation throughout the brain. We introduce a novel systemic noise removal strategy for resting state fMRI data, "dynamic global signal regression" (dGSR), which applies a voxel-specific optimal time delay to the global signal prior to regression from voxel-wise time series. We test our hypothesis on two functional systems that are suggested to be intrinsically organized into anticorrelated networks: the default mode network (DMN) and task positive network (TPN). We evaluate the efficacy of dGSR and compare its performance with the conventional "static" global regression (sGSR) method in terms of (i) explaining systemic variance in the data and (ii) enhancing specificity and sensitivity of functional connectivity measures. dGSR increases the amount of BOLD signal variance being modeled and removed relative to sGSR while reducing spurious negative correlations introduced in reference regions by sGSR, and attenuating inflated positive connectivity measures. We conclude that incorporating time delay information for sLFOs into global noise removal strategies is of crucial importance for optimal noise removal from resting state functional connectivity maps.
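The core of dGSR is to find, per voxel, the time delay at which the global signal best matches the voxel's series, and then to regress out the *shifted* global signal rather than the simultaneous one. A minimal sketch of those two steps on a synthetic series (the function names `best_lag` and `regress_out` and the toy signals are illustrative assumptions, not the authors' pipeline):

```python
def best_lag(global_sig, voxel_sig, max_lag):
    """Lag (in samples) of the global signal that best correlates with the voxel."""
    def corr_at(lag):
        g = global_sig[:len(global_sig) - lag] if lag else global_sig[:]
        v = voxel_sig[lag:lag + len(g)]
        n = len(g)
        mg, mv = sum(g) / n, sum(v) / n
        num = sum((a - mg) * (b - mv) for a, b in zip(g, v))
        den = (sum((a - mg) ** 2 for a in g) * sum((b - mv) ** 2 for b in v)) ** 0.5
        return num / den if den else 0.0
    return max(range(max_lag + 1), key=corr_at)

def regress_out(voxel_sig, regressor):
    """Remove the least-squares projection of the regressor from the voxel series."""
    beta = (sum(v * r for v, r in zip(voxel_sig, regressor))
            / sum(r * r for r in regressor))
    return [v - beta * r for v, r in zip(voxel_sig, regressor)]

# Synthetic global signal and a voxel that sees it 2 samples later.
g = [0, 1, 0, -1, 0, 1, 0, -1, 0, 1, 0, -1]
voxel = [0, 0] + g[:-2]
lag = best_lag(g, voxel, max_lag=3)
shifted = [0.0, 0.0] + [float(x) for x in g[:-2]]
cleaned = regress_out(voxel, shifted)
print(lag)
```

Static GSR corresponds to forcing `lag = 0` for every voxel, which is exactly what lets residual, delay-shifted sLFO variance leak into the connectivity estimates.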
Camp, Christopher L; Wang, Dean; Sinatro, Alec S; D'Angelo, John; Coleman, Struan H; Dines, Joshua S; Fealy, Stephen; Conte, Stan
2018-05-01
Although batters are frequently hit by pitch (HBP) in baseball, the effect of HBP injuries remains undefined in the literature. To determine the effect of HBP injuries in terms of time out of play, injury patterns resulting in the greatest time out of play, and the value of protective gear such as helmets and elbow pads. Descriptive laboratory study. Based on the Major League Baseball (MLB) Health and Injury Tracking System, all injuries to batters HBP during the 2011-2015 MLB and Minor League Baseball (MiLB) seasons were identified and analyzed. Video analysis was performed on all HBP events from the 2015 MLB season. Multivariate stepwise regression analysis was utilized to determine the predictive capacity of multiple variables (velocity, pitch type, location, etc) on injury status and severity. A total of 2920 HBP injuries resulted in 24,624 days missed (DM) over the 5 seasons. MLB HBP injuries occurred at a rate of 1 per 2554 plate appearances (1 per 9780 pitches thrown). Mean DM per injury were 8.4 (11.7 for MLB vs 8.0 for MiLB). Players hit in the head/face (odds ratio, 28.7) or distal upper extremity (odds ratio, 6.4) were more likely to be injured than players HBP in other locations. Players with an unprotected elbow missed 1.7 more days (95% CI, -4.1 to 7.6) than those with an elbow protector (P = .554) when injured after being HBP. Although HBP injuries occur infrequently in the course of normal play, they collectively represent a significant source of time out of play. The most common body regions injured include the hands/fingers and head/face, and batters hit in these locations are significantly more likely to be injured. After contusions, concussions were the most common injury diagnosis.
Fang, Ling; Gu, Caiyun; Liu, Xinyu; Xie, Jiabin; Hou, Zhiguo; Tian, Meng; Yin, Jia; Li, Aizhu; Li, Yubo
2017-01-01
Primary dysmenorrhea (PD) is a common gynecological disorder which, while not life-threatening, severely affects the quality of life of women. Most patients with PD suffer ovarian hormone imbalances caused by uterine contraction, which results in dysmenorrhea. PD patients may also suffer from increases in estrogen levels caused by increased levels of prostaglandin synthesis and release during luteal regression and early menstruation. Although PD pathogenesis has been previously reported on, these studies only examined the menstrual period and neglected the importance of the luteal regression stage. Therefore, the present study used urine metabolomics to examine changes in endogenous substances and detect urine biomarkers for PD during luteal regression. Ultra performance liquid chromatography coupled with quadrupole-time-of-flight mass spectrometry was used to create metabolomic profiles for 36 patients with PD and 27 healthy controls. Principal component analysis and partial least squares discriminate analysis were used to investigate the metabolic alterations associated with PD. Ten biomarkers for PD were identified, including ornithine, dihydrocortisol, histidine, citrulline, sphinganine, phytosphingosine, progesterone, 17-hydroxyprogesterone, androstenedione, and 15-keto-prostaglandin F2α. The specificity and sensitivity of these biomarkers was assessed based on the area under the curve of receiver operator characteristic curves, which can be used to distinguish patients with PD from healthy controls. These results provide novel targets for the treatment of PD. PMID:28098892
DEFF Research Database (Denmark)
Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas
2017-01-01
In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface for predicting the covariate specific absolute risks, their confidence intervals, and their confidence bands based on right censored time to event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by-… functionals. The software presented here is implemented in the riskRegression package.
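The Benichou-Gail construction combines the cause-specific hazards into an absolute risk: the probability of the event by time t is the sum over earlier times of (probability of still being event-free) x (cause-specific hazard of the event). A minimal discrete-time sketch (the function name and toy hazards are illustrative assumptions; the package works with Cox model estimates, confidence intervals, and censoring, all omitted here):

```python
def absolute_risk(hazard_event, hazard_competing):
    """Time-dynamic absolute risk of the event of interest from discrete
    cause-specific hazards: F1(t) = sum over u <= t of S(u-) * h1(u)."""
    risks, surv = [], 1.0
    for h1, h2 in zip(hazard_event, hazard_competing):
        risks.append((risks[-1] if risks else 0.0) + surv * h1)
        surv *= (1.0 - h1) * (1.0 - h2)   # event-free survival past this interval
    return risks

h1 = [0.10, 0.10, 0.10]   # cause-specific hazard of the event of interest
h2 = [0.05, 0.05, 0.05]   # cause-specific hazard of the competing risk
risks = absolute_risk(h1, h2)
print(risks)
```

Note that the competing hazard enters only through the survival factor: ignoring it (naive Kaplan-Meier on the event of interest) would overstate the absolute risk.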
Upgrade tracking with the UT Hits
Gandini, P; Wang, J
2014-01-01
The performance of the LHCb tracking system for the upgrade on long tracks is evaluated in terms of efficiency and ghost rate reduction for several different sets of requirements. We find that the efficiency is quite high and that the ghost rate reduction is substantial. We also describe the current algorithm for adding UT hits to the tracks.
Aukema, Sietse M.; Siebert, Reiner; Schuuring, Ed; van Imhoff, Gustaaf W.; Kluin-Nelemans, Hanneke C.; Boerma, Evert-Jan; Kluin, Philip M.
2011-01-01
In many B-cell lymphomas, chromosomal translocations are biologic and diagnostic hallmarks of disease. An intriguing subset is formed by the so-called double-hit (DH) lymphomas that are defined by a chromosomal breakpoint affecting the MYC/8q24 locus in combination with another recurrent breakpoint,
Oden, Timothy D.; Asquith, William H.; Milburn, Matthew S.
2009-01-01
In December 2005, the U.S. Geological Survey in cooperation with the City of Houston, Texas, began collecting discrete water-quality samples for nutrients, total organic carbon, bacteria (total coliform and Escherichia coli), atrazine, and suspended sediment at two U.S. Geological Survey streamflow-gaging stations upstream from Lake Houston near Houston (08068500 Spring Creek near Spring, Texas, and 08070200 East Fork San Jacinto River near New Caney, Texas). The data from the discrete water-quality samples collected during 2005-07, in conjunction with monitored real-time data already being collected - physical properties (specific conductance, pH, water temperature, turbidity, and dissolved oxygen), streamflow, and rainfall - were used to develop regression models for predicting water-quality constituent concentrations for inflows to Lake Houston. Rainfall data were obtained from a rain gage monitored by Harris County Homeland Security and Emergency Management and colocated with the Spring Creek station. The leaps and bounds algorithm was used to find the best subsets of possible regression models (minimum residual sum of squares for a given number of variables). The potential explanatory or predictive variables included discharge (streamflow), specific conductance, pH, water temperature, turbidity, dissolved oxygen, rainfall, and time (to account for seasonal variations inherent in some water-quality data). The response variables at each site were nitrite plus nitrate nitrogen, total phosphorus, organic carbon, Escherichia coli, atrazine, and suspended sediment. The explanatory variables provide easily measured quantities as a means to estimate concentrations of the various constituents under investigation, with accompanying estimates of measurement uncertainty. Each regression equation can be used to estimate concentrations of a given constituent in real time. In conjunction with estimated concentrations, constituent loads were estimated by multiplying the
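The leaps-and-bounds algorithm mentioned above finds, for each subset size, the predictor subset minimizing the residual sum of squares, pruning branches of the search tree; for a handful of candidate predictors the same answer can be obtained by brute force. A minimal sketch under that simplification (the function names and toy "inflow" data are illustrative assumptions, not the USGS models):

```python
from itertools import combinations

def ols_rss(X_cols, y):
    """Residual sum of squares of an OLS fit with intercept, via the
    normal equations solved by Gaussian elimination with pivoting."""
    n = len(y)
    X = [[1.0] + [col[i] for col in X_cols] for i in range(n)]
    p = len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)] for a in range(p)]
    b = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for c in reversed(range(p)):
        beta[c] = (b[c] - sum(A[c][k] * beta[k] for k in range(c + 1, p))) / A[c][c]
    resid = [y[i] - sum(X[i][k] * beta[k] for k in range(p)) for i in range(n)]
    return sum(e * e for e in resid)

def best_subset(cols, y, size):
    """Exhaustively pick the size-k predictor subset with minimum RSS."""
    return min(combinations(range(len(cols)), size),
               key=lambda idx: ols_rss([cols[i] for i in idx], y))

# Toy data: the response tracks turbidity, not pH.
turbidity = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ph        = [7.1, 7.0, 7.2, 7.1, 7.0, 7.2]
y         = [2.1, 4.0, 6.2, 7.9, 10.1, 12.0]
print(best_subset([turbidity, ph], y, size=1))
```

Leaps and bounds reaches the same optimum without enumerating every subset, which matters once the predictor pool (discharge, specific conductance, pH, temperature, turbidity, dissolved oxygen, rainfall, time) makes exhaustive search expensive.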
DEFF Research Database (Denmark)
Christensen, E; Altman, D G; Neuberger, J
1993-01-01
BACKGROUND: The precision of current prognostic models in primary biliary cirrhosis (PBC) is rather low, partly because they are based on data from just one time during the course of the disease. The aim of this study was to design a new, more precise prognostic model by incorporating follow-up data in the development of the model. METHODS: We have performed Cox regression analyses with time-dependent variables in 237 PBC patients followed up regularly for up to 11 years. The validity of the obtained models was tested by comparing predicted and observed survival in 147 independent PBC patients followed for up to 6 years. RESULTS: In the obtained model the following time-dependent variables independently indicated a poor prognosis: high bilirubin, low albumin, ascites, gastrointestinal bleeding, and old age. When including histological variables, cirrhosis, central cholestasis, and low…
Ali, M Sanni; Groenwold, Rolf H H; Belitser, Svetlana V; Souverein, Patrick C; Martín, Elisa; Gatto, Nicolle M; Huerta, Consuelo; Gardarsdottir, Helga; Roes, Kit C B; Hoes, Arno W; de Boer, Antonius; Klungel, Olaf H
2016-03-01
Observational studies including time-varying treatments are prone to confounding. We compared time-varying Cox regression analysis, propensity score (PS) methods, and marginal structural models (MSMs) in a study of antidepressant [selective serotonin reuptake inhibitors (SSRIs)] use and the risk of hip fracture. A cohort of patients with a first prescription for antidepressants (SSRI or tricyclic antidepressants) was extracted from the Dutch Mondriaan and Spanish Base de datos para la Investigación Farmacoepidemiológica en Atención Primaria (BIFAP) general practice databases for the period 2001-2009. The net (total) effect of SSRI versus no SSRI on the risk of hip fracture was estimated using time-varying Cox regression, stratification and covariate adjustment using the PS, and MSM. In MSM, censoring was accounted for by inverse probability of censoring weights. The crude hazard ratio (HR) of SSRI use versus no SSRI use on hip fracture was 1.75 (95%CI: 1.12, 2.72) in Mondriaan and 2.09 (1.89, 2.32) in BIFAP. After confounding adjustment using time-varying Cox regression, stratification, and covariate adjustment using the PS, HRs increased in Mondriaan [2.59 (1.63, 4.12), 2.64 (1.63, 4.25), and 2.82 (1.63, 4.25), respectively] and decreased in BIFAP [1.56 (1.40, 1.73), 1.54 (1.39, 1.71), and 1.61 (1.45, 1.78), respectively]. MSMs with stabilized weights yielded HR 2.15 (1.30, 3.55) in Mondriaan and 1.63 (1.28, 2.07) in BIFAP when accounting for censoring and 2.13 (1.32, 3.45) in Mondriaan and 1.66 (1.30, 2.12) in BIFAP without accounting for censoring. In this empirical study, differences between the different methods to control for time-dependent confounding were small. The observed differences in treatment effect estimates between the databases are likely attributable to different confounding information in the datasets, illustrating that adequate information on (time-varying) confounding is crucial to prevent bias. Copyright © 2016 John Wiley & Sons, Ltd.
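The marginal structural models above reweight each observation by a stabilized inverse probability of treatment: treated subjects get weight P(A=1)/ps and untreated subjects (1-P(A=1))/(1-ps), where ps is the model-based propensity score. A minimal sketch of the weight computation (the function name and toy cohort are illustrative assumptions; the propensity and censoring models themselves are omitted):

```python
def stabilized_weights(treated, propensity):
    """Stabilized inverse-probability-of-treatment weights:
    sw = P(A=1)/ps if treated, (1 - P(A=1))/(1 - ps) if untreated."""
    p_treated = sum(treated) / len(treated)   # marginal treatment probability
    weights = []
    for a, ps in zip(treated, propensity):
        weights.append(p_treated / ps if a else (1 - p_treated) / (1 - ps))
    return weights

# Toy cohort: treatment indicators and fitted propensity scores.
a  = [1, 1, 0, 0, 1, 0]
ps = [0.6, 0.5, 0.4, 0.3, 0.7, 0.5]
w = stabilized_weights(a, ps)
print(round(sum(w) / len(w), 3))
```

Stabilization (the marginal probability in the numerator) keeps the weights near 1 on average, which reduces the variance inflation that plain 1/ps weights suffer when some propensities are extreme.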
The R Package threg to Implement Threshold Regression Models
Directory of Open Access Journals (Sweden)
Tao Xiao
2015-08-01
This new package includes four functions: threg, and the methods hr, predict and plot for threg objects returned by threg. The threg function is the model-fitting function, which is used to calculate regression coefficient estimates, asymptotic standard errors and p values. The hr method for threg objects is the hazard-ratio calculation function, which provides estimates of hazard ratios at selected time points for specified scenarios (based on given categories or value settings of covariates). The predict method for threg objects is used for prediction. And the plot method for threg objects provides plots for curves of estimated hazard functions, survival functions and probability density functions of the first-hitting-time; function curves corresponding to different scenarios can be overlaid in the same plot for comparison to give additional research insights.
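Threshold regression of this kind models the event time as the first hitting time of a Wiener process with drift mu started at level y0 > 0 and absorbed at 0, whose density is the inverse Gaussian. A minimal Python sketch of that density and of numerically accumulating the hitting probability (threg itself is an R package; the function names here are illustrative assumptions):

```python
import math

def fht_density(t, y0, mu, sigma=1.0):
    """First-hitting-time density of a Wiener process starting at y0 > 0
    with drift mu, absorbed at 0 (inverse Gaussian form)."""
    return (y0 / (sigma * math.sqrt(2 * math.pi * t ** 3))
            * math.exp(-(y0 + mu * t) ** 2 / (2 * sigma ** 2 * t)))

def hitting_probability_by(t_max, y0, mu, dt=0.001):
    """Numerically accumulate P(T <= t_max) from the density."""
    total, t = 0.0, dt
    while t <= t_max:
        total += fht_density(t, y0, mu) * dt
        t += dt
    return total

# With drift toward the boundary, the threshold is hit almost surely.
p = hitting_probability_by(t_max=200.0, y0=1.0, mu=-0.5)
print(round(p, 2))
```

In threg the regression part enters by linking covariates to y0 and mu; the hr, predict and plot methods are then evaluations of this density, its survival function, and the implied hazard at chosen covariate scenarios.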
Institute of Scientific and Technical Information of China (English)
2008-01-01
The New York Philharmonic’s concert in North Korea turns a new page in the history of North Korean-U.S. relations. American poet Henry Wadsworth Longfellow said music was the universal language of mankind. His words rang true at the New York Philharmonic’s concert on February 26 in Pyongyang. The unprecedented performance showed that after a half-century of confrontation, North Korea and the United States finally found a language that could enhance their understanding of each other. The New York Philharmonic visited North Korea on February 25-27. It gave one formal concert in the East Pyongyang Grand Theater and played informally with North Korean musicians at other times, starting a prelude to more communication.
Joannah Caborn Wengler
2012-01-01
Many accelerators’ "round" birthdays are being celebrated at CERN these days – the PS turned 50 in 2009, the SPS was 35 in 2011, and this year it's the turn of the PS Booster to mark its 40th anniversary. Originally designed to accelerate 10^13 protons to 800 MeV, it has far exceeded its initial design performance over the years. The PS Booster in the 1970s. Imagine the scene: a group of accelerator physicists staring expectantly at a monitor, when suddenly a shout of joy goes up as a signal flickers across the screen. Does that sound familiar? Well, turn the clock back 40 years (longer hair, wider trouser legs) and you have the situation at the PS Booster on 26 May 1972. On that day, beam was injected into the Booster for the first time. “It was a real buzz,” says Heribert Koziol, then Chairman of the Running-in Committee. “We were very happy – and also a little relieved – when the beam finally...
CERN Bulletin
2012-01-01
Just one year ago, CERN took delivery of its first bi-fuel vehicles (see article in Bulletin 07-08/2011). Today, the fleet comprises 100 vehicles capable of running with petrol or natural gas. At that time, Véronique Marchal, head of the Site Services section in the GS Department, told us: “We are counting on CERN car users’ environmental awareness to use natural gas fuel whenever possible.” Observations one year later show that... well, let’s say there is still plenty of room for improvement. A new awareness campaign has therefore been launched. “Running on natural gas reduces carbon dioxide emissions by some 40%,” explains Serge Micallef of the Services Industriels de Genève (SIG), CERN’s partners for this green mobility project. CNG contains 20% biogas, which is carbon-neutral. CNG produces 60 to 95% less pollution overall than ordinary petrol, and it is entirely soot-free. It is true that filli...
Yen, Po-Yin; McAlearney, Ann Scheck; Sieck, Cynthia J; Hefner, Jennifer L; Huerta, Timothy R
2017-09-07
In past years, policies and regulations required hospitals to implement advanced capabilities of certified electronic health records (EHRs) in order to receive financial incentives. This has led to accelerated implementation of health information technologies (HIT) in health care settings. However, measures commonly used to evaluate the success of HIT implementation, such as HIT adoption, technology acceptance, and clinical quality, fail to account for complex sociotechnical variability across contexts and the different trajectories within organizations because of different implementation plans and timelines. We propose a new focus, HIT adaptation, to illuminate factors that facilitate or hinder the connection between use of the EHR and improved quality of care as well as to explore the trajectory of changes in the HIT implementation journey as it is impacted by frequent system upgrades and optimizations. Future research should develop instruments to evaluate the progress of HIT adaptation in both its longitudinal design and its focus on adaptation progress rather than on one cross-sectional outcome, allowing for more generalizability and knowledge transfer. ©Po-Yin Yen, Ann Scheck McAlearney, Cynthia J Sieck, Jennifer L Hefner, Timothy R Huerta. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 07.09.2017.
On the Hitting Probability of Max-Stable Processes
Hofmann, Martin
2012-01-01
The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ ℝ with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ ℝ twice and we give a sufficient condition for a positive probability of this event.
Simplified validation of borderline hits of database searches
Thomas, Henrik; Shevchenko, Andrej
2008-01-01
Along with unequivocal hits produced by matching multiple MS/MS spectra to database sequences, LC-MS/MS analysis often yields a large number of hits of borderline statistical confidence. To simplify their validation, we propose to use rapid de novo interpretation of all acquired MS/MS spectra and, with the help of a simple software tool, display the candidate sequences together with each database search hit. We demonstrate that comparing hit database sequences and independent de novo interpretations…
"Hit-and-Run" leaves its mark: catalyst transcription factors and chromatin modification.
Varala, Kranthi; Li, Ying; Marshall-Colón, Amy; Para, Alessia; Coruzzi, Gloria M
2015-08-01
Understanding how transcription factor (TF) binding is related to gene regulation is a moving target. We recently uncovered genome-wide evidence for a "Hit-and-Run" model of transcription. In this model, a master TF "hits" a target promoter to initiate a rapid response to a signal. As the "hit" is transient, the model invokes recruitment of partner TFs to sustain transcription over time. Following the "run", the master TF "hits" other targets to propagate the response genome-wide. As such, a TF may act as a "catalyst" to mount a broad and acute response in cells that first sense the signal, while the recruited TF partners promote long-term adaptive behavior in the whole organism. This "Hit-and-Run" model likely has broad relevance, as TF perturbation studies across eukaryotes show small overlaps between TF-regulated and TF-bound genes, implicating transient TF-target binding. Here, we explore this "Hit-and-Run" model to suggest molecular mechanisms and its biological relevance. © 2015 The Authors. Bioessays published by WILEY Periodicals, Inc.
Hammarström, Anne; Gustafsson, Per E; Strandh, Mattias; Virtanen, Pekka; Janlert, Urban
2011-03-01
Research often fails to ascertain whether men and women are equally hit by the health consequences of unemployment. The aim of this study was to analyze whether men's self-reported health and health behaviour were hit more by unemployment than women's in a follow-up of the Northern Swedish Cohort. A follow-up study of a cohort of all school leavers in a middle-sized industrial town in northern Sweden was performed from age 16 to age 42. Of those still alive of the original cohort, 94% (n = 1,006) participated during the whole period. A sample was made of participants in the labour force and living in Sweden (n = 916). Register data were used to assess the length of unemployment from age 40 to 42, while questionnaire data were used for the other variables. In multivariate logistic regression analyses significant relations between unemployment and mental health/smoking were found among both women and men, even after control for unemployment at the time of the investigation and indicators of health-related selection. Significant relations between unemployment and alcohol consumption were found among women, while few visits to a dentist was significant among men. Men are not hit more by the health consequences of unemployment in a Swedish context, with a high participation rate of women in the labour market. The public health relevance is that the study indicates the need to take gendered contexts into account in public health research.
Wong, Man Sing; Ho, Hung Chak; Yang, Lin; Shi, Wenzhong; Yang, Jinxin; Chan, Ta-Chien
2017-07-24
Dust events have long been recognized to be associated with a higher mortality risk. However, no study has investigated how prolonged dust events affect the spatial variability of mortality across districts in a downwind city. In this study, we applied a spatial regression approach to estimate district-level mortality during two extreme dust events in Hong Kong. We compared spatial and non-spatial models to evaluate the ability of each regression to estimate mortality. We also compared prolonged dust events with non-dust events to determine the influences of community factors on mortality across the city. The density of the built environment (estimated by the sky view factor) had a positive association with excess mortality in each district, while socioeconomic deprivation, reflected in lower income and lower education, was associated with a higher mortality impact in each territory planning unit during a prolonged dust event. Based on the model comparison, spatial error modelling with first-order queen contiguity consistently outperformed the other models. The high-risk areas, with the greater increases in mortality, were located in an urban high-density environment with higher socioeconomic deprivation. Our model design shows the ability to predict the spatial variability of mortality risk during an extreme weather event, which cannot be estimated with traditional time-series analyses or ecological studies. Our spatial protocol can be used for public health surveillance, sustainable planning and disaster preparation when relevant data are available.
Chuartzman, Silvia G; Schuldiner, Maya
2018-03-25
In the last decade, several collections of Saccharomyces cerevisiae yeast strains have been created. In these collections every gene is modified in a similar manner, such as by a deletion or the addition of a protein tag. Such libraries have enabled a diversity of systematic screens, giving rise to large amounts of information regarding gene functions. However, papers describing such screens often focus on a single gene or a small set of genes, and all other loci affecting the phenotype of choice ('hits') are only mentioned in tables that are provided as supplementary material and are often hard to retrieve or search. To help unify and make such data accessible, we have created a Database of High Throughput Screening Hits (dHITS). The dHITS database enables information to be obtained about the screens in which genes of interest were found, as well as the other genes that came up in those screens - all in a readily accessible and downloadable format. The ability to query large lists of genes at the same time provides a platform to easily analyse hits obtained from transcriptional analyses or other screens. We hope that this platform will serve as a tool to facilitate the investigation of protein functions for the yeast community. © 2018 The Authors Yeast Published by John Wiley & Sons Ltd.
78 FR 29135 - HIT Standards Committee Advisory Meeting
2013-05-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee Advisory Meeting AGENCY: Office of...: HIT Standards Committee. General Function of the Committee: To provide recommendations to the National... Federal Health IT Strategic Plan, and in accordance with policies developed by the HIT Policy Committee...
Current drive experiments on the HIT-II spherical torus
International Nuclear Information System (INIS)
Jarboe, T.R.; Raman, R.; Nelson, B.A.; Holcomb, C.T.; McCollam, K.J.; Sieck, P.E.
1999-01-01
This paper describes the following new achievements from the Helicity Injected Torus (HIT) program: a) formation and sustainment of a toroidal magnetic equilibrium using coaxial helicity injection (CHI) in a conducting shell that has an L/R time much shorter than the pulse length; b) static formation of a spherical torus with plasma current over 180 kA using a transformer and feedback controlled equilibrium coils; and c) production of a current increase in a transformer produced spherical torus using CHI. (author)
Current drive experiments on the HIT-II spherical torus
International Nuclear Information System (INIS)
Jarboe, T.; Raman, R.; Nelson, B.; Holcomb, C.T.; McCollam, K.J.; Sieck, P.E.
2001-01-01
This paper describes the following new achievements from the Helicity Injected Torus (HIT) program: a) formation and sustainment of a toroidal magnetic equilibrium using coaxial helicity injection (CHI) in a conducting shell that has an L/R time much shorter than the pulse length; b) static formation of a spherical torus with plasma current over 180 kA using a transformer and feedback controlled equilibrium coils; and c) production of a current increase in a transformer produced spherical torus using CHI. (author)
Liquid Argon TPC Signal Formation, Signal Processing and Hit Reconstruction
Energy Technology Data Exchange (ETDEWEB)
Baller, Bruce [Fermilab
2017-03-11
This document describes the early stage of the reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions requires knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise.
Cosmic Ray Hit Detection with Homogenous Structures
Smirnov, O. M.
Cosmic ray (CR) hits can affect a significant number of pixels, both in long-exposure ground-based CCD observations and in Space Telescope frames. Thus, methods of identifying the damaged pixels are an important part of data preprocessing for practically any application. The paper presents an implementation of a CR hit detection algorithm based on a homogeneous structure (also called a cellular automaton), a concept originating in artificial intelligence and discrete mathematics. Each pixel of the image is represented by a small automaton, which interacts with its neighbors and assumes a distinct state if it ``decides'' that a CR hit is present. On test data, the algorithm has shown a high detection rate (~0.7) and a low false alarm rate per frame. A homogeneous structure is extremely trainable, which can be very important for processing large batches of data obtained under similar conditions. Training and optimization issues are discussed, as well as other possible applications of this concept to image processing.
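The pixel-automaton idea can be sketched in a few lines. This is a toy illustration, not Smirnov's algorithm: the neighborhood rule, the threshold, and the robust noise estimate are all assumptions chosen for clarity, and the "automata" are updated synchronously with NumPy array shifts.

```python
import numpy as np

def ca_hit_detect(image, thresh=5.0, n_iter=3):
    """Toy cellular-automaton cosmic-ray detector (illustrative only).

    Each pixel acts as an automaton that flags itself as a CR hit when it
    exceeds the median of its 8 neighbors by `thresh` times a robust noise
    estimate; flagged pixels are excluded from neighborhoods on later
    iterations, mimicking neighbor-to-neighbor interaction.
    """
    img = image.astype(float)
    # median absolute deviation as a crude noise proxy (assumption)
    noise = np.median(np.abs(img - np.median(img))) + 1e-9
    flagged = np.zeros(img.shape, dtype=bool)
    for _ in range(n_iter):
        padded = np.pad(np.where(flagged, np.nan, img), 1, constant_values=np.nan)
        # stack the 8 shifted neighbor views of the image
        neigh = np.stack([padded[i:i + img.shape[0], j:j + img.shape[1]]
                          for i in range(3) for j in range(3) if (i, j) != (1, 1)])
        local = np.nanmedian(neigh, axis=0)
        new = (img - local) > thresh * noise
        if not (new & ~flagged).any():
            break  # the automata reached a fixed point
        flagged |= new
    return flagged
```

On a flat frame with a single bright pixel, only that pixel is flagged; a trained version would tune `thresh` and the update rule on labeled frames.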
All hypertopologies are hit-and-miss
Directory of Open Access Journals (Sweden)
Somshekhar Naimpally
2002-04-01
We solve a long-standing problem by showing that all known hypertopologies are hit-and-miss. Our solution is not merely of theoretical importance. This representation is useful in the study of comparisons of the Hausdorff-Bourbaki (H-B) uniform topologies and the Wijsman topologies among themselves and with others. Up to now some of these comparisons needed intricate manipulations. The H-B uniform topologies were the subject of intense activity in the 1960s in connection with the Isbell-Smith problem. We show that they are proximally locally finite topologies, from which the solution to the above problem follows easily. It is known that the Wijsman topology on the hyperspace is the proximal ball (hit-and-miss) topology in "nice" metric spaces, including the normed linear spaces. With the introduction of a new far-miss topology we show that the Wijsman topology is hit-and-miss for all metric spaces. From this follows a natural generalization of the Wijsman topology to the hyperspace of any T1 space. Several existing results in the literature are easy consequences of our work.
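For readers unfamiliar with the terminology, the hit-and-miss construction referred to throughout can be stated compactly (these are the standard textbook definitions, not formulas taken from the paper itself):

```latex
% Hit-and-miss hypertopology on the closed subsets CL(X) of a space X:
% subbasic open sets are "hit" sets for open V and "miss" sets for
% members B of a fixed family of closed sets.
\[
V^- = \{ E \in \mathrm{CL}(X) : E \cap V \neq \emptyset \} \quad\text{(hit)},
\qquad
B^+ = \{ E \in \mathrm{CL}(X) : E \cap B = \emptyset \} \quad\text{(miss)}.
\]
% The Vietoris topology takes all open V and all closed B; the proximal
% ball topology restricts the miss sets to closed balls, which is how the
% Wijsman topology arises in "nice" metric spaces.
```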
Recent results from the HIT-II and HIT-SI helicity injection current drive experiments
International Nuclear Information System (INIS)
Jarboe, T.R.; Hamp, W.T.; Izzo, V.A.; Nelson, B.A.; O'Neill, R.G.; Raman, R.; Redd, A.J.; Sieck, P.E.; Smith, R.J.
2005-01-01
Three important results are reported. 1) CHI startup has produced 100 kA of closed current without using poloidal field (PF) coils or any transformer action. The initial equilibrium is then driven to 240 kA with a 3 V transformer loop voltage, indicating high quality plasma. 2) For the first time CHI alone has produced toroidal currents (350 kA) that far exceed q_a·I_inj, with I_p/I_tf as high as 1.2. The key to these new results appears to be having the toroidal field small enough that relaxation will occur. 3) The steady inductive helicity injection spheromak experiment has operated at 5 kHz for 6 ms with current amplitudes up to 11 kA in each injector. The helicity injection rate is nearly constant, with the E×B flow always into the plasma and not into the walls. NIMROD simulations of HIT-SI show a buildup of spheromak fields. (author)
Padmanabhan, Anand; Jones, Curtis G; Bougie, Daniel W; Curtis, Brian R; McFarland, Janice G; Wang, Demin; Aster, Richard H
2015-01-01
Antibodies specific for platelet factor 4 (PF4)/heparin complexes are the hallmark of heparin-induced thrombocytopenia and thrombosis (HIT), but many antibody-positive patients have normal platelet counts. The basis for this is not fully understood, but it is believed that antibodies testing positive in the serotonin release assay (SRA) are the most likely to cause disease. We addressed this issue by characterizing PF4-dependent binding of HIT antibodies to intact platelets and found that most antibodies testing positive in the SRA, but none of those testing negative, bind to and activate platelets when PF4 is present without any requirement for heparin. Although HIT antibodies recognize PF4 in a complex with heparin, only a subset of these antibodies recognize more subtle epitopes induced in PF4 when it binds to CS, the major platelet glycosaminoglycan. Antibodies having this property could explain the "delayed HIT" seen in some individuals after discontinuation of heparin and the high risk for thrombosis that persists for weeks in patients who have recovered from HIT. © 2015 by The American Society of Hematology.
Eyewitness Identification Reforms: Are Suggestiveness-Induced Hits and Guesses True Hits?
Wells, Gary L; Steblay, Nancy K; Dysart, Jennifer E
2012-05-01
Research-based reforms for collecting eyewitness identification evidence (e.g., unbiased pre-lineup instructions, double-blind administration) have been proposed by psychologists and adopted in increasing numbers of jurisdictions across the United States. It is well known that reducing rates of mistaken identifications can also reduce accurate identification rates (hits). But the reforms are largely designed to reduce the suggestiveness of the procedures they are meant to replace. Accordingly, we argue that it is misleading to label any hits obtained because of suggestive procedures as "hits" and then saddle reforms with the charge that they reduce the rate of these illegitimate hits. Eyewitness identification evidence should be based solely on the independent memory of the witness, not aided by biased instructions, cues from lineup administrators, or the use of lineup fillers who make the suspect stand out. Failure to call out these hits as being illegitimate can give solace to those who are motivated to preserve the status quo. © The Author(s) 2012.
Hu, Q.; Friedl, M. A.; Wu, W.
2017-12-01
Accurate and timely information regarding the spatial distribution of crop types and their changes is essential for acreage surveys, yield estimation, water management, and agricultural production decision-making. In recent years, increasing population, dietary shifts and climate change have driven drastic changes in China's agricultural land use. However, no maps are currently available that document the spatial and temporal patterns of these agricultural land use changes. Because of its short revisit period, rich spectral bands and global coverage, MODIS time-series data have been shown to have great potential for detecting the seasonal dynamics of different crop types. However, their inherently coarse spatial resolution limits the accuracy with which crops can be identified from MODIS in regions with small fields or complex agricultural landscapes. To evaluate this more carefully and specifically understand the strengths and weaknesses of MODIS data for crop-type mapping, we used MODIS time-series imagery to map the sub-pixel fractional crop area for four major crop types (rice, corn, soybean and wheat) at 500-m spatial resolution for Heilongjiang province, one of the most important grain-production regions in China, where recent agricultural land use change has been rapid and pronounced. To do this, a random forest regression (RF-g) model was constructed to estimate the sub-pixel percentage of each crop type in 2006, 2011 and 2016. Crop type maps generated through expert visual interpretation of high spatial resolution images (i.e., Landsat and SPOT data) were used to calibrate the regression model. Five time series of vegetation indices (155 features) derived from different spectral channels of MODIS land surface reflectance (MOD09A1) data were used as candidate features for the RF-g model. An out-of-bag strategy and a backward elimination approach were applied to select the optimal spectral-temporal feature subset for each crop type. The resulting crop maps
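The regression step - mapping per-pixel time-series features to a sub-pixel crop fraction - can be sketched with scikit-learn. The data here are synthetic stand-ins (a single sine-shaped growing season whose amplitude encodes the fraction); the feature construction, noise level, and hyperparameters are assumptions, not the RF-g configuration of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for MODIS features: 200 pixels x 12 time steps of a
# vegetation index; the "true" crop fraction drives the seasonal amplitude.
n_pixels, n_steps = 200, 12
frac = rng.uniform(0, 1, n_pixels)               # sub-pixel crop fraction target
season = np.sin(np.linspace(0, np.pi, n_steps))  # one idealized growing season
X = frac[:, None] * season[None, :] + rng.normal(0, 0.05, (n_pixels, n_steps))

# Regress the fraction on the time-series features, RF-g style; the
# out-of-bag score parallels the out-of-bag feature-selection strategy.
model = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0)
model.fit(X, frac)
pred = np.clip(model.predict(X), 0, 1)  # fractions must stay in [0, 1]
```

In the real workflow the targets would come from visually interpreted Landsat/SPOT maps aggregated to the 500-m MODIS grid, and backward elimination over the 155 candidate features would prune `X`.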
Baldwin, Austin K.; Robertson, Dale M.; Saad, David A.; Magruder, Christopher
2013-01-01
In 2008, the U.S. Geological Survey and the Milwaukee Metropolitan Sewerage District initiated a study to develop regression models to estimate real-time concentrations and loads of chloride, suspended solids, phosphorus, and bacteria in streams near Milwaukee, Wisconsin. To collect monitoring data for calibration of models, water-quality sensors and automated samplers were installed at six sites in the Menomonee River drainage basin. The sensors continuously measured four potential explanatory variables: water temperature, specific conductance, dissolved oxygen, and turbidity. Discrete water-quality samples were collected and analyzed for five response variables: chloride, total suspended solids, total phosphorus, Escherichia coli bacteria, and fecal coliform bacteria. Using the first year of data, regression models were developed to continuously estimate the response variables on the basis of the continuously measured explanatory variables. Those models were published in a previous report. In this report, those models are refined using 2 years of additional data, and the relative improvement in model predictability is discussed. In addition, a set of regression models is presented for a new site in the Menomonee River Basin, Underwood Creek at Wauwatosa. The refined models use the same explanatory variables as the original models. The chloride models all used specific conductance as the explanatory variable, except for the model for the Little Menomonee River near Freistadt, which used both specific conductance and turbidity. Total suspended solids and total phosphorus models used turbidity as the only explanatory variable, and bacteria models used water temperature and turbidity as explanatory variables. An analysis of covariance (ANCOVA), used to compare the coefficients in the original models to those in the refined models calibrated using all of the data, showed that only 3 of the 25 original models changed significantly. Root-mean-squared errors (RMSEs
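Surrogate regressions of this kind are commonly fit on log-transformed data, e.g. estimating total suspended solids from continuously measured turbidity. The sketch below uses synthetic data; the power-law coefficients and noise level are illustrative assumptions, not values from the Menomonee River models.

```python
import numpy as np

# Synthetic surrogate-regression example: estimate total suspended solids
# (TSS, mg/L) from continuously measured turbidity (FNU), fitting ordinary
# least squares on log10-transformed values, a common practice for such models.
rng = np.random.default_rng(42)
turbidity = rng.uniform(1, 500, 120)              # sensor readings
tss_true = 1.8 * turbidity ** 0.95                # assumed power-law relation
tss_obs = tss_true * rng.lognormal(0, 0.15, 120)  # multiplicative sampling noise

# log10(TSS) = b0 + b1 * log10(turbidity)
b1, b0 = np.polyfit(np.log10(turbidity), np.log10(tss_obs), 1)
tss_hat = 10 ** (b0 + b1 * np.log10(turbidity))
rmse_log = np.sqrt(np.mean((np.log10(tss_obs) - np.log10(tss_hat)) ** 2))
```

Refining such a model with additional years of data, as the report describes, amounts to re-estimating `b0` and `b1` on the pooled sample and comparing coefficients (e.g. via ANCOVA) against the originals.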
Health information technology knowledge and skills needed by HIT employers.
Fenton, S H; Gongora-Ferraez, M J; Joost, E
2012-01-01
To evaluate the health information technology (HIT) workforce knowledge and skills needed by HIT employers. Statewide face-to-face and online focus groups of identified HIT employer groups in Austin, Brownsville, College Station, Dallas, El Paso, Houston, Lubbock, San Antonio, and webinars for rural health and nursing informatics. HIT employers reported needing an HIT workforce with diverse knowledge and skills ranging from basic to advanced, while covering information technology, privacy and security, clinical practice, needs assessment, contract negotiation, and many other areas. Consistent themes were that employees needed to be able to learn on the job and must possess the ability to think critically and problem solve. Many employers wanted persons with technical skills, yet also the knowledge and understanding of healthcare operations. The HIT employer focus groups provided valuable insight into employee skills needed in this fast-growing field. Additionally, this information will be utilized to develop a statewide HIT workforce needs assessment survey.
Differentiating regressed melanoma from regressed lichenoid keratosis.
Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A
2017-04-01
Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas, as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
The Holistic Integrity Test (HIT) - quantified resilience analysis
Directory of Open Access Journals (Sweden)
Dobson Mike
2016-01-01
The Holistic Integrity Test (HIT) - Quantified Resilience Analysis. Rising sea levels and wider climate change mean we face an increasing risk from flooding and other natural hazards. Tough economic times make it difficult to economically justify or afford the desired level of engineered risk reduction. Add to this significant uncertainty from a range of future predictions, constantly updated with new science. We therefore need to understand not just how to reduce the risk, but what could happen should above-design-standard events occur. In flood terms this includes not only the direct impacts (damage and loss of life), but the wider cascade impacts on infrastructure systems and the longer-term impacts on the economy and society. However, understanding the "what if" is only the first part of the equation; a range of improvement measures to mitigate such effects needs to be identified and implemented. These measures should consider reducing the risk, lessening the consequences, aiding the response, and speeding up the recovery. They also need to be objectively assessed through quantitative analysis, which underpins them technically and economically. Without such analysis, it cannot be predicted how measures will perform if extreme events occur. It is also vital to consider all possible hazards, as measures for one hazard may hinder the response to another. The Holistic Integrity Test (HIT) uses quantitative system analysis and "HITs" the site, its infrastructure, contained dangers and the wider regional system to determine how it copes with a range of severe shock events, Before, During and After the event, while also accounting for uncertainty (as illustrated in figure 1). First explained at the TINCE 2014 Nuclear Conference in Paris, it was described in terms of a nuclear facility needing to analyse the site in response to post-Fukushima needs; the HIT is, however, universally applicable. The HIT has three key risk reduction goals: The
Directory of Open Access Journals (Sweden)
Fabian Horst
Traditionally, gait analysis has been centered on the idea of average behavior and normality. On one hand, clinical diagnoses and therapeutic interventions typically assume that average gait patterns remain constant over time. On the other hand, it is well known that all our movements are accompanied by a certain amount of variability, which does not allow us to make two identical steps. The purpose of this study was to examine changes in the intra-individual gait patterns across different time-scales (i.e., tens-of-mins, tens-of-hours). Nine healthy subjects performed 15 gait trials at a self-selected speed on 6 sessions within one day (duration between two subsequent sessions from 10 to 90 mins). For each trial, time-continuous ground reaction forces and lower body joint angles were measured. A supervised learning model using a kernel-based discriminant regression was applied for classifying sessions within individual gait patterns. Discernable characteristics of intra-individual gait patterns could be distinguished between repeated sessions by classification rates of 67.8 ± 8.8% and 86.3 ± 7.9% for the six-session-classification of ground reaction forces and lower body joint angles, respectively. Furthermore, the one-on-one-classification showed that increasing classification rates go along with increasing time durations between two sessions and indicate that changes of gait patterns appear at different time-scales. Discernable characteristics between repeated sessions indicate continuous intrinsic changes in intra-individual gait patterns and suggest a predominant role of deterministic processes in human motor control and learning. Natural changes of gait patterns without any externally induced injury or intervention may reflect continuous adaptations of the motor system over several time-scales. Accordingly, the modelling of walking by means of average gait patterns that are assumed to be near constant over time needs to be reconsidered in the
Horst, Fabian; Eekhoff, Alexander; Newell, Karl M; Schöllhorn, Wolfgang I
2017-01-01
Traditionally, gait analysis has been centered on the idea of average behavior and normality. On one hand, clinical diagnoses and therapeutic interventions typically assume that average gait patterns remain constant over time. On the other hand, it is well known that all our movements are accompanied by a certain amount of variability, which does not allow us to make two identical steps. The purpose of this study was to examine changes in the intra-individual gait patterns across different time-scales (i.e., tens-of-mins, tens-of-hours). Nine healthy subjects performed 15 gait trials at a self-selected speed on 6 sessions within one day (duration between two subsequent sessions from 10 to 90 mins). For each trial, time-continuous ground reaction forces and lower body joint angles were measured. A supervised learning model using a kernel-based discriminant regression was applied for classifying sessions within individual gait patterns. Discernable characteristics of intra-individual gait patterns could be distinguished between repeated sessions by classification rates of 67.8 ± 8.8% and 86.3 ± 7.9% for the six-session-classification of ground reaction forces and lower body joint angles, respectively. Furthermore, the one-on-one-classification showed that increasing classification rates go along with increasing time durations between two sessions and indicate that changes of gait patterns appear at different time-scales. Discernable characteristics between repeated sessions indicate continuous intrinsic changes in intra-individual gait patterns and suggest a predominant role of deterministic processes in human motor control and learning. Natural changes of gait patterns without any externally induced injury or intervention may reflect continuous adaptations of the motor system over several time-scales. Accordingly, the modelling of walking by means of average gait patterns that are assumed to be near constant over time needs to be reconsidered in the context of
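The session-classification idea can be illustrated on synthetic data. An RBF-kernel support vector classifier is used here as a stand-in for the kernel-based discriminant regression of the study, and the drift model, waveform shape, and noise levels are all assumptions chosen so that sessions differ slightly but systematically.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-in for one subject: 6 sessions x 15 trials, each trial a
# 101-sample ground-reaction-force waveform. A small session-specific drift
# term mimics the slow intrinsic changes reported in the study.
n_sessions, n_trials, n_samples = 6, 15, 101
base = np.sin(np.linspace(0, np.pi, n_samples))
X, y = [], []
for s in range(n_sessions):
    drift = 0.05 * s * np.cos(np.linspace(0, 2 * np.pi, n_samples))
    for _ in range(n_trials):
        X.append(base + drift + rng.normal(0, 0.02, n_samples))
        y.append(s)
X, y = np.array(X), np.array(y)

# Classify which session each trial came from (chance level ~1/6 ≈ 17%);
# rates well above chance indicate discernable session-specific patterns.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
rate = cross_val_score(clf, X, y, cv=5).mean()
```

A one-on-one variant, as in the paper, would fit the same classifier on each pair of sessions and inspect how the rate grows with the time between them.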
DEFF Research Database (Denmark)
Dholakia, Nikhilesh; Turcan, Romeo V.
2012-01-01
This paper is part of an ongoing project to develop an interdisciplinary metatheory of bubbles, relevant to the contemporary era of globalization and rapid, technology-aided communication flows. Just in the first few years of the 21st century, several bubbles have appeared – the so-called dotcom ...... cultural field where relatively small bubbles may form. Movies represent a good arena to examine cultural bubbles on a scale that is not daunting, and where the hype-hope-hit dynamics can be observed more frequently than in most other settings....
Pedrini, D. T.; Pedrini, Bonnie C.
Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…
Directory of Open Access Journals (Sweden)
Schöning André
2016-01-01
Track reconstruction in high-track-multiplicity environments at current and future high-rate particle physics experiments is a big challenge and very time consuming. The search for track seeds and the fitting of track candidates are usually the most time-consuming steps in track reconstruction. Here, a new and fast track reconstruction method based on hit triplets is proposed, which exploits a three-dimensional fit model including multiple scattering and hit uncertainties from the very start, including the search for track seeds. The hit-triplet-based reconstruction method assumes a homogeneous magnetic field, which allows an analytical solution of the triplet fit. This method is highly parallelizable, needs fewer operations than other standard track reconstruction methods and is therefore ideal for implementation on parallel computing architectures. The proposed track reconstruction algorithm has been studied in the context of the Mu3e experiment and a typical LHC experiment.
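The analytic core of a triplet fit, the circle through three hits in a homogeneous axial field, can be sketched as follows. This omits the multiple-scattering and hit-uncertainty terms that the full triplet fit includes; the momentum relation assumes the usual pT [GeV/c] ≈ 0.3·B [T]·R [m] convention.

```python
import math

def triplet_circle(p1, p2, p3):
    """Radius and center of the circle through three 2D hit positions.

    In a homogeneous axial magnetic field a charged track bends on a circle
    in the transverse plane, so three hits determine the curvature
    analytically (standard circumcircle formula) - the core of a
    triplet-based seed fit.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("collinear hits: infinite radius (straight track)")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = math.hypot(x1 - ux, y1 - uy)
    return r, (ux, uy)

def pt_from_radius(radius_m, b_tesla):
    # Transverse momentum from curvature: pT [GeV/c] ~= 0.3 * B [T] * R [m]
    return 0.3 * b_tesla * radius_m
```

Because each triplet is fit independently from a closed-form expression, triplets can be processed in parallel, which is what makes the method attractive for many-core architectures.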
Directory of Open Access Journals (Sweden)
T N Hagawane
2016-01-01
Results: It was noted that the respiratory rate and tumour necrosis factor-α (TNF-α) levels were significantly higher at 4 h in the dual hit group as compared to the LPS, OA and control groups. Interleukin-6 (IL-6) levels were significantly higher in the dual hit group as compared to LPS at 8 and 24 h, OA at 8 h and control (at all time intervals). IL-1β levels were significantly higher in the LPS and dual hit groups at all time intervals, but not in the OA and control groups. The injury induced in the dual hit group was earlier and more sustained as compared to LPS and OA alone. Interpretation & conclusions: The lung pathology and changes in respiratory function produced by the dual hit model were closer to the diagnostic criteria of ALI/ARDS in terms of clinical manifestations and pulmonary injury, and the injury persisted longer as compared to the LPS and OA single hit models. The dual hit method therefore yields an ARDS model that better matches the diagnostic criteria of ARDS.
Energy Technology Data Exchange (ETDEWEB)
Meade, Roger Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Keksis, August Lawrence [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-10-03
On January 12, 1975, a rock seemed to fall from the sky over New York State’s Schoharie County, hitting the tractor of a local farmer who was “preparing his fields for spring planting.” As the farmer later described the event to a reporter from the UFO INVESTIGATOR, the object glanced off the tractor, fell to the ground, and melted its way through a patch of ice that was two and one half inches thick. The farmer, Leonard Tillapaugh, called the county sheriff, Harvey Stoddard, who recovered the rock, noting that it “was still warm.” Why and how a sample of the rock came to Los Alamos is not known. However, it captivated a wide Laboratory audience and was subjected to rigorous testing and evaluation. Los Alamos used the scientific method in the manner promoted by Hynek. Did Los Alamos solve the mystery of the rock’s origin? Not definitively. Although the exact origin could not be determined, it was shown conclusively that the rock was not from outer space. With that said, the saga of the rock that hit New York came to an end. Nothing more was said or written about it. The principals involved have long since passed from the scene. The NICAP ceased operations in 1980. And the rock itself has disappeared.
Directory of Open Access Journals (Sweden)
Yu-Pin Liao
2017-11-01
In the past few decades, demand forecasting has become relatively difficult due to rapid changes in the global environment. This research illustrates the use of the make-to-stock (MTS) production strategy in order to explain how forecasting plays an essential role in business management. The linear mixed-effect (LME) model has been extensively developed and is widely applied in various fields. However, no study has used the LME model for business forecasting. We suggest that the LME model be used as a tool for prediction and for coping with environmental complexity. The data analysis is based on real data from an international display company, where the company needs accurate demand forecasting before adopting an MTS strategy. The forecasting result from the LME model is compared to commonly used approaches, including the regression model, autoregressive model, time series model, and exponential smoothing model, with the results revealing that the prediction performance provided by the LME model is more stable than that of the other methods. Furthermore, product types in the data are regarded as a random effect in the LME model, hence the demands of all types can be predicted simultaneously using a single LME model. In contrast, some approaches require splitting the data into different type categories and then predicting the type demand by establishing a model for each type. This feature also demonstrates the practicability of the LME model in real business operations.
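How one mixed model can serve all product types at once can be sketched with `statsmodels`. The data below are synthetic: the shared trend, the per-type random intercepts, and the product names are assumptions for illustration, not the company's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Synthetic demand data: 5 hypothetical product types observed over 24 months,
# sharing a common trend but each with its own (random) demand level.
types = [f"type_{i}" for i in range(5)]
type_level = dict(zip(types, rng.normal(0, 20, 5)))
rows = []
for t in types:
    for month in range(24):
        demand = 100 + 2.0 * month + type_level[t] + rng.normal(0, 5)
        rows.append({"ptype": t, "month": month, "demand": demand})
df = pd.DataFrame(rows)

# One LME fits every product type simultaneously: a fixed monthly trend
# plus a random intercept per type (the "product type as random effect" idea).
model = smf.mixedlm("demand ~ month", df, groups=df["ptype"]).fit()
trend = model.params["month"]
```

Forecasts for any type then combine the fixed trend with that type's estimated random effect, so no per-type model needs to be maintained.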
Bukhari, W.; Hong, S.-M.
2016-03-01
The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the radiation treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting respiratory motion in 3D space and realizing a gating function without pre-specifying a particular phase of the patient’s breathing cycle. The algorithm, named EKF-GPRN+, first employs an extended Kalman filter (EKF) independently along each coordinate to predict the respiratory motion and then uses a Gaussian process regression network (GPRN) to correct the prediction error of the EKF in 3D space. The GPRN is a nonparametric Bayesian algorithm for modeling input-dependent correlations between the output variables in multi-output regression. Inference in GPRN is intractable and we employ variational inference with mean field approximation to compute an approximate predictive mean and predictive covariance matrix. The approximate predictive mean is used to correct the prediction error of the EKF. The trace of the approximate predictive covariance matrix is utilized to capture the uncertainty in EKF-GPRN+ prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification enables us to pause the treatment beam over such instances. EKF-GPRN+ implements a gating function by using simple calculations based on the trace of the predictive covariance matrix. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPRN+. The experimental results show that the EKF-GPRN+ algorithm reduces the patient-wise prediction error to 38%, 40% and 40% in root-mean-square, compared to no prediction, at lookahead lengths of 192 ms, 384 ms and 576 ms, respectively. The EKF-GPRN+ algorithm can further reduce the prediction error by employing the gating function, albeit
Directory of Open Access Journals (Sweden)
Lorenzo Moja
Full Text Available To assess the relationship between surgical delay and mortality in elderly patients with hip fracture. Systematic review and meta-analysis of retrospective and prospective studies published from 1948 to 2011. Medline (from 1948), Embase (from 1974), CINAHL (from 1982), and the Cochrane Library. Odds ratios (OR) and 95% confidence intervals for each study were extracted and pooled with a random effects model. Heterogeneity, publication bias, Bayesian analysis, and meta-regression analyses were done. Criteria for inclusion were retrospective and prospective elderly population studies, patients with operated hip fractures, and indication of timing of surgery and survival status. There were 35 independent studies, with 191,873 participants and 34,448 deaths. The majority considered a cut-off between 24 and 48 hours. Early hip surgery was associated with a lower risk of death (pooled odds ratio (OR) 0.74, 95% confidence interval (CI) 0.67 to 0.81; P<0.000) and pressure sores (OR 0.48, 95% CI 0.38 to 0.60; P<0.000). Meta-analysis of the adjusted prospective studies gave similar results. The Bayesian probability predicted that about 20% of future studies might find that early surgery is not beneficial for decreasing mortality. None of the confounders (e.g. age, sex, data source, baseline risk, cut-off points, study location, quality and year) explained the differences between studies. Surgical delay is associated with a significant increase in the risk of death and pressure sores. Conservative timing strategies should be avoided. Orthopaedic surgery services should ensure that the majority of patients are operated on within one or two days.
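The random-effects pooling step described above can be sketched in a few lines. This is a generic DerSimonian-Laird implementation, not the review's actual analysis code; standard errors are recovered from the reported 95% confidence limits on the log scale.

```python
import numpy as np

def pool_random_effects(ors, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    Returns (pooled OR, lower 95% limit, upper 95% limit)."""
    y = np.log(ors)                                  # log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                                  # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - ybar)**2)                    # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                    # between-study variance
    w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    return (np.exp(mu),
            np.exp(mu - 1.96 * se_mu),
            np.exp(mu + 1.96 * se_mu))
```

The illustrative inputs below are invented, not the 35 studies pooled in the review.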
Multiple-hit parameter estimation in monolithic detectors.
Hunter, William C J; Barrett, Harrison H; Lewellen, Tom K; Miyaoka, Robert S
2013-02-01
We examine a maximum-a-posteriori method for estimating the primary interaction position of gamma rays with multiple interaction sites (hits) in a monolithic detector. In assessing the performance of a multiple-hit estimator over that of a conventional one-hit estimator, we consider a few different detector and readout configurations of a 50-mm-wide square cerium-doped lutetium oxyorthosilicate block. For this study, we use simulated data from SCOUT, a Monte-Carlo tool for photon tracking and modeling scintillation-camera output. With this tool, we determine estimate bias and variance for a multiple-hit estimator and compare these with similar metrics for a one-hit maximum-likelihood estimator, which assumes full energy deposition in one hit. We also examine the effect of event filtering on these metrics; for this purpose, we use a likelihood threshold to reject signals that are not likely to have been produced under the assumed likelihood model. Depending on detector design, we observe a 1%-12% improvement of intrinsic resolution for a 1-or-2-hit estimator as compared with a 1-hit estimator. We also observe improved differentiation of photopeak events using a 1-or-2-hit estimator as compared with the 1-hit estimator; more than 6% of photopeak events that were rejected by likelihood filtering for the 1-hit estimator were accurately identified as photopeak events and positioned without loss of resolution by a 1-or-2-hit estimator; for PET, this equates to at least a 12% improvement in coincidence-detection efficiency with likelihood filtering applied.
Recent Improvements in the SHIELD-HIT Code
DEFF Research Database (Denmark)
Hansen, David Christoffer; Lühr, Armin Christian; Herrmann, Rochus
2012-01-01
Purpose: The SHIELD-HIT Monte Carlo particle transport code has previously been used to study a wide range of problems for heavy-ion treatment and has been benchmarked extensively against other Monte Carlo codes and experimental data. Here, an improved version of SHIELD-HIT is developed, concentrating…
Guthrie, Bruce; Clark, Stella A; Reynish, Emma L; McCowan, Colin; Morales, Daniel R
2013-01-01
Regulatory risk communications are an important method for disseminating drug safety information, but their impact varies. Two significant UK risk communications about antipsychotic use in older people with dementia were issued in 2004 and 2009. These varied considerably in their content and dissemination, allowing examination of their differential impact. Segmented regression time-series analysis 2001-2011 for people aged ≥65 years with dementia in 87 Scottish general practices, examining the impact of two pre-specified risk communications in 2004 and 2009 on antipsychotic and other psychotropic prescribing. The percentage of people with dementia prescribed an antipsychotic was 15.9% in quarter 1 2001 and was rising by an estimated 0.6%/quarter before the 2004 risk communication. The 2004 risk communication was sent directly to all prescribers, and specifically recommended review of all patients prescribed relevant drugs. It was associated with an immediate absolute reduction in antipsychotic prescribing of 5.9% (95% CI -6.6 to -5.2) and a change to a stable level of prescribing subsequently. The 2009 risk communication was disseminated in a limited circulation bulletin, and only specifically recommended avoiding initiation if possible. There was no immediate associated impact, but it was associated with a significant decline in prescribing subsequently which appeared driven by a decline in initiation, with the percentage prescribed an antipsychotic falling from 18.4% in Q1 2009 to 13.5% in Q1 2011. There was no widespread substitution of antipsychotics with other psychotropic drugs. The two risk communications were associated with reductions in antipsychotic use, in ways which were compatible with marked differences in their content and dissemination. Further research is needed to ensure that the content and dissemination of regulatory risk communications is optimal, and to track their impact on intended and unintended outcomes. Although rates are falling
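The segmented regression design used in this analysis has a simple algebraic core: a baseline level and trend, an immediate level change at the intervention, and a trend change afterwards. The sketch below is a generic level-change/slope-change OLS fit, not the study's actual model (which would also handle seasonality and autocorrelation).

```python
import numpy as np

def segmented_regression(y, t_break):
    """OLS fit of y_t = b0 + b1*t + b2*step_t + b3*(t - t_break)*step_t,
    where step_t = 1 from the intervention time t_break onwards.
    Returns (baseline level, baseline trend, level change, trend change)."""
    t = np.arange(len(y), dtype=float)
    step = (t >= t_break).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t_break) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

With noiseless synthetic data built from the headline figures in the abstract (15.9% starting level, +0.6%/quarter trend, -5.9% level drop, trend flattening), the fit recovers those coefficients exactly.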
Panel Smooth Transition Regression Models
DEFF Research Database (Denmark)
González, Andrés; Terasvirta, Timo; Dijk, Dick van
We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...
Hilbe, Joseph M
2009-01-01
This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect: great clarity. The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … -Annette J. Dobson, Biometric…
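The estimator at the heart of the subject reviewed above is usually fitted by iteratively reweighted least squares (IRLS). The sketch below is not from the book; it is a minimal textbook IRLS loop, assuming an intercept column is already in the design matrix.

```python
import numpy as np

def logistic_irls(X, y, n_iter=25):
    """Logistic regression via iteratively reweighted least squares.
    X: (n, p) design matrix (include an intercept column); y: 0/1 labels."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))          # current probabilities
        W = p * (1.0 - p)                            # IRLS weights
        z = X @ beta + (y - p) / np.maximum(W, 1e-12)  # working response
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))
    return beta
```

On simulated data with known coefficients the loop converges to estimates close to the truth; it will diverge under perfect separation, a caveat any full treatment of logistic regression covers.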
A Two-Hit Model of Autism: Adolescence as the Second Hit
Picci, Giorgia; Scherf, K. Suzanne
2015-01-01
Adolescence brings dramatic changes in behavior and neural organization. Unfortunately, for some 30% of individuals with autism, there is marked decline in adaptive functioning during adolescence. We propose a two-hit model of autism. First, early perturbations in neural development function as a “first hit” that sets up a neural system that is “built to fail” in the face of a second hit. Second, the confluence of pubertal hormones, neural reorganization, and increasing social demands during adolescence provides the “second hit” that interferes with the ability to transition into adult social roles and levels of adaptive functioning. In support of this model, we review evidence about adolescent-specific neural and behavioral development in autism. We conclude with predictions and recommendations for empirical investigation about several domains in which developmental trajectories for individuals with autism may be uniquely deterred in adolescence. PMID:26609500
Inflammation and the Two-Hit Hypothesis of Schizophrenia
Feigenson, Keith A.; Kusnecov, Alex W.; Silverstein, Steven M.
2014-01-01
The high societal and individual cost of schizophrenia necessitates finding better, more effective treatment, diagnosis, and prevention strategies. One of the obstacles in this endeavor is the diverse set of etiologies that comprises schizophrenia. A substantial body of evidence has grown over the last few decades to suggest that schizophrenia is a heterogeneous syndrome with overlapping symptoms and etiologies. At the same time, an increasing number of clinical, epidemiological, and experimental studies have shown links between schizophrenia and inflammatory conditions. In this review, we analyze the literature on inflammation and schizophrenia, with a particular focus on comorbidity, biomarkers, and environmental insults. We then identify several mechanisms by which inflammation could influence the development of schizophrenia via the two-hit hypothesis. Lastly, we note the relevance of these findings to clinical applications in the diagnosis, prevention, and treatment of schizophrenia. PMID:24247023
Demand spillovers of smash-hit papers: evidence from the 'Male Organ Incident'.
Kässi, Otto; Westling, Tatu
2013-12-01
This study explores the short-run spillover effects of popular research papers. We consider the publicity of 'Male Organ and Economic Growth: Does Size Matter?' as an exogenous shock to economics discussion paper demand, a natural experiment of a sort. In particular, we analyze how the very substantial visibility influenced the downloads of Helsinki Center of Economic Research discussion papers. Difference in differences and regression discontinuity analysis are conducted to elicit the spillover patterns. This study finds that the spillover effect to average economics paper demand is positive and statistically significant. It seems that hit papers increase the exposure of previously less downloaded papers. We find that part of the spillover effect could be attributable to Internet search engines' influence on browsing behavior. Conforming to expected patterns, papers residing on the same web page as the hit paper evidence very significant increases in downloads which also supports the spillover thesis. A11, C21. 97K80.
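The difference-in-differences logic used above reduces, in its simplest 2x2 form, to subtracting the control group's before/after change from the treated group's. The numbers below are invented, not the paper's download data.

```python
import numpy as np

def diff_in_differences(treated_pre, treated_post, control_pre, control_post):
    """Classic 2x2 difference-in-differences estimate:
    (change in treated mean) minus (change in control mean)."""
    return ((np.mean(treated_post) - np.mean(treated_pre))
            - (np.mean(control_post) - np.mean(control_pre)))
```

Here "treated" would be papers exposed to the publicity shock (e.g. those on the same web page as the hit paper) and "control" the remaining papers, with downloads measured before and after the event.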
DEFF Research Database (Denmark)
Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas
2017-01-01
In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface. … As a by-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical…
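The absolute-risk construction from cause-specific hazards can be written out in discrete time: the cumulative incidence of cause 1 accumulates its hazard weighted by the probability of still being event-free. This is a simplified stand-in for the cause-specific Cox machinery described above, assuming per-interval hazards `h1`, `h2` are already given.

```python
import numpy as np

def absolute_risk(h1, h2):
    """Discrete-time cumulative incidence of cause 1 under a competing
    cause 2, from per-interval cause-specific hazards:
    F1(t) = sum_{s<=t} h1(s) * S(s-), with S the event-free survival."""
    h1, h2 = np.asarray(h1), np.asarray(h2)
    surv = np.cumprod(1.0 - h1 - h2)                 # overall survival S(s)
    surv_minus = np.concatenate([[1.0], surv[:-1]])  # S(s-), just before s
    return np.cumsum(h1 * surv_minus)
```

By construction the two cumulative incidences and the survival probability partition the unit mass at every time point, which makes a convenient sanity check.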
Improvements to the ion Doppler spectrometer diagnostic on the HIT-SI experiments
Hossack, Aaron; Chandra, Rian; Everson, Chris; Jarboe, Tom
2018-03-01
An ion Doppler spectrometer diagnostic system measuring impurity ion temperature and velocity on the HIT-SI and HIT-SI3 spheromak devices has been improved with higher spatiotemporal resolution and lower error than previously described devices. Hardware and software improvements to the established technique have resulted in record 6.9 μs temporal and ≤2.8 cm spatial resolution in the midplane of each device. These allow C III and O II flow, displacement, and temperature profiles to be observed simultaneously. With 72 fused-silica fiber channels in two independent bundles, and an f/8.5 Czerny-Turner spectrometer coupled to a video camera, frame rates of up to ten times the imposed magnetic perturbation frequency of 14.5 kHz were achieved in HIT-SI, viewing the upper half of the midplane. In HIT-SI3, frame rates of up to eight times the perturbation frequency were achieved viewing both halves of the midplane. Biorthogonal decomposition is used as a novel filtering tool, reducing uncertainty in ion temperature from ≲13 to ≲5 eV (with an instrument temperature of 8-16 eV) and uncertainty in velocity from ≲2 to ≲1 km/s. Doppler shift and broadening are calculated via the Levenberg-Marquardt algorithm, after which the errors in velocity and temperature are uniquely specified. Axisymmetric temperature profiles on HIT-SI3 for C III, peaking near the inboard current separatrix at ≈40 eV, are observed. Axisymmetric plasma displacement profiles have been measured on HIT-SI3, peaking at ≈6 cm at the outboard separatrix. Both profiles agree with the upper half of the midplane observable by HIT-SI. With its complete midplane view, HIT-SI3 has unambiguously extracted axisymmetric, toroidal-current-dependent rotation of up to 3 km/s. Analysis of the temporal phase of the displacement uncovers a coherent structure locked to the applied perturbation. Previously described diagnostic systems could not achieve such results.
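The Levenberg-Marquardt step in that analysis, in its simplest form, fits a Gaussian line profile whose center gives the Doppler shift and whose width gives the thermal broadening. The sketch below is a toy single-Gaussian fit, not the diagnostic's code (which must handle instrument functions and multiplets); the damping schedule is an assumption.

```python
import numpy as np

def fit_gaussian_lm(x, y, p0, n_iter=50, lam=1e-3):
    """Minimal Levenberg-Marquardt fit of y = a*exp(-(x-mu)^2/(2*s^2)).
    Returns (a, mu, s): amplitude, line center (shift), width (broadening)."""
    a, mu, s = p0
    for _ in range(n_iter):
        e = (x - mu) / s
        g = np.exp(-0.5 * e**2)
        r = y - a * g                                # residuals
        J = np.column_stack([g,                      # d(model)/da
                             a * g * e / s,          # d(model)/dmu
                             a * g * e**2 / s])      # d(model)/ds
        A = J.T @ J + lam * np.eye(3)                # damped normal equations
        step = np.linalg.solve(A, J.T @ r)
        a2, mu2, s2 = np.array([a, mu, s]) + step
        r2 = y - a2 * np.exp(-0.5 * ((x - mu2) / s2)**2)
        if r2 @ r2 < r @ r:                          # accept improving steps
            a, mu, s, lam = a2, mu2, s2, lam * 0.7
        else:                                        # otherwise damp harder
            lam *= 2.0
    return a, mu, s
```

On a noiseless synthetic line with a plausible starting guess the fit recovers the shift and width to high precision.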
Morales, Daniel R; Donnan, Peter T; Daly, Fergus; Staa, Tjeerd Van; Sullivan, Frank M
2013-01-01
Objectives To measure the incidence of Bell's palsy and determine the impact of clinical trial findings on Bell's palsy management in the UK. Design Interrupted time series regression analysis and incidence measures. Setting General practices in the UK contributing to the Clinical Practice Research Datalink (CPRD). Participants Patients ≥16 years with a diagnosis of Bell's palsy between 2001 and 2012. Interventions (1) Publication of the 2004 Cochrane reviews of clinical trials on corticosteroids and antivirals for Bell's palsy, which made no clear recommendation on their use and (2) publication of the 2007 Scottish Bell's Palsy Study (SBPS), which made a clear recommendation that treatment with prednisolone alone improves chances for complete recovery. Main outcome measures Incidence of Bell's palsy per 100 000 person-years. Changes in the management of Bell's palsy with either prednisolone therapy, antiviral therapy, combination therapy (prednisolone with antiviral therapy) or untreated cases. Results During the 12-year period, 14 460 cases of Bell's palsy were identified with an overall incidence of 37.7/100 000 person-years. The 2004 Cochrane reviews were associated with immediate falls in prednisolone therapy (−6.3% (−11.0 to −1.6)), rising trends in combination therapy (1.1% per quarter (0.5 to 1.7)) and falling trends for untreated cases (−0.8% per quarter (−1.4 to −0.3)). SBPS was associated with immediate increases in prednisolone therapy (5.1% (0.9 to 9.3)) and rising trends in prednisolone therapy (0.7% per quarter (0.4 to 1.2)); falling trends in combination therapy (−1.7% per quarter (−2.2 to −1.3)); and rising trends for untreated cases (1.2% per quarter (0.8 to 1.6)). Despite improvements, 44% still remain untreated. Conclusions SBPS was clearly associated with change in management, but a significant proportion of patients failed to receive effective treatment, which cannot be fully explained. Clarity and uncertainty in
International Nuclear Information System (INIS)
Urrutia, J D; Bautista, L A; Baccay, E B
2014-01-01
The aim of this study was to develop mathematical models for estimating earthquake casualties such as death, number of injured persons, affected families and total cost of damage. To quantify the direct damages from earthquakes to human beings and properties given the magnitude, intensity, depth of focus, location of epicentre and time duration, the regression models were made. The researchers formulated models through regression analysis using matrices and used α = 0.01. The study considered thirty destructive earthquakes that hit the Philippines from the inclusive years 1968 to 2012. Relevant data about these said earthquakes were obtained from Philippine Institute of Volcanology and Seismology. Data on damages and casualties were gathered from the records of National Disaster Risk Reduction and Management Council. This study will be of great value in emergency planning, initiating and updating programs for earthquake hazard reduction in the Philippines, which is an earthquake-prone country.
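Regression "through matrices", as described above, is the textbook normal-equations estimator, with t-statistics available for an α = 0.01 significance screen. The sketch below uses synthetic data, not the earthquake records.

```python
import numpy as np

def ols_with_tstats(X, y):
    """OLS via the normal equations beta = (X'X)^{-1} X'y, with
    t-statistics for each coefficient. X: predictors without intercept."""
    X = np.column_stack([np.ones(len(y)), X])        # prepend intercept
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    n, p = X.shape
    s2 = resid @ resid / (n - p)                     # residual variance
    se = np.sqrt(s2 * np.diag(XtX_inv))              # coefficient std errors
    return beta, beta / se                           # estimates, t-statistics
```

A coefficient would be retained at α = 0.01 when its |t| exceeds the corresponding critical value for n - p degrees of freedom.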
Current insights into the laboratory diagnosis of HIT.
Bakchoul, T; Zöllner, H; Greinacher, A
2014-06-01
Heparin-induced thrombocytopenia (HIT) is an adverse drug reaction and prothrombotic disorder caused by immunization against platelet factor 4 (PF4) after complex formation with heparin or other polyanions. After antibody binding to PF4/heparin complexes, HIT antibodies are capable of intravascular platelet activation by cross-linking Fc gamma receptor IIa (FcγRIIa) on the platelet surface leading to a platelet count decrease and/or thrombosis. In contrast to most other immune-mediated disorders, the currently available laboratory tests for anti-PF4/heparin antibodies show a high sensitivity also for clinically irrelevant antibodies. This makes the diagnosis of HIT challenging and bears the risk to substantially overdiagnose HIT. The strength of the antigen assays for HIT is in ruling out HIT when the test is negative. Functional assays have a higher specificity for clinically relevant antibodies, but they are restricted to specialized laboratories. Currently, a Bayesian approach combining the clinical likelihood estimation for HIT with laboratory tests is the most appropriate approach to diagnose HIT. In this review, we give an overview on currently available diagnostic procedures and discuss their limitations. © 2014 John Wiley & Sons Ltd.
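The Bayesian approach mentioned above amounts to Bayes' rule in odds form: a clinical pre-test probability (e.g. from the 4Ts score) is updated by the likelihood ratio of the laboratory result. The sketch below is generic; the example probabilities and likelihood ratios are illustrative assumptions, not published test characteristics.

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Bayes' rule in odds form: post-test odds = pre-test odds * LR.
    Converts a pre-test probability plus a test likelihood ratio into
    a post-test probability."""
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

With an assumed intermediate pre-test probability of 0.14 and an assumed negative-test likelihood ratio of 0.05, the post-test probability falls below 1%, which is the quantitative sense in which a negative antigen assay "rules out" HIT.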
Vermeulen, E.; Stronks, K.; Visser, M de; Brouwer, I.A.; Schene, A.H.; Mocking, R.J.T.; Colpo, M.; Bandinelli, S.; Ferrucci, L.; Nicolaou, M.
2016-01-01
This study aimed to identify dietary patterns using reduced rank regression (RRR) and to explore their associations with depressive symptoms over 9 years in the Invecchiare in Chianti study. At baseline, 1362 participants (55.4 % women) aged 18-102 years (mean age 68 (sd 15.5) years) were included
DEFF Research Database (Denmark)
Johansen, Søren
2008-01-01
The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
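The estimator described above can be sketched in its simplest unweighted form: fit OLS, then project the coefficient matrix onto the leading singular subspace of the fitted values. This is a generic reduced rank regression sketch, not Johansen's canonical-correlation formulation (they coincide only under identity weighting).

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Unweighted reduced rank regression: project the OLS coefficient
    matrix onto the best rank-r subspace, obtained from an SVD of the
    OLS fitted values (Reinsel-Velu style)."""
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)    # unrestricted fit
    fitted = X @ B_ols
    U, s, Vt = np.linalg.svd(fitted, full_matrices=False)
    V_r = Vt[:rank].T                                # leading right singular vectors
    return B_ols @ V_r @ V_r.T                       # rank-r coefficient matrix
```

On noiseless data generated from a rank-1 coefficient matrix, the rank-1 estimate recovers it exactly.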
42 CFR 495.340 - As-needed HIT PAPD update and as-needed HIT IAPD update requirements.
2010-10-01
... limited to any of the following: (a) A projected cost increase of $100,000 or more. (b) A schedule... implementation approach, or scope of activities beyond that approved in the HIT planning advance planning document or the HIT implementation advance planning document. (d) A change in implementation concept or a...
Aggressive B cell Lymphoma: Optimal Therapy for MYC-positive, Double-Hit, and Triple-Hit DLBCL.
Dunleavy, Kieron
2015-12-01
Approximately 10% of cases of diffuse large B cell lymphoma (DLBCL) harbor a MYC rearrangement and this has been associated with an inferior outcome following standard therapy across many different studies. Double-hit and triple-hit lymphomas harbor concurrent rearrangements of MYC and BCL2 and/or BCL6 and are also associated with a very aggressive course and poor clinical outcome. It is unclear and there is lack of consensus on how these diseases should be approached therapeutically. They are characterized typically by high tumor proliferation and likely require Burkitt lymphoma-type strategies and several retrospective studies suggest that more intensive approaches than rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP) may be beneficial. One challenge in this respect is that most patients with these diseases are older than 60 years and generally have poor tolerability of regimens typically used in Burkitt lymphoma. Dose-adjusted EPOCH-R is an alternative effective immunochemotherapy platform for DLBCL and is effective in Burkitt lymphoma, and retrospective studies suggest that it is effective and feasible in patients with DLBCL that harbors a MYC rearrangement with or without a BCL-2 translocation (double-hit). A multicenter study of this approach in MYC-rearranged DLBCL is ongoing and preliminary results are very encouraging. There is a lack of consensus on the role of consolidation stem cell transplantation in patients who achieve a good response to initial therapy but at this point in time, no (retrospective) studies have demonstrated any benefit. These diseases are also associated with a high rate of CNS involvement and progression and checking for cerebrospinal fluid by cytology and flow cytometry at initial diagnosis should be considered. In summary, based on retrospective data and preliminary prospective data (as more mature data is awaited), while Burkitt-type regimens may be feasible in young patients, DA-EPOCH-R is a
College Radio Hits the Big Time in the Music Industry.
Greene, Elizabeth
1989-01-01
In the last decade, college radio has begun to play music too experimental for commercial radio, and people searching for innovative or controversial music are tuning into college stations. The music industry has welcomed the student broadcasters, many of whom enter the profession after college. (MSE)
Hit me baby one more time : a haptic rating interface
Bartneck, C.; Athanasiadou, P.; Kanda, T.; Jacko, J.A.
2007-01-01
As the importance of recommender systems increases, in combination with the explosion in data available over the internet and in our own digital libraries, we suggest an alternative method of providing explicit user feedback. We create a tangible interface, which will not only facilitate
DEFF Research Database (Denmark)
Johansen, Søren
There are simple well-known conditions for the validity of regression and correlation as statistical tools. We analyse by examples the effect of nonstationarity on inference using these methods and compare them to model based inference. Finally we analyse some data on annual mean temperature...... and sea level, by applying the cointegrated vector autoregressive model, which explicitly takes into account the nonstationarity of the variables....
The first neutron beam hits EAR2
Antonella Del Rosso
2014-01-01
On 25 July 2014, about a year after construction work began, the Experimental Area 2 (EAR2) of CERN’s neutron facility n_TOF recorded its first beam. Unique in many aspects, EAR2 will start its rich programme of experimental physics this autumn. The last part of the EAR2 beamline: the neutrons come from the underground target and reach the top of the beamline, where they hit the samples. Built about 20 metres above the neutron production target, EAR2 is in fact a bunker connected to the n_TOF underground facilities via a duct 80 cm in diameter, where the beamline is installed. The feet of the bunker support pillars are located on the concrete structure of the n_TOF tunnel and part of the structure lies above the old ISR building. A beam dump located on the roof of the building completes the structure. Neutrons are used by physicists to study neutron-induced reactions with applications in a number of fields, including nuclear waste transmutation, nuclear technology, nuclear astrop...
Jiang, Jie; Li, Youping; Huang, Xiaolin; Li, Bing; Su, Lin; Zhong, Dake; Shi, Chenghu; Li, Mingxu; Shan, Juan; Chen, Yin
2012-08-01
Critical injury treatment in the hardest-hit areas after a great earthquake was retrospectively analyzed to determine how best to reduce mortality and disability and increase the rehabilitation rate through post-quake medical relief. Primary and secondary sources were comprehensively retrieved and analyzed retrospectively. According to incomplete data, 30,620 injured were rescued through self-help in the hardest-hit areas in the 72 hours immediately following the earthquake. Critically injured patients accounted for 22% of total inpatients. Mortality rates declined with greater distance from the epicenter: 12.21% for municipal healthcare centers in the hardest-hit areas, 4.50% for municipal medical units in peripheral quake-hit areas, 2.50% for provincial medical units in peripheral quake-hit areas, and 2.17% for Ministry of Health-affiliated hospitals in peripheral quake-hit areas. Second-line hospitals received far more injured with fractures of the trunk, limbs, or unspecified sites, severe conditions, and other non-traumatic diseases than first-line hospitals, which treated the more severely injured. Among 10,373 injured in stable condition transferred to third-line hospitals, 99.07% were discharged within four months, while the mortality rate was 0.017%. The medical relief model of "supervising body helping subordinate unit, severely stricken areas assisting hardest-hit areas, least-hit areas supporting both hardest-hit and severely stricken areas, and self-help and mutual assistance between hardest-hit areas" was roughly established for the injured from severely stricken areas after the Wenchuan Earthquake. The "four-centralization" treatment principle, which referred to concentrating patients, experts, resources and treatment for those injured in critical condition, effectively reduced mortality from 15.06% to 2.9%. Timely, scientific, and standard on-site triage and postmedical
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
Prediction of Potential Hit Song and Musical Genre Using Artificial Neural Networks
Monterola, Christopher; Abundo, Cheryl; Tugaff, Jeric; Venturina, Lorcel Ericka
Accurately quantifying the goodness of music based on the seemingly subjective taste of the public is a multi-million industry. Recording companies can make sound decisions on which songs or artists to prioritize if accurate forecasting is achieved. We extract 56 single-valued musical features (e.g. pitch and tempo) from 380 Original Pilipino Music (OPM) songs (190 are hit songs) released from 2004 to 2006. Based on an effect size criterion which measures a variable's discriminating power, the 20 highest ranked features are fed to a classifier tasked to predict hit songs. We show that regardless of musical genre, a trained feed-forward neural network (NN) can predict potential hit songs with an average accuracy of ΦNN = 81%. The accuracy is about +20% higher than those of standard classifiers such as linear discriminant analysis (LDA, ΦLDA = 61%) and classification and regression trees (CART, ΦCART = 57%). Both LDA and CART are above the proportional chance criterion (PCC, ΦPCC = 50%) but are slightly below the suggested acceptable classifier requirement of 1.25*ΦPCC = 63%. Utilizing a similar procedure, we demonstrate that different genres (ballad, alternative rock or rock) of OPM songs can be automatically classified with near perfect accuracy using LDA or NN but only around 77% using CART.
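The screening step described above (ranking features by an effect-size criterion before feeding the top-ranked ones to a classifier) can be sketched in a few lines. The data below are synthetic stand-ins for the extracted musical features, and Cohen's d is used as an assumed effect-size measure; the paper's actual criterion and features are not reproduced here.

```python
import numpy as np

def cohens_d(a, b):
    """Effect size: absolute standardized mean difference between two groups."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return abs(a.mean() - b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)
n = 200
# Synthetic stand-in: feature 0 discriminates hits from non-hits, feature 1 is noise.
hits = np.column_stack([rng.normal(1.0, 1.0, n), rng.normal(0.0, 1.0, n)])
non_hits = np.column_stack([rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)])

scores = [cohens_d(hits[:, j], non_hits[:, j]) for j in range(hits.shape[1])]
ranking = np.argsort(scores)[::-1]  # features ordered by discriminating power
```

The highest-ranked features would then be passed to the classifier of choice, as the abstract describes for the feed-forward network.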
Forecasting with Dynamic Regression Models
Pankratz, Alan
2012-01-01
One of the most widely used tools in statistical forecasting, the single-equation regression model, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.
Black, Anne; Heimerl, Susanne; Oertli, Linnéa; Wilczek, Wolf; Greinacher, Andreas; Spannagl, Michael; Herr, Wolfgang; Hart, Christina
2017-10-01
Heparin-induced thrombocytopenia (HIT) is a rare cause of thrombocytopenia and a potentially life-threatening adverse drug reaction. Clinical overdiagnosis of HIT results in costly laboratory tests and anticoagulation. Criteria and algorithms for diagnosis are established, but their translation into clinical practice is still challenging. In a retrospective approach we studied all HIT-related laboratory test requests within four years and evaluated data before (1st period, 24 months) and after (2nd period, 24 months) replacing the particle gel immunoassay (PaGIA) and enzyme-linked immunosorbent assay (ELISA) with a chemiluminescent immunoassay (CLIA). HIT was confirmed by the heparin-induced platelet activation (HIPA) test. Clinical pretest probability for HIT, using an implemented simplified 4Ts score, and platelet count were evaluated. Costs for laboratory tests and alternative anticoagulation were calculated. In 1850 patients with suspected HIT, 2327 laboratory orders were performed. In 87.2% of these orders an intermediate/high simplified 4Ts score was found. Thrombocytopenia was present in 87.1%. After replacing PaGIA and ELISA with CLIA, the number of immunological and functional laboratory tests was reduced by 38.2%. The number of positive HIT immunoassays declined from 22.6% to 6.0%, while the number of positive HIPA tests among positive immunological tests increased by 19%. Altogether, acute HIT was confirmed in 59 patients. A decline in the use of alternative anticoagulants was observed in the 2nd period. Our study shows that in a university hospital setting HIT is well known, but diagnosis requires precise laboratory confirmation. Replacing PaGIA and ELISA with CLIA did not influence laboratory ordering behavior but resulted in reduced overall costs for laboratory diagnostics and alternative anticoagulation. Copyright © 2017 Elsevier Ltd. All rights reserved.
DEFF Research Database (Denmark)
Fitzenberger, Bernd; Wilke, Ralf Andreas
2015-01-01
Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression based …
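The quantile idea can be sketched numerically: the conditional tau-quantile is the minimizer of the asymmetric pinball (check) loss, so fitting the same linear model at different tau values reveals whether a regressor's effect differs across quantiles. Everything below (data, coefficients, optimizer choice) is an invented illustration, not the introduction's own example.

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(params, X, y, tau):
    """Asymmetric check loss; its minimizer estimates the conditional tau-quantile."""
    r = y - X @ params
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

def quantile_regression(x, y, tau):
    X1 = np.column_stack([np.ones(len(y)), x])  # add intercept
    res = minimize(pinball_loss, np.zeros(X1.shape[1]),
                   args=(X1, y, tau), method="Nelder-Mead")
    return res.x

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 500)
y = 2.0 * x + rng.normal(0, 1 + 0.3 * x)  # heteroscedastic noise: spread grows with x

b10 = quantile_regression(x, y, 0.10)
b90 = quantile_regression(x, y, 0.90)
# The fitted slopes differ across quantiles, although the conditional mean slope is constant.
```

Here a mean regression would report a single slope near 2, while the 0.1- and 0.9-quantile slopes bracket it, exactly the kind of distributional detail the abstract describes.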
Schenk, Liam N.; Anderson, Chauncey W.; Diaz, Paul; Stewart, Marc A.
2016-12-22
Executive SummarySuspended-sediment and total phosphorus loads were computed for two sites in the Upper Klamath Basin on the Wood and Williamson Rivers, the two main tributaries to Upper Klamath Lake. High temporal resolution turbidity and acoustic backscatter data were used to develop surrogate regression models to compute instantaneous concentrations and loads on these rivers. Regression models for the Williamson River site showed strong correlations of turbidity with total phosphorus and suspended-sediment concentrations (adjusted coefficients of determination [Adj R2]=0.73 and 0.95, respectively). Regression models for the Wood River site had relatively poor, although statistically significant, relations of turbidity with total phosphorus, and turbidity and acoustic backscatter with suspended sediment concentration, with high prediction uncertainty. Total phosphorus loads for the partial 2014 water year (excluding October and November 2013) were 39 and 28 metric tons for the Williamson and Wood Rivers, respectively. These values are within the low range of phosphorus loads computed for these rivers from prior studies using water-quality data collected by the Klamath Tribes. The 2014 partial year total phosphorus loads on the Williamson and Wood Rivers are assumed to be biased low because of the absence of data from the first 2 months of water year 2014, and the drought conditions that were prevalent during that water year. Therefore, total phosphorus and suspended-sediment loads in this report should be considered as representative of a low-water year for the two study sites. Comparing loads from the Williamson and Wood River monitoring sites for November 2013–September 2014 shows that the Williamson and Sprague Rivers combined, as measured at the Williamson River site, contributed substantially more suspended sediment to Upper Klamath Lake than the Wood River, with 4,360 and 1,450 metric tons measured, respectively.Surrogate techniques have proven useful at
Do pigeons prefer alternatives that include near-hit outcomes?
Stagner, Jessica P; Case, Jacob P; Sticklen, Mary F; Duncan, Amanda K; Zentall, Thomas R
2015-07-01
Pigeons show suboptimal choice on a gambling-like task similar to that shown by humans. Humans also show a preference for gambles in which there are near hits (losses that come close to winning). In the present research, we asked if pigeons would show a preference for alternatives with near-hit-like trials. In Experiment 1, we included an alternative that presented a near hit, in which a stimulus associated with reinforcement (a presumed conditioned reinforcer) changed to a stimulus associated with the absence of reinforcement (a presumed conditioned inhibitor). The pigeons tended to avoid this alternative. In Experiment 2, we varied the duration of the presumed conditioned reinforcer (2 vs. 8 s) that changed to a presumed conditioned inhibitor (8 vs. 2 s) and found that the longer the conditioned reinforcer was presented, the more the pigeons avoided it. In Experiment 3, the near-hit alternative involved an ambiguous stimulus for 8 s that changed to a presumed conditioned reinforcer (or a presumed conditioned inhibitor) for 2 s, but the pigeons still avoided it. In Experiment 4, we controlled for the duration of the conditioned reinforcer by presenting it first for 2 s followed by the ambiguous stimulus for 8 s. Once again, the pigeons avoided the alternative with the near-hit trials. In all 4 experiments, the pigeons tended to avoid alternatives that provided near-hit-like trials. We concluded that humans may be attracted to near-hit trials because near-hit trials give them the illusion of control, whereas this does not appear to be a factor for pigeons. (c) 2015 APA, all rights reserved.
Combined hit theory-microdosimetric explanation of cellular radiobiological action
International Nuclear Information System (INIS)
Bond, V.P.; Varma, M.N.
1983-01-01
Hit theory is combined with microdosimetry in a stochastic approach that explains the observed responses of cell populations exposed in radiation fields of different qualities. The central thesis is that to expose a population of cells in a low-level radiation field is to subject the cells to the potential for interaction with charged particles in the vicinity of the cells, quantifiable in terms of the charged particle fluence theta. When such an interaction occurs there is a resulting stochastic transfer of energy to a critical volume (CV) of cross section sigma, within the cell(s). The severity of cell injury is dependent on the amount of energy thus imparted, or the hit size. If the severity is above some minimal level, there is a non-zero probability that the injury will result in a quantal effect (e.g., a mutational or carcinogenic initial event, cell transformation). A microdosimetric proportional counter, viewed here as a phantom cell CV that permits measurements not possible in the living cell, is used to determine the incidence of hit cells and the spectrum of hit sizes. Each hit is then weighted on the basis of an empirically-determined function that provides the fraction of cells responding quantally, as a function of hit size. The sum of the hits so weighted provides the incidence of quantally-responding cells, for any amount of exposure theta in a radiation field of any quality or mixture of qualities. The hit size weighting function for pink mutations in Tradescantia is discussed, as are its implications in terms of a replacement for RBE and dose equivalent. 14 references, 9 figures
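The weighted sum described above can be sketched as a short calculation: bin the measured hit-size spectrum, weight each bin by the fraction of cells responding quantally at that hit size, and sum. Both the spectrum and the sigmoidal weighting function below are invented placeholders; in the paper the weighting function is determined empirically, not assumed.

```python
import numpy as np

# Hypothetical hit-size bins and the number of hit cells observed per bin
# (stand-ins for proportional-counter measurements).
hit_size = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
n_hits = np.array([1000, 600, 300, 100, 20])

def weighting_function(z, z0=2.0, k=2.0):
    """Assumed sigmoidal weighting: fraction of cells responding quantally
    at hit size z. The real function is empirically determined."""
    return 1.0 / (1.0 + (z0 / z) ** k)

# Incidence of quantally responding cells = sum of hits weighted by response fraction.
quantal_incidence = np.sum(n_hits * weighting_function(hit_size))
```

The result scales with exposure theta, since the number of hits per bin does, which is why the approach applies to any amount of exposure in a field of any quality.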
Logic regression and its extensions.
Schwender, Holger; Ruczinski, Ingo
2010-01-01
Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
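The search that logic regression performs can be illustrated with a deliberately tiny version: score simple Boolean terms (Xi AND Xj, Xi OR Xj) of binary predictors by how well they explain a binary outcome. Real logic regression searches over full logic trees (typically with simulated annealing) inside a generalized linear model; this exhaustive two-variable scan, with invented data, only shows the flavor of the idea.

```python
import numpy as np
from itertools import combinations

def best_logic_term(X, y):
    """Toy search: score every two-variable AND/OR term by agreement with y."""
    best_name, best_acc = None, -1.0
    for i, j in combinations(range(X.shape[1]), 2):
        for name, term in ((f"X{i} AND X{j}", X[:, i] & X[:, j]),
                           (f"X{i} OR X{j}", X[:, i] | X[:, j])):
            acc = np.mean(term == y)
            if acc > best_acc:
                best_name, best_acc = name, acc
    return best_name, best_acc

rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(300, 5))   # binary predictors, e.g. SNP indicators
y = (X[:, 1] & X[:, 3]).copy()          # true interaction: X1 AND X3
y ^= rng.random(300) < 0.05             # flip 5% of outcomes as noise

term, acc = best_logic_term(X, y)       # recovers the interacting pair
```

In the full method the selected logic expressions enter a regression framework, so the same machinery handles binary, numeric, and time-to-event outcomes as the chapter notes.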
Rare transformation to double hit lymphoma in Waldenstrom's macroglobulinemia.
Okolo, Onyemaechi N; Johnson, Ariel C; Yun, Seongseok; Arnold, Stacy J; Anwer, Faiz
2017-08-01
Waldenström macroglobulinemia (WM) is a lymphoproliferative lymphoma that is characterized by monoclonal immunoglobulin M (IgM) protein and bone marrow infiltration. Its incidence is rare, and rarer still is its ability to transform to a B-cell lymphoma, particularly the aggressive diffuse large B-cell lymphoma (DLBCL), which portends a poor prognosis. When transformation includes mutations of MYC together with BCL-2 and/or BCL-6, it is known as a 'double hit' or 'triple hit' lymphoma, respectively. This paper presents a rare case of WM with mutations positive for MYC and BCL2, making it a case of double-hit B-cell lymphoplasmacytic lymphoma with plasmacytic differentiation without morphological transformation to an aggressive histology such as DLBCL. The paper also broadens to include discussions of current topics in the classification, diagnosis, possible causes of transformation, and treatment of WM, including transformation to double hit lymphoma. The significance of this case lies in the fact that double hit lymphoma-like genetic mutations in WM have not been previously described in the literature, and such changes are potentially a harbinger of extra-nodal presentation, aggressive growth, and possibly poor prognosis, if data from other double-hit lymphomas are extrapolated.
Overview of the HIT-SI3 spheromak experiment
Hossack, A. C.; Jarboe, T. R.; Chandra, R. N.; Morgan, K. D.; Sutherland, D. A.; Everson, C. J.; Penna, J. M.; Nelson, B. A.
2017-10-01
The HIT-SI and HIT-SI3 spheromak experiments (a = 23 cm) study efficient, steady-state current drive for magnetic confinement plasmas using a novel method which is ideal for low aspect ratio, toroidal geometries. Sustained spheromaks show coherent, imposed plasma motion and low plasma-generated mode activity, indicating stability. Analysis of surface magnetic fields in HIT-SI indicates large n = 0 and 1 mode amplitudes and little energy in higher modes. Within measurement uncertainties all the n = 1 energy is imposed by the injectors, rather than being plasma-generated. The fluctuating field imposed by the injectors is sufficient to sustain the toroidal current through dynamo action whereas the plasma-generated field is not (Hossack et al., Phys. Plasmas, 2017). Ion Doppler spectroscopy shows coherent, imposed plasma motion inside r 10 cm in HIT-SI and a smaller volume of coherent motion in HIT-SI3. Coherent motion indicates the spheromak is stable and a lack of plasma-generated n = 1 energy indicates the maximum q is maintained below 1 for stability during sustainment. In HIT-SI3, the imposed mode structure is varied to test the plasma response (Hossack et al., Nucl. Fusion, 2017). Imposing n = 2, n = 3, or large, rotating n = 1 perturbations is correlated with transient plasma-generated activity. Work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, under Award Number DE-FG02-96ER54361.
Usenko, Vasiliy S; Svirin, Sergey N; Shchekaturov, Yan N; Ponarin, Eduard D
2014-04-04
Many studies have investigated the impact of a wide range of social events on suicide-related behaviour. However, these studies have predominantly examined national events. The aim of this study is to provide a statistical evaluation of the relationship between mass gatherings in some relatively small urban sub-populations and the general suicide rates of a major city. The data were gathered in the Ukrainian city of Dnipropetrovsk, with a population of 1 million people, in 2005-2010. Suicide attempts, suicides, and the total amount of suicide-related behaviours were registered daily for each sex. Bivariate and multivariate statistical analyses, including negative binomial regression, were applied to assess the risk of suicide-related behaviour in the city's general population for 7 days before and after 427 mass gatherings, such as concerts, football games, and non-regular mass events organized by the Orthodox Church and new religious movements. The bivariate and multivariate statistical analyses found significant changes in some suicide-related behaviour rates in the city's population after certain kinds of mass gatherings. In particular, we observed an increased relative risk (RR) of male suicide-related behaviour after a home defeat of the local football team (RR = 1.32, p = 0.047; regression coefficient beta = 0.371, p = 0.002), an increased risk of male suicides (RR = 1.29, p = 0.006; beta = 0.255, p = 0.002), and an increased risk of male suicide-related behaviour (RR = 1.25, p = 0.019; beta = 0.251, p …). Although football games and mass events organized by new religious movements involved a relatively small part of the urban population (1.6 and 0.3%, respectively), we observed a significant increase in some suicide-related behaviour rates in the whole population. It is likely that the observed effect on suicide-related behaviour is related to one's personal presence at the event rather than to its broadcast. Our findings can be explained largely in
Creel, Scott; Creel, Michael
2009-11-01
1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. Coupled with predictions for Montana from global and regional climate models, our results
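The instrumental-variables idea in point 3 can be sketched with two-stage least squares (2SLS): project the endogenous regressor on the instrument, then regress the outcome on the fitted values. The simulated data and coefficients below are invented; they mimic the generic situation the authors describe, where a regressor is correlated with the error term (as population counts measured with sampling error are), and are not the elk data.

```python
import numpy as np

def two_sls(y, x, z):
    """Two-stage least squares with instrument z for endogenous regressor x.
    Stage 1: project x on z; stage 2: regress y on the stage-1 fitted values."""
    Z = np.column_stack([np.ones_like(z), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    Xh = np.column_stack([np.ones_like(x_hat), x_hat])
    return np.linalg.lstsq(Xh, y, rcond=None)[0][1]

rng = np.random.default_rng(3)
n = 20000
z = rng.normal(size=n)                      # instrument: correlated with x, not with u
u = rng.normal(size=n)                      # shared shock (e.g. sampling error)
x = 0.8 * z + u + rng.normal(size=n)        # regressor correlated with the error term
y = 1.0 - 0.5 * x + u + rng.normal(size=n)  # true slope is -0.5

X = np.column_stack([np.ones(n), x])
ols = np.linalg.lstsq(X, y, rcond=None)[0][1]  # biased: attenuated toward zero
iv = two_sls(y, x, z)                          # consistent under IV assumptions
```

This is the attenuation the authors warn about: treating sampling error as process error biases the estimated density dependence, while the IV estimator remains consistent with fewer distributional assumptions than a state-space model.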
Introduction to regression graphics
Cook, R Dennis
2009-01-01
Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques such as plot rotation. The authors have written their own regression code in the Xlisp-Stat language, called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is available.
Alternative Methods of Regression
Birkes, David
2011-01-01
Of related interest: Nonlinear Regression Analysis and its Applications, Douglas M. Bates and Donald G. Watts. "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models...highly recommend[ed]...for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics. This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s
Lange, Lydia L
2005-05-01
Scientific publications tend to be forgotten quickly. A few works, however, are still cited 100 years and more after their publication. The author used bibliometric methods to compare "hits" (works noticed by the scientific community soon after their publication) with "missed signals" (works that went unnoticed until much later) by investigating 2 psychological journals founded in the 1890s: Zeitschrift für Psychologie and Psychological Review. All articles that were published in either of these journals up to 1920 and cited more than 25 times in the Web of Science up to the year 2000 were considered for inclusion in the analysis. It emerged that hits corresponded more closely to the focus of scientific attention at the time of the publications than missed signals.
Single-hit mechanism of tumour cell killing by radiation.
Chapman, J D
2003-02-01
To review the relative importance of the single-hit mechanism of radiation killing for tumour response to 1.8-2.0 Gy day(-1) fractions and to low dose-rate brachytherapy. Tumour cell killing by ionizing radiation is well described by the linear-quadratic equation that contains two independent components distinguished by dose kinetics. Analyses of tumour cell survival curves that contain six or more dose points usually provide good estimates of the alpha- and beta-inactivation coefficients. Superior estimates of tumour cell intrinsic radiosensitivity are obtained when synchronized populations are employed. The characteristics of single-hit inactivation of tumour cells are reviewed and compared with the characteristics of beta-inactivation. Potential molecular targets associated with single-hit inactivation are discussed along with strategies for potentiating cell killing by this mechanism. The single-hit mechanism of tumour cell killing shows no dependence on dose-rate and, consequently, no evidence of sublethal damage repair. It is uniquely potentiated by high linear-energy-transfer radiation, exhibits a smaller oxygen enhancement ratio and exhibits a larger indirect effect by hydroxyl radicals than the beta-mechanism. alpha-inactivation coefficients vary slightly throughout interphase but mitotic cells exhibit extremely high alpha-coefficients in the range of those observed for lymphocytes and some repair-deficient cells. Evidence is accumulating to suggest that chromatin in compacted form could be a radiation-hypersensitive target associated with single-hit radiation killing. Analyses of tumour cell survival curves demonstrate that it is the single-hit mechanism (alpha) that determines the majority of cell killing after doses of 2 Gy and that this mechanism is highly variable between tumour cell lines. The characteristics of single-hit inactivation are qualitatively and quantitatively distinct from those of beta-inactivation. Compacted chromatin in tumour cells
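The linear-quadratic equation referred to throughout is S(D) = exp(-(alpha*D + beta*D^2)), where the alpha term is the single-hit component and the beta term the two-hit component. A minimal sketch with assumed coefficients (an alpha/beta ratio of 10 Gy, typical of many tumour lines but not taken from this review) shows why the single-hit term dominates at a standard 2 Gy fraction:

```python
import math

def surviving_fraction(dose, alpha, beta):
    """Linear-quadratic model: S(D) = exp(-(alpha*D + beta*D^2))."""
    return math.exp(-(alpha * dose + beta * dose ** 2))

# Illustrative coefficients only (assumptions, not values from the review).
alpha = 0.30   # Gy^-1, single-hit (dose-rate independent) component
beta_ = 0.03   # Gy^-2, two-hit component
D = 2.0        # a standard 2 Gy fraction

single_hit_kill = alpha * D       # 0.60 in -ln(S) units
two_hit_kill = beta_ * D ** 2     # 0.12 in -ln(S) units
S = surviving_fraction(D, alpha, beta_)
```

With these numbers the alpha term accounts for five times more log-kill than the beta term at 2 Gy, consistent with the review's conclusion that the single-hit mechanism determines the majority of cell killing at conventional fraction sizes.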
Scientific impact: the story of your big hit
Sinatra, Roberta; Wang, Dashun; Deville, Pierre; Song, Chaoming; Barabasi, Albert-Laszlo
2014-03-01
A gradual increase in performance through learning and practice characterizes most trades, from sport to music or engineering, and common sense suggests this to be true in science as well. This prompts us to ask: what are the precise patterns that lead to scientific excellence? Does performance indeed improve throughout a scientific career? Are there quantifiable signs of an impending scientific hit? Using citation-based measures as a proxy of impact, we show that (i) major discoveries are not preceded by works of increasing impact, nor are they followed by work of higher impact, (ii) the precise time ranking of the highest impact work in a scientist's career is uniformly random, with the higher probability of a major discovery in the middle of a scientific career being due only to changes in productivity, (iii) there is a strong correlation between the highest impact work and the average impact of a scientist's work. These findings suggest that the impact of a paper is drawn randomly from an impact distribution that is unique for each scientist. We present a model which allows us to reconstruct the individual impact distribution, making it possible to create synthetic careers that exhibit the same properties as the real data and to define a ranking based on the overall impact of a scientist. RS acknowledges support from the James McDonnell Foundation.
International Nuclear Information System (INIS)
Nuamah, N.N.N.N.
1990-12-01
The paradoxical nature of results of the mean approach in pooling cross-section and time series data has been identified to be caused by the presence in the normal equations of phenomena such as autocovariances, multicollinear covariances, drift covariances and drift multicollinear covariances. This paper considers the problem of autocorrelation and suggests ways of solving it. (author). 4 refs
2010-10-08
... Technology; HIT Standards Committee Schedule for the Assessment of HIT Policy Committee Recommendations.... SUMMARY: Section 3003(b)(3) of the American Recovery and Reinvestment Act of 2009 mandates that the HIT Standards Committee develop a schedule for the assessment of policy recommendations developed by the HIT...
77 FR 66617 - HIT Policy and Standards Committees; Workgroup Application Database
2012-11-06
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy and Standards Committees; Workgroup Application... of New ONC HIT FACA Workgroup Application Database. The Office of the National Coordinator (ONC) has.... Name of Committees: HIT Standards Committee and HIT Policy Committee. General Function of the...
Directory of Open Access Journals (Sweden)
Matthias Schmid
Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fit a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
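The classical maximum-likelihood approach that the paper contrasts with boosting can be sketched directly: parameterize the mean through a logit link, mu = logit^-1(X beta), keep a constant precision phi, and maximize the beta log-likelihood with density parameters p = mu*phi and q = (1-mu)*phi. The data and true parameter values below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

def neg_loglik(params, X, y):
    """Negative beta-regression log-likelihood: logit link for the mean,
    constant precision phi = exp(log_phi) to keep it unconstrained."""
    beta, log_phi = params[:-1], params[-1]
    mu = expit(X @ beta)
    phi = np.exp(log_phi)
    p, q = mu * phi, (1.0 - mu) * phi
    ll = (gammaln(phi) - gammaln(p) - gammaln(q)
          + (p - 1.0) * np.log(y) + (q - 1.0) * np.log(1.0 - y))
    return -ll.sum()

rng = np.random.default_rng(4)
n = 1000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
mu_true = expit(0.5 + 1.0 * x)          # assumed true intercept 0.5, slope 1.0
phi_true = 20.0
y = rng.beta(mu_true * phi_true, (1.0 - mu_true) * phi_true)  # response in (0,1)

res = minimize(neg_loglik, np.zeros(3), args=(X, y), method="BFGS")
intercept, slope, phi_hat = res.x[0], res.x[1], np.exp(res.x[2])
```

Boosted beta regression replaces this single joint fit with componentwise updates, which is what lets it do variable selection and model both mean and precision with nonlinear effects.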
Understanding logistic regression analysis.
Sperandei, Sandro
2014-01-01
Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After definition of the technique, the basic interpretation of the results is highlighted and then some special issues are discussed.
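The procedure the article explains can be sketched end to end: fit the model by Newton-Raphson (the log-likelihood is concave) and exponentiate each coefficient to read off its odds ratio. The exposure variable, covariate, and true coefficients below are hypothetical, chosen only to make the interpretation concrete.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

def logistic_regression(X, y, iters=25):
    """Fit logistic regression coefficients by Newton-Raphson (IRLS)."""
    X1 = np.column_stack([np.ones(len(y)), X])  # prepend intercept
    beta = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = expit(X1 @ beta)
        W = p * (1.0 - p)                        # IRLS weights
        grad = X1.T @ (y - p)
        hess = (X1 * W[:, None]).T @ X1
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(5)
n = 2000
smoker = rng.integers(0, 2, n)              # hypothetical binary exposure
age = rng.normal(50.0, 10.0, n)             # hypothetical covariate
logit = -3.0 + 0.9 * smoker + 0.04 * age    # assumed true coefficients
y = (rng.random(n) < expit(logit)).astype(float)

beta = logistic_regression(np.column_stack([smoker, age]), y)
odds_ratio_smoker = np.exp(beta[1])  # adjusted for age, as the article emphasizes
```

Because both variables are in the model together, the smoker odds ratio is adjusted for age, which is exactly the confounding-control advantage the abstract describes.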
Weisberg, Sanford
2013-01-01
Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute. The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus
Hosmer, David W; Sturdivant, Rodney X
2013-01-01
A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-
Lee, Michael T.; Asquith, William H.; Oden, Timothy D.
2012-01-01
In December 2005, the U.S. Geological Survey (USGS), in cooperation with the City of Houston, Texas, began collecting discrete water-quality samples for nutrients, total organic carbon, bacteria (Escherichia coli and total coliform), atrazine, and suspended sediment at two USGS streamflow-gaging stations that represent watersheds contributing to Lake Houston (08068500 Spring Creek near Spring, Tex., and 08070200 East Fork San Jacinto River near New Caney, Tex.). Data from the discrete water-quality samples collected during 2005–9, in conjunction with continuously monitored real-time data that included streamflow and other physical water-quality properties (specific conductance, pH, water temperature, turbidity, and dissolved oxygen), were used to develop regression models for the estimation of concentrations of water-quality constituents of substantial source watersheds to Lake Houston. The potential explanatory variables included discharge (streamflow), specific conductance, pH, water temperature, turbidity, dissolved oxygen, and time (to account for seasonal variations inherent in some water-quality data). The response variables (the selected constituents) at each site were nitrite plus nitrate nitrogen, total phosphorus, total organic carbon, E. coli, atrazine, and suspended sediment. The explanatory variables provide easily measured quantities to serve as potential surrogate variables to estimate concentrations of the selected constituents through statistical regression. Statistical regression also facilitates accompanying estimates of uncertainty in the form of prediction intervals. Each regression model potentially can be used to estimate concentrations of a given constituent in real time. Among other regression diagnostics, the diagnostics used as indicators of general model reliability and reported herein include the adjusted R-squared, the residual standard error, residual plots, and p-values. Adjusted R-squared values for the Spring Creek models ranged
Understanding poisson regression.
Hayat, Matthew J; Higgins, Melinda
2014-04-01
Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
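The Poisson regression described in the article can be illustrated concretely. Below is a minimal sketch (not code from the article) of a Poisson fit with a log link, written as iteratively reweighted least squares in NumPy; the data are simulated, and a production analysis would use a library such as statsmodels, which also reports standard errors and overdispersion diagnostics.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fit Poisson regression (log link) by iteratively reweighted least squares."""
    X = np.column_stack([np.ones(len(y)), X])   # prepend an intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                   # fitted means under current coefficients
        W = mu                                  # Poisson working weights: variance equals mean
        # Newton/IRLS step: beta += (X' W X)^{-1} X' (y - mu)
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - mu))
    return beta
```

If the Pearson statistic divided by the residual degrees of freedom is well above 1, the adjustments the article mentions (an overdispersion parameter or negative binomial regression) are warranted.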
Influence of Running on Pistol Shot Hit Patterns.
Kerkhoff, Wim; Bolck, Annabel; Mattijssen, Erwin J A T
2016-01-01
In shooting scene reconstructions, risk assessment of the situation can be important for the legal system. Shooting accuracy and precision, and thus risk assessment, might be correlated with the shooter's physical movement and experience. The hit patterns of inexperienced and experienced shooters, while shooting stationary (10 shots) and in running motion (10 shots) with a semi-automatic pistol, were compared visually (with confidence ellipses) and statistically. The results show a significant difference in precision (circumference of the hit patterns) between stationary shots and shots fired in motion for both inexperienced and experienced shooters. The decrease in precision for all shooters was significantly larger in the y-direction than in the x-direction. The precision of the experienced shooters is overall better than that of the inexperienced shooters. No significant change in accuracy (shift of the hit pattern center) between stationary shots and shots fired in motion was seen for any of the shooters. © 2015 American Academy of Forensic Sciences.
Analysis of impact noise induced by hitting of titanium head golf driver.
Kim, Young Ho; Kim, Young Chul; Lee, Jun Hee; An, Yong-Hwi; Park, Kyung Tae; Kang, Kyung Min; Kang, Yeon June
2014-11-01
The impact of a titanium head golf driver against a golf ball creates a short-duration, high-frequency noise. We analyzed the spectra of these impact noises and evaluated the auditory hazards from exposure to them. Noises made by 10 titanium head golf drivers, with five maximum-effort hits each, were collected, and the spectra of the pure impact sounds were studied using a noise analysis program. The noise was measured at 1.7 m (position A) and 3.4 m (position B) from the hitting point in front of the hitter and at 3.4 m (position C) behind the hitting point. Average time duration was measured and auditory risk units (ARUs) at position A were calculated using the Auditory Hazard Assessment Algorithm for Humans. The average peak levels at position A were 119.9 dBA at the sound pressure level (SPL) peak and 100.0 dBA at the overall octave level. The average peak levels (SPL and overall octave level) at position B were 111.6 and 96.5 dBA, respectively, and at position C were 111.5 and 96.7 dBA, respectively. The average time duration and ARUs measured at position A were 120.6 ms and 194.9 units, respectively. Although impact noises made by titanium head golf drivers showed relatively low ARUs, individuals who golf frequently may be susceptible to hearing loss due to repeated exposure to this intense, short-duration, high-frequency impact noise. Unprotected exposure to impact noises should be limited to prevent cochleovestibular disorders.
We are Family (Chemistry Smash Hits).
Ross, Haymo
2018-01-02
Time after time: Another new year, a new editorial and a new Editor-in-Chief. This editorial provides you with all the latest exciting news from Chemistry-A European Journal, with a review of the past year and looking forward to the one to come. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Does 'heparin-induced thrombocytopenia' hit our minds?
Directory of Open Access Journals (Sweden)
Arun R Thangavel
2016-01-01
Full Text Available Unfractionated heparin is a widely used drug to prevent deep vein thrombosis and pulmonary emboli in patients at risk. With the advent of newer anticoagulants with fewer side effects, its use has diminished but it has not fallen out of service. Here, we report a case of deep venous thrombosis, in a patient on a prophylactic dose of heparin, which was later found to be a manifestation of heparin-induced thrombocytopenia (HIT). Thrombosis in the presence of heparin prophylaxis should be considered as HIT rather than a failure of anticoagulation.
Direct determination of the hit locations from experimental HPGe pulses
Energy Technology Data Exchange (ETDEWEB)
Désesquelles, P., E-mail: Pierre.Desesquelles@in2p3.fr [Univ. Paris-Sud, CSNSM CNRS/IN2P3, 15 rue G. Clémenceau, 91405 Orsay (France); Boston, A.J.; Boston, H.C.; Cresswell, J.R.; Dimmock, M.R. [Oliver Lodge Laboratory, The University of Liverpool, Oxford Street, Liverpool L69 7ZE (United Kingdom); Lazarus, I.H. [STFC Daresbury Laboratory, Daresbury, Warrington WA4 4AD (United Kingdom); Ljungvall, J. [Univ. Paris-Sud, CSNSM CNRS/IN2P3, 15 rue G. Clémenceau, 91405 Orsay (France); Nelson, L. [Oliver Lodge Laboratory, The University of Liverpool, Oxford Street, Liverpool L69 7ZE (United Kingdom); Nga, D.-T. [Univ. Paris-Sud, CSNSM CNRS/IN2P3, 15 rue G. Clémenceau, 91405 Orsay (France); Nolan, P.J.; Rigby, S.V. [Oliver Lodge Laboratory, The University of Liverpool, Oxford Street, Liverpool L69 7ZE (United Kingdom); Simpson, J. [STFC Daresbury Laboratory, Daresbury, Warrington WA4 4AD (United Kingdom); Van-Oanh, N.-T. [Univ. Paris-Sud, LCP UMR8000 CNRS, 15 rue G. Clémenceau, 91405 Orsay (France)
2013-11-21
The gamma-tracking technique optimises the determination of the energy and emission angle of gamma-rays detected by modern segmented HPGe detectors. This entails the determination, using the delivered pulse shapes, of the interaction points of the gamma-ray within the crystal. The direct method presented here allows the localisation of the hits using only a large sample of pulses detected in the actual operating conditions. No external crystal scanning system or pulse shape simulation code is needed. In order to validate this method, it is applied to sets of pulses obtained using the University of Liverpool scanning system. The hit locations are determined by the method with good precision.
Possibility of estimating the payoff matrix from a model for hit phenomena
International Nuclear Information System (INIS)
Ishii, Akira; Sakaidani, Shota; Iwanaga, Saori
2016-01-01
The conflict of topics on social media is considered using an extended mathematical model based on the mathematical model for hit phenomena that has been used to analyze entertainment hits. The social media platform used in this study was blogs. The calculation results show examples of strong-conflict, weak-conflict, and no-conflict cases. Since the conflict of two topics can be considered in the framework of game theory, the results can be used to determine each element of the payoff matrix of game theory.
Directory of Open Access Journals (Sweden)
Mok Tik
2014-06-01
Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (the dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of the independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
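The paper's device of encoding 2-D vectors as complex numbers and solving by least squares can be sketched directly, since NumPy's least-squares routine accepts complex arrays. The data and the single complex coefficient below are invented for illustration; the paper's full model also handles multiple vector and scalar regressors and derives test statistics, which this sketch omits.

```python
import numpy as np

# Simulated example: a dependent vector variable w (as complex numbers)
# generated from one independent vector variable z by a single complex
# coefficient, i.e. a rotation plus a scaling, with small noise added.
rng = np.random.default_rng(1)
z = rng.normal(size=50) + 1j * rng.normal(size=50)
b_true = 2.0 - 0.5j
w = b_true * z + 0.01 * (rng.normal(size=50) + 1j * rng.normal(size=50))

# Complex least squares recovers the vector-valued coefficient directly,
# rather than as separate scalar coefficients per component.
b_hat, *_ = np.linalg.lstsq(z[:, None], w, rcond=None)
```

The key point of the paper survives even in this toy form: the estimated coefficient is itself a vector (a magnitude and a direction), not a scalar.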
Hagawane, T N; Gaikwad, R V; Kshirsagar, N A
2016-05-01
Despite advances in therapy and overall medical care, acute lung injury (ALI)/acute respiratory distress syndrome (ARDS) management remains a problem. Hence the objective of this study was to develop a rat model that mimics human ALI/ARDS. Four groups of Wistar rats, 48 per group, were treated with (i) intratracheal (IT) lipopolysaccharide (LPS) (5 mg/kg) dissolved in normal saline (NS), (ii) intravenous (iv) oleic acid (OA) (250 μl/kg) suspension in bovine serum albumin (BSA), (iii) dual hit: IT LPS (2 mg/kg) dissolved in NS and iv OA (100 μl/kg), and (iv) control group: IT NS and iv BSA. From each group, at set time points, various investigations such as chest x-rays, respiratory rate (RR), tidal volume (TV), total cell count, differential cell count, total protein count and cytokine levels in bronchoalveolar lavage fluid (BALF), lung wet/dry weight ratio and histopathological examination were done. It was noted that the respiratory rate and tumour necrosis factor-α (TNF-α) levels were significantly higher at 4 h in the dual hit group as compared to the LPS, OA and control groups. Interleukin-6 (IL-6) levels were significantly higher in the dual hit group as compared to the LPS (at 8 and 24 h), OA (at 8 h) and control (at all time intervals) groups. IL-1β levels were significantly higher in the LPS and dual hit groups at all time intervals, but not in the OA and control groups. The injury induced in the dual hit group was earlier and more sustained as compared to LPS and OA alone. The lung pathology and changes in respiratory function produced by the dual hit model were closer to the diagnostic criteria of ALI/ARDS in terms of clinical manifestations and pulmonary injury, and the injury persisted longer as compared to the LPS and OA single hit models. Therefore, the ARDS model produced by the dual hit method more closely reproduces the diagnostic picture of ARDS.
Behkami, Nima A; Dorr, David A; Morrice, Stuart
2010-01-01
The goal of this study is to describe a framework that allows decision makers to efficiently evaluate factors that affect Electronic Health Record (EHR) adoption and test suitable interventions, specifically financial incentives. The United States healthcare delivery system is experiencing a transformation to improve population health. There is strong agreement that "meaningful use" of Health Information Technology (HIT) is a major enabler in this effort. However, it is also understood that the high cost of implementing an EHR is an obstacle to adoption. To help understand these complexities, we developed a simulation model designed to capture the dynamic nature of policy interventions that affect the adoption of EHR. We found that "effective" use of HIT reaches the break-even point and larger clinic revenue many times faster than "average" or "poor" use of HIT. This study takes a systems perspective to evaluate the EHR adoption process under the "meaningful use" redesign proposed in the American Recovery and Reinvestment Act of 2009 in the United States healthcare industry, utilizing the System Dynamics methodology and scenario analysis.
Multicollinearity and Regression Analysis
Daoud, Jamal I.
2017-12-01
In regression analysis it is expected that there is correlation between the response and the predictor(s), but correlation among the predictors themselves is undesirable. The number of predictors included in the regression model depends on many factors, among them historical data and experience. In the end, the selection of the most important predictors is a judgment left to the researcher. Multicollinearity is a phenomenon in which two or more predictors are correlated; when this happens, the standard errors of the coefficients increase [8]. Increased standard errors mean that the coefficients for some or all independent variables may be found not to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes, and its consequences for the reliability of the regression model.
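A standard way to quantify the multicollinearity discussed here is the variance inflation factor: regress each predictor on all the others and report VIF_j = 1/(1 − R_j²). The self-contained NumPy version below is illustrative, not code from the paper.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the predictor matrix X."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])   # intercept + other predictors
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()                 # R^2 of predictor j on the rest
        out.append(1.0 / (1.0 - r2))
    return np.array(out)
```

A common rule of thumb flags VIF values above 5 or 10 as evidence of the inflated standard errors the paper describes.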
Powered two-wheeler drivers' risk of hitting a pedestrian in towns.
Clabaux, Nicolas; Fournier, Jean-Yves; Michel, Jean-Emmanuel
2014-12-01
The risk of collision between pedestrians and powered two-wheelers is poorly understood today. The objective of this research is to determine the risk for powered two-wheeler drivers of hitting and injuring a pedestrian per kilometer driven in towns and to compare this risk with that run by four-wheeled vehicle drivers. Using the bodily injury accidents recorded by the police on nine roads in the city of Marseille in 2011 and a campaign of observations of powered two-wheeler traffic, we estimated the risk per kilometer driven by powered two-wheeler drivers of hitting a pedestrian and compared it with the risk run by four-wheeled vehicle drivers. The results show that the risk for powered two-wheeler drivers of hitting and injuring a pedestrian is significantly higher than the risk run by four-wheeled vehicle drivers. On the nine roads studied, it is on average 3.33 times higher (95% CI: 1.63; 6.78). Taking four more years into account made it possible to consolidate these results and to tighten the confidence interval. There does indeed seem to be problems in the interactions between pedestrians and powered two-wheeler users in urban traffic. These interaction problems lead to a higher risk of hitting and injuring a pedestrian for powered two-wheeler drivers than for four-wheeled vehicle drivers. The analysis of the police reports suggests that part of this increased risk comes from filtering maneuvers by powered two-wheelers. Possible countermeasures deal with the urban street layout. Measures consisting in reducing the width and the number of traffic lanes to a strict minimum and installing medians or pedestrian islands could be an effective way for the prevention of urban accidents between pedestrians and powered two-wheelers. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.
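The paper's headline figure, a risk 3.33 times higher with a 95% CI of (1.63; 6.78), is the kind of rate ratio that can be computed with the standard log-normal approximation for the ratio of two Poisson rates. The counts below are invented for illustration; the paper's actual accident and traffic-exposure counts are not reproduced here.

```python
import math

def rate_ratio_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Rate ratio of two Poisson rates with an approximate 95% CI.

    Uses the usual log-normal approximation: the standard error of
    log(RR) is sqrt(1/events_a + 1/events_b).
    """
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical counts: 10 pedestrian collisions per 100 exposure units of
# two-wheeler travel vs. 30 per 1000 units of four-wheeler travel.
rr, lo, hi = rate_ratio_ci(10, 100, 30, 1000)
```

The width of the interval is driven almost entirely by the smaller event count, which is why the paper notes that adding four more years of data tightened the confidence interval.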
DEFF Research Database (Denmark)
Bache, Stefan Holst
A new and alternative quantile regression estimator is developed and it is shown that the estimator is root-n consistent and asymptotically normal. The estimator is based on a minimax 'deviance function' and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice, but whether it has theoretical justification is still an open question.
Zhu, Tian; Cao, Shuyi; Su, Pin-Chih; Patel, Ram; Shah, Darshan; Chokshi, Heta B.; Szukala, Richard; Johnson, Michael E.; Hevener, Kirk E.
2013-01-01
A critical analysis of virtual screening results published between 2007 and 2011 was performed. The activity of reported hit compounds from over 400 studies was compared to their hit identification criteria. Hit rates and ligand efficiencies were calculated to assist in these analyses and the results were compared with factors such as the size of the virtual library and the number of compounds tested. A series of promiscuity, drug-like, and ADMET filters were applied to the reported hits to assess the quality of compounds reported and a careful analysis of a subset of the studies which presented hit optimization was performed. This data allowed us to make several practical recommendations with respect to selection of compounds for experimental testing, defining hit identification criteria, and general virtual screening hit criteria to allow for realistic hit optimization. A key recommendation is the use of size-targeted ligand efficiency values as hit identification criteria. PMID:23688234
Network analysis of metabolite GWAS hits
DEFF Research Database (Denmark)
Matone, Alice; Scott-Boyer, Marie-Pier; Carayol, Jerome
2016-01-01
BACKGROUND AND SCOPE: Weight loss success is dependent on the ability to refrain from regaining the lost weight in time. This feature was shown to be largely variable among individuals, and these differences, with their underlying molecular processes, are diverse and not completely elucidated. Al...
MIDAS and HIT-6 French translation: reliability and correlation between tests.
Magnoux, E; Freeman, M A; Zlotnik, G
2008-01-01
The aim was to evaluate the test-retest reliability of the French translation of the Migraine Disability Assessment (MIDAS) and Headache Impact Test (HIT)-6 questionnaires as applied to episodic and chronic headaches and to assess the correlation between these two questionnaires. The MIDAS and HIT-6 questionnaires, which assess the degree of migraine-related functional disability, are widely used in headache treatment clinics. The French translation has not been checked for test-retest reliability. MIDAS involves recall, over the previous 3 months, of the number of days with functional disability with regard to work and to home and social life. HIT-6 involves a more subjective and general assessment of headache-related disability over the previous 4 weeks. We expect that there may be greater impact recall bias for chronic headaches than for episodic headaches and considered it important to be able to determine if the reliability of these questionnaires is equally good for these two patient populations. Given that both questionnaires have the same objective, that of assessing headache impact, it was thought useful to determine if their results might show a correlation and if they could thus be used interchangeably. The study was approved by an external ethics committee. The subjects were patients who regularly visit the Clinique de la Migraine de Montréal, which specializes in the treatment of headaches. The MIDAS and HIT-6 questionnaires were completed by the patients during their regular visit. Twelve days later, the same questionnaires were mailed with a prepaid return envelope. Sixty-five patients were required in both the episodic and chronic headache groups, assuming an 80% questionnaire return rate. One hundred and eighty-five patients were enrolled, and 143 completed the study, 75 with episodic headaches and 68 with chronic headaches. The questionnaire return rate was 78.9%. On average, questionnaires were completed a second time 21 days after the first
Do repeated rumble strip hits improve driver alertness?
Watling, C.N.; Akerstedt, T.; Kecklund, L.G.; Anund, A.
2016-01-01
Driving while sleepy is associated with increased crash risk. Rumble strips are designed to alert a sleepy or inattentive driver when they deviate outside their driving lane. The current study sought to examine the effects of repeated rumble strip hits on levels of physiological and subjective
Madoff Debacle Hits Colleges and Raises Questions about Trustee Conflicts
Fain, Paul
2009-01-01
Several colleges and universities lost millions in the alleged $50-billion Ponzi scheme run by the Wall Street trader Bernard L. Madoff. The losses include institutions' endowment holdings in hedge funds that were invested with Madoff as well as hits taken by supporting foundations and donors. Several foundations that have been active in higher…
The probability of a tornado missile hitting a target
International Nuclear Information System (INIS)
Goodman, J.; Koch, J.E.
1983-01-01
It is shown that tornado missile transportation is a diffusion Markovian process. Therefore, the Green's function method is applied for the estimation of the probability of hitting a unit target area. This probability is expressed through a joint density of tornado intensity and path area, a probability of tornado missile injection, and a tornado missile height distribution. (orig.)
COPD: A stepwise or a hit hard approach?
Directory of Open Access Journals (Sweden)
A.J. Ferreira
2016-07-01
Full Text Available Current guidelines differ slightly on the recommendations for treatment of Chronic Obstructive Pulmonary Disease (COPD) patients, and although there are some undisputed recommendations, there is still debate regarding the management of COPD. One of the hindrances to deciding which therapeutic approach to choose is late diagnosis or misdiagnosis of COPD. After a proper diagnosis is achieved and severity assessed, the choice between a stepwise or 'hit hard' approach has to be made. For GOLD A patients the stepwise approach is recommended, whilst for B, C and D patients this remains debatable. Moreover, in patients for whom inhaled corticosteroids (ICS) are recommended, a step-up or 'hit hard' approach with triple therapy will depend on the patient's characteristics and, for patients who are being over-treated with ICS, ICS withdrawal should be performed, in order to optimize therapy and reduce excessive medication. This paper discusses and proposes stepwise, 'hit hard', step-up and ICS withdrawal therapeutic approaches for COPD patients based on their GOLD group. We conclude that all approaches have benefits, and only careful patient selection will determine which approach is better, and which patients will benefit the most from each approach. Keywords: COPD, Stepwise, Hit hard, Step-up, ICS withdrawal, Bronchodilators, ICS
Biophysics: for HTS hit validation, chemical lead optimization, and beyond.
Genick, Christine C; Wright, S Kirk
2017-09-01
There are many challenges to the drug discovery process, including the complexity of the target, its interactions, and how these factors play a role in causing the disease. Traditionally, biophysics has been used for hit validation and chemical lead optimization. With its increased throughput and sensitivity, biophysics is now being applied earlier in this process to empower target characterization and hit finding. Areas covered: In this article, the authors provide an overview of how biophysics can be utilized to assess the quality of the reagents used in screening assays, to validate potential tool compounds, to test the integrity of screening assays, and to create follow-up strategies for compound characterization. They also briefly discuss the utilization of different biophysical methods in hit validation to help avoid the resource consuming pitfalls caused by the lack of hit overlap between biophysical methods. Expert opinion: The use of biophysics early on in the drug discovery process has proven crucial to identifying and characterizing targets of complex nature. It also has enabled the identification and classification of small molecules which interact in an allosteric or covalent manner with the target. By applying biophysics in this manner and at the early stages of this process, the chances of finding chemical leads with novel mechanisms of action are increased. In the future, focused screens with biophysics as a primary readout will become increasingly common.
First hitting probabilities for semi markov chains and estimation
DEFF Research Database (Denmark)
Georgiadis, Stylianos
2017-01-01
We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain...
Assessing the lipophilicity of fragments and early hits
Mortenson, Paul N.; Murray, Christopher W.
2011-07-01
A key challenge in many drug discovery programs is to accurately assess the potential value of screening hits. This is particularly true in fragment-based drug design (FBDD), where the hits often bind relatively weakly, but are correspondingly small. Ligand efficiency (LE) considers both the potency and the size of the molecule, and enables us to estimate whether or not an initial hit is likely to be optimisable to a potent, druglike lead. While size is a key property that needs to be controlled in a small molecule drug, there are a number of additional properties that should also be considered. Lipophilicity is amongst the most important of these additional properties, and here we present a new efficiency index (LLEAT) that combines lipophilicity, size and potency. The index is intuitively defined, and has been designed to have the same target value and dynamic range as LE, making it easily interpretable by medicinal chemists. Monitoring both LE and LLEAT should help both in the selection of more promising fragment hits, and controlling molecular weight and lipophilicity during optimisation.
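Ligand efficiency as used here is commonly computed as binding free energy per heavy atom, with ΔG ≈ 2.303·RT·pIC50 ≈ 1.37·pIC50 kcal/mol at ~298 K, while lipophilic ligand efficiency subtracts a calculated logP from the potency. The helpers below illustrate these two standard metrics only; the exact definition of the LLEAT index introduced in this paper is not reproduced here.

```python
def ligand_efficiency(p_activity, heavy_atoms):
    """LE in kcal/mol per heavy atom; 1.37 ~= 2.303*R*T at ~298 K."""
    return 1.37 * p_activity / heavy_atoms

def lipophilic_ligand_efficiency(p_activity, clogp):
    """LLE: potency corrected for lipophilicity (pIC50 - clogP)."""
    return p_activity - clogp

# A fragment hit with pIC50 = 4.0 and 12 heavy atoms has LE ~ 0.457,
# above the ~0.3 kcal/mol/atom rule of thumb despite its weak potency.
```

This is why small, weakly potent fragments can still be attractive starting points: efficiency metrics normalize potency by size and lipophilicity rather than rewarding raw affinity.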
Jiang, Weiping; Ma, Jun; Li, Zhao; Zhou, Xiaohui; Zhou, Boye
2018-05-01
Analysis of the correlations between the noise in different components of GPS stations helps obtain more accurate uncertainties for station velocities. Previous research into noise in GPS position time series focused mainly on single-component evaluation, which affects the acquisition of precise station positions, the velocity field, and its uncertainty. In this study, before and after removing the common-mode error (CME), we performed one-dimensional linear regression analysis of the noise amplitude vectors in different components of 126 GPS stations in Southern California, with a combination of white noise, flicker noise, and random walk noise. The results show that, on the one hand, there are above-moderate correlations between the white noise amplitude vectors in all components of the stations before and after removal of the CME, while the correlations between flicker noise amplitude vectors in horizontal and vertical components are enhanced from uncorrelated to moderately correlated by removing the CME. On the other hand, the significance tests show that all of the obtained linear regression equations, which represent a unique function of the noise amplitude in any two components, are of practical value after removing the CME. According to the noise amplitude estimates in two components and the linear regression equations, more accurate noise amplitudes can be acquired for the two components.
Multiple linear regression analysis
Edwards, T. R.
1980-01-01
Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
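The stepwise procedure this program description refers to can be sketched as greedy forward selection: at each step, add the candidate predictor that most reduces the residual sum of squares. This NumPy sketch (not the original FORTRAN IV program) omits the confidence-level stopping rule, which would compare each addition against an F-statistic threshold.

```python
import numpy as np

def _sse(cols, X, y):
    """Residual sum of squares for an OLS fit on the given columns plus an intercept."""
    A = np.column_stack([np.ones(len(y))] + [X[:, j] for j in cols])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return r @ r

def forward_select(X, y, n_terms):
    """Greedy forward stepwise selection: repeatedly add the predictor
    that most reduces the residual sum of squares."""
    chosen, remaining = [], list(range(X.shape[1]))
    for _ in range(n_terms):
        best = min(remaining, key=lambda j: _sse(chosen + [j], X, y))
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

As in the original program, the final model contains only the most influential predictors; a full implementation would also drop previously added terms that lose significance (backward steps).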
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
Seber, George A F
2012-01-01
Concise, mathematically clear, and comprehensive treatment of the subject. Expanded coverage of diagnostics and methods of model fitting. Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models. More than 200 problems throughout the book, plus outline solutions for the exercises. This revision has been extensively class-tested.
Ritz, Christian; Parmigiani, Giovanni
2009-01-01
R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.
Bayesian ARTMAP for regression.
Sasu, L M; Andonie, R
2013-10-01
Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single-epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.
Bounded Gaussian process regression
DEFF Research Database (Denmark)
Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan
2013-01-01
We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We...... with the proposed explicit noise-model extension....
and Multinomial Logistic Regression
African Journals Online (AJOL)
This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).
Mechanisms of neuroblastoma regression
Brodeur, Garrett M.; Bagatell, Rochelle
2014-01-01
Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179
Nedley Depression Hit Hypothesis: Identifying Depression and Its Causes.
Nedley, Neil; Ramirez, Francisco E
2016-11-01
Depression is often diagnosed using the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria. We propose how certain lifestyle choices and non-modifiable factors can predict the development of depression. We identified 10 cause categories (hits, or "blows" to the brain) and theorize that four or more active hits could trigger a depression episode. Methods. A sample of 4271 participants from our community-based program (70% female; ages 17-94 years) was assessed at baseline and at the eighth week of the program using a custom test. The ten cause categories examined as predictors of depression are (1) Genetic, (2) Developmental, (3) Lifestyle, (4) Circadian Rhythm, (5) Addiction, (6) Nutrition, (7) Toxic, (8) Social/Complicated Grief, (9) Medical Condition, and (10) Frontal Lobe. Results. The relationship between the DSM-5 score and a person having four hit categories in the first program week showed a sensitivity of 89.98% (95% CI 89.20-90.73%), specificity of 48.84% (CI 45.94-51.75%), and Matthews Correlation Coefficient (MCC) of 0.41. For the eighth-week test, the results showed a sensitivity of 83.6% (CI 81.9-85.5%), specificity of 53.7% (CI 51.7-55.6%), and MCC of 0.38. Overall, the hits that improved the most from baseline after the eighth week were: Nutrition (47%), Frontal lobe (36%), Addiction (24%), Circadian rhythm (24%), Lifestyle (20%), Social (12%), and Medical (10%). Conclusions. The Nedley four-hit hypothesis seems to predict a depressive episode and correlates well with the DSM-5 criteria, with good sensitivity and MCC but lower specificity. Identifying these factors and applying lifestyle therapies could play an important role in the treatment of depressed individuals.
Ridge Regression Signal Processing
Kuhl, Mark R.
1990-01-01
The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
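The ridge idea the abstract relies on can be shown in a few lines. This is a minimal closed-form sketch of ridge regression on a deliberately near-collinear design (mimicking "poor geometry"), not the recursive ridge estimator the paper derives; the data values are invented for illustration.

```python
# Ridge regression on a tiny, nearly collinear design: beta = (X'X + k*I)^-1 X'y.
# Illustrative sketch only; the paper derives a *recursive* ridge estimator,
# which is not reproduced here.

def ridge(X, y, k):
    """Closed-form ridge estimate for a 2-column design matrix."""
    # Build X'X + k*I and X'y explicitly (2x2 case) and solve by Cramer's rule.
    a = sum(r[0] * r[0] for r in X) + k
    b = sum(r[0] * r[1] for r in X)
    d = sum(r[1] * r[1] for r in X) + k
    u = sum(r[0] * t for r, t in zip(X, y))
    v = sum(r[1] * t for r, t in zip(X, y))
    det = a * d - b * b
    return ((d * u - b * v) / det, (a * v - b * u) / det)

# Nearly collinear columns mimic "poor geometry": the unpenalized (k=0)
# solution sits on a tiny determinant, while a small ridge penalty shrinks
# and stabilizes the estimate.
X = [(1.0, 1.01), (1.0, 0.99), (1.0, 1.00)]
y = [2.01, 1.99, 2.00]
b_ols = ridge(X, y, 0.0)
b_ridge = ridge(X, y, 0.1)
```

With k = 0 the estimate is (1, 1); with k = 0.1 both coefficients shrink slightly toward zero, which is the stabilizing behavior exploited for the EKF analysis.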
Subset selection in regression
Miller, Alan
2002-01-01
Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the Second Edition: a separate chapter on Bayesian methods; complete revision of the chapter on estimation; a major example from the field of near infrared spectroscopy; more emphasis on cross-validation; greater focus on bootstrapping; stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible; software available on the Internet for implementing many of the algorithms presented; and more examples. Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...
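The core task the book treats can be sketched directly. Below is a brute-force best-subset search (fit ordinary least squares on every subset of predictors, keep the one with smallest residual sum of squares); real implementations use branch-and-bound or the stochastic algorithms the second edition discusses. The data are invented so that only columns 0 and 2 carry signal.

```python
# Exhaustive best-subset selection: fit OLS on every subset of predictors
# and keep the subset minimizing residual sum of squares for its size.
from itertools import combinations

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[c][c]:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def rss_for(cols, X, y):
    """OLS (with intercept) on the chosen columns; return residual SS."""
    Z = [[1.0] + [row[c] for c in cols] for row in X]
    k = len(Z[0])
    XtX = [[sum(a[i] * a[j] for a in Z) for j in range(k)] for i in range(k)]
    Xty = [sum(a[i] * t for a, t in zip(Z, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((t - sum(b * z for b, z in zip(beta, a))) ** 2
               for a, t in zip(Z, y))

# y depends on columns 0 and 2 only; column 1 is noise.
X = [[1, 5, 2], [2, 3, 1], [3, 8, 4], [4, 1, 3], [5, 9, 5], [6, 2, 1]]
y = [2 * r[0] + 3 * r[2] for r in X]
best = min(combinations(range(3), 2), key=lambda c: rss_for(c, X, y))
```

The search correctly recovers the subset (0, 2); the combinatorial explosion of this loop for many predictors is exactly why the book's smarter algorithms matter.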
Better Autologistic Regression
Directory of Open Access Journals (Sweden)
Mark A. Wolters
2017-11-01
Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding—the two numbers used to represent the two possible states of the variables—might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
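The coding issue is easy to see numerically. The sketch below plugs the same parameters (alpha, lam) into a two-node autologistic model under (0, 1) coding and under (-1, +1) coding and shows the resulting joint distributions differ; this is only a toy illustration of non-interchangeability, not the paper's full non-nestedness proof, and the parameter values are arbitrary.

```python
# Same (alpha, lam) plugged into the autologistic model with (0,1) coding
# versus (-1,+1) coding yields different joint distributions over a
# two-node graph, so the codings are not interchangeable.
from math import exp
from itertools import product

def joint(coding, alpha, lam):
    """P(z1, z2) proportional to exp(alpha*(z1+z2) + lam*z1*z2)."""
    weights = {z: exp(alpha * (z[0] + z[1]) + lam * z[0] * z[1])
               for z in product(coding, repeat=2)}
    total = sum(weights.values())
    return {z: w / total for z, w in weights.items()}

p01 = joint((0, 1), alpha=0.5, lam=1.0)
pm1 = joint((-1, 1), alpha=0.5, lam=1.0)
# Probability that both nodes take the "high" state under each coding:
high01 = p01[(1, 1)]
highm1 = pm1[(1, 1)]
```

Here high01 is about 0.63 while highm1 is about 0.81, a large, practically relevant gap of the kind the paper quantifies.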
Regression in organizational leadership.
Kernberg, O F
1979-02-01
The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.
Classification and regression trees
Breiman, Leo; Olshen, Richard A; Stone, Charles J
1984-01-01
The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
Double hit of NEMO gene in preeclampsia.
Directory of Open Access Journals (Sweden)
Agata Sakowicz
The precise etiology of preeclampsia is unknown. Family studies indicate that both genetic and environmental factors influence its development. One of these factors is NFkB, whose activation depends on NEMO (NFkB essential modulator). This is the first study to investigate the association between the existence of single nucleotide variants of the NEMO gene and the appearance of preeclampsia. A total of 151 women (72 preeclamptic women and 79 controls) and their children were examined. Sanger sequencing was performed to identify variants in the NEMO gene in the preeclamptic mothers. The maternal identified variants were then sought in the studied groups of children, and in the maternal and child controls, using RFLP-PCR. Real-time RT-PCR was performed to assess NEMO gene expression in maternal blood, umbilical cord blood, and placentas. The sequencing process indicated the existence of two different variants in the 3'UTR region of the NEMO gene of preeclamptic women (IKBKG:c.*368C>A and IKBKG:c.*402C>T). The simultaneous occurrence of the TT genotype in the mother and the TT genotype in the daughter or a T allele in the son increased the risk of preeclampsia development 2.59-fold. Additionally, we found that the configuration of maternal/fetal genotypes (maternal TT/daughter TT or maternal TT/son T) of the IKBKG:c.*402C>T variant is associated with the level of NEMO gene expression. Our results showed that the simultaneous occurrence of the maternal TT genotype (IKBKG:c.*402C>T variant) and the TT genotype in the daughter or a T allele in the son correlates with the level of NEMO gene expression and increases the risk of preeclampsia development. Our observations may offer a new insight into the genetic etiology and pathogenesis of preeclampsia.
Kang, Hong; Wang, Frank; Zhou, Sicheng; Miao, Qi; Gong, Yang
2017-01-01
Health information technology (HIT) events, a subtype of patient safety events, pose a major threat and barrier toward a safer healthcare system. It is crucial to gain a better understanding of the nature of the errors and adverse events caused by current HIT systems. The scarcity of HIT event-exclusive databases and event reporting systems indicates the challenge of identifying the HIT events from existing resources. FDA Manufacturer and User Facility Device Experience (MAUDE) database is a potential resource for HIT events. However, the low proportion and the rapid evolvement of HIT-related events present challenges for distinguishing them from other equipment failures and hazards. We proposed a strategy to identify and synchronize HIT events from MAUDE by using a filter based on structured features and classifiers based on unstructured features. The strategy will help us develop and grow an HIT event-exclusive database, keeping pace with updates to MAUDE toward shared learning.
Contribution of Visual Information about Ball Trajectory to Baseball Hitting Accuracy.
Directory of Open Access Journals (Sweden)
Takatoshi Higuchi
The contribution of visual information about a pitched ball to the accuracy of baseball-bat contact may vary depending on the part of the trajectory seen. The purpose of the present study was to examine the relationship between hitting accuracy and the segment of the trajectory of the flying ball that can be seen by the batter. Ten college baseball field players participated in the study. The systematic error and standardized variability of ball-bat contact on the bat coordinate system and pitcher-to-catcher direction when hitting a ball launched from a pitching machine were measured with or without visual occlusion and analyzed using analysis of variance. The visual occlusion timings included occlusion from 150 milliseconds (ms) after the ball release (R+150), occlusion from 150 ms before the expected arrival of the launched ball at the home plate (A-150), and a condition with no occlusion (NO). Twelve trials in each condition were performed using two ball speeds (31.9 m·s-1 and 40.3 m·s-1). Visual occlusion did not affect the mean location of ball-bat contact in the bat's long axis, short axis, and pitcher-to-catcher directions. Although the magnitude of standardized variability was significantly smaller in the bat's short axis direction than in the bat's long axis and pitcher-to-catcher directions (p < 0.001), additional visible time from the R+150 condition to the A-150 and NO conditions resulted in a further decrease in standardized variability only in the bat's short axis direction (p < 0.05). The results suggested that there is directional specificity in the magnitude of standardized variability with different visible times. The present study also confirmed a limitation in using visual information from the late part of the ball trajectory to improve hitting accuracy, which is likely due to visuo-motor delay.
Steganalysis using logistic regression
Lubenko, Ivans; Ker, Andrew D.
2011-02-01
We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing accuracy and speed of SVM and LR classifiers in detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, in three image sets.
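The probability-output property the abstract emphasizes is visible even in a toy fit. This is a generic 1-D logistic regression trained by stochastic gradient ascent, not the kernelised, 686-dimensional SPAM-feature setup of the paper; the data are invented.

```python
# Logistic regression by per-sample gradient ascent on a toy 1-D problem.
# Unlike a plain SVM decision, the model outputs class probabilities.
from math import exp

def sigmoid(t):
    return 1.0 / (1.0 + exp(-t))

def fit_lr(xs, ys, lr=0.1, epochs=2000):
    """Return (bias, weight) increasing the log-likelihood by gradient ascent."""
    b = w = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(b + w * x)
            b += lr * (y - p)      # gradient of log-likelihood wrt bias
            w += lr * (y - p) * x  # gradient of log-likelihood wrt weight
    return b, w

# Class 1 (say, "stego") tends to have larger feature values than class 0.
xs = [0.2, 0.5, 0.8, 1.1, 2.0, 2.3, 2.6, 3.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
b, w = fit_lr(xs, ys)
p_low = sigmoid(b + w * 0.3)   # probability of class 1 for a small feature
p_high = sigmoid(b + w * 2.8)  # probability of class 1 for a large feature
```

The fitted model assigns a low class-1 probability to small feature values and a high one to large values, and those calibrated probabilities are exactly the extra information LR provides over a bare SVM label.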
SEPARATION PHENOMENA LOGISTIC REGRESSION
Directory of Open Access Journals (Sweden)
Ikaro Daniel de Carvalho Barreto
2014-03-01
This paper applies concepts from maximum likelihood estimation of the binomial logistic regression model to the separation phenomenon. Separation generates bias in the estimation, leads to different interpretations of the estimates under the different statistical tests (Wald, Likelihood Ratio, and Score), and yields different estimates under the different iterative methods (Newton-Raphson and Fisher Scoring). We also present an example that demonstrates the direct implications for the validation of the model and of its variables, and for the estimates of odds ratios and confidence intervals generated from the Wald statistics. Furthermore, we briefly present the Firth correction, which circumvents the separation phenomenon.
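Separation itself is easy to reproduce. In the sketch below every negative feature value belongs to class 0 and every positive one to class 1, so the logistic-regression MLE does not exist and gradient ascent simply keeps inflating the coefficient instead of converging; this divergence is the estimation problem the Firth correction addresses. Data and step sizes are invented for illustration.

```python
# Complete separation: the logistic-regression weight diverges rather than
# converging to a finite maximum-likelihood estimate.
from math import exp

def fit_weight(xs, ys, lr=0.5, epochs=100):
    """Intercept-free logistic regression by full-batch gradient ascent."""
    w = 0.0
    for _ in range(epochs):
        grad = sum((y - 1.0 / (1.0 + exp(-w * x))) * x
                   for x, y in zip(xs, ys))
        w += lr * grad
    return w

# Perfectly separated data: x < 0 is class 0, x > 0 is class 1.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w_short = fit_weight(xs, ys, epochs=100)
w_long = fit_weight(xs, ys, epochs=10000)
# The estimate keeps growing with more iterations instead of converging.
```

Running the optimizer 100x longer produces a markedly larger coefficient, and the Wald standard errors blow up along with it, which is why Wald, likelihood-ratio, and score inferences disagree under separation.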
Adaptive metric kernel regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
2000-01-01
Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
Adaptive Metric Kernel Regression
DEFF Research Database (Denmark)
Goutte, Cyril; Larsen, Jan
1998-01-01
Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
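The idea in the two Adaptive Metric Kernel Regression abstracts above can be sketched with a toy version: Nadaraya-Watson regression with a per-dimension metric weight, where the weights are chosen by minimizing a leave-one-out cross-validation error. The papers adapt the metric by gradient-based minimization; this sketch just searches a tiny grid, and the data (target depending on dimension 0 only) are invented.

```python
# Nadaraya-Watson kernel regression with a per-dimension metric chosen by
# leave-one-out cross-validation over a small grid.
from math import exp

def nw_predict(q, X, y, m):
    """Kernel-weighted mean of y at query q under metric weights m."""
    ws = [exp(-sum(mi * (a - b) ** 2 for mi, a, b in zip(m, x, q)))
          for x in X]
    return sum(w * t for w, t in zip(ws, y)) / sum(ws)

def loo_error(X, y, m):
    """Leave-one-out squared prediction error for metric m."""
    err = 0.0
    for i in range(len(X)):
        Xi = X[:i] + X[i + 1:]
        yi = y[:i] + y[i + 1:]
        err += (y[i] - nw_predict(X[i], Xi, yi, m)) ** 2
    return err

# Target depends on dimension 0 only; dimension 1 is an irrelevant input.
X = [(0.0, 3.0), (0.5, 0.5), (1.0, 2.5), (1.5, 0.0), (2.0, 2.0), (2.5, 1.0)]
y = [x[0] ** 2 for x in X]
grid = [0.1, 1.0, 10.0]
best = min(((a, b) for a in grid for b in grid),
           key=lambda m: loo_error(X, y, m))
```

The selected metric puts more weight on the relevant dimension than on the irrelevant one, which is precisely the automatic adjustment of dimension importance described in the abstracts.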
DEPDC5 takes a second hit in familial focal epilepsy.
Anderson, Matthew P
2018-04-30
Loss-of-function mutations in a single allele of the gene encoding DEP domain-containing 5 protein (DEPDC5) are commonly linked to familial focal epilepsy with variable foci; however, a subset of patients presents with focal cortical dysplasia that is proposed to result from a second-hit somatic mutation. In this issue of the JCI, Ribierre and colleagues provide several lines of evidence to support second-hit DEPDC5 mutations in this disorder. Moreover, the authors use in vivo, in utero electroporation combined with CRISPR-Cas9 technology to generate a murine model of the disease that recapitulates human manifestations, including cortical dysplasia-like changes, focal seizures, and sudden unexpected death. This study provides important insights into familial focal epilepsy and provides a preclinical model for evaluating potential therapies.
Hit size effectiveness in relation to the microdosimetric site size
International Nuclear Information System (INIS)
Varma, M.N.; Wuu, C.S.; Zaider, M.
1994-01-01
This paper examines the effect of site size (that is, the diameter of the microdosimetric volume) on the hit size effectiveness function (HSEF), q(y), for several endpoints relevant in radiation protection. A Bayesian and maximum entropy approach is used to solve the integral equations that determine, given microdosimetric spectra and measured initial slopes, the function q(y). All microdosimetric spectra have been calculated de novo. The somewhat surprising conclusion of this analysis is that site size plays only a minor role in selecting the hit size effectiveness function q(y). It thus appears that practical means (e.g. conventional proportional counters) are already at hand to actually implement the HSEF as a radiation protection tool. (Author)
Tolosa-Hunt Syndrome in Double-Hit Lymphoma
Directory of Open Access Journals (Sweden)
Prakash Peddi
2015-01-01
Tolosa-Hunt syndrome (THS) is a painful condition characterized by hemicranial pain, retroorbital pain, loss of vision, oculomotor nerve paralysis, and sensory loss in the distribution of the ophthalmic and maxillary divisions of the trigeminal nerve. Lymphomas rarely involve the cavernous sinus and simulate Tolosa-Hunt syndrome. Here we present the first case of double-hit B cell lymphoma (DHL) relapsing and masquerading as Tolosa-Hunt syndrome. The neurological findings were explained by a lymphomatous infiltration of the right Gasserian ganglion which preceded systemic relapse. As part of this report, the diagnostic criteria for Tolosa-Hunt syndrome and double-hit lymphoma are reviewed and updated treatment recommendations are presented.
Heparin-induced thrombocytopenia (HIT): a case report of a CABG patient
Directory of Open Access Journals (Sweden)
Alireza Jahangirifard
2016-08-01
Heparin-induced thrombocytopenia (HIT) is an antibody-mediated adverse effect of heparin therapy which is classified into two subtypes: HIT I, a non-immune, spontaneously reversible thrombocytopenia; and HIT II, an autoimmune-mediated adverse effect of heparin therapy. In this case report, we describe a 65-year-old male patient with HIT II after coronary artery bypass grafting. Key words: heparin-induced thrombocytopenia, heparin-induced thrombosis, coronary artery bypass grafting.
75 FR 5595 - HIT Standards Committee Advisory Meeting; Notice of Meeting
2010-02-03
... Technology HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National... Health Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT... Federal Health IT Strategic Plan, and in accordance with policies developed by the HIT Policy Committee...
76 FR 28784 - HIT Standards Committee's Workgroup Meetings; Notice of Meetings
2011-05-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Standards Committee's... implementation of the Federal Health IT Strategic Plan, and in accordance with policies developed by the HIT...
76 FR 50736 - HIT Standards Committee's Workgroup Meetings; Notice of Meetings
2011-08-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Standards Committee's... implementation of the Federal Health IT Strategic Plan, and in accordance with policies developed by the HIT...
76 FR 14975 - HIT Standards Committee's Workgroup Meetings; Notice of Meetings
2011-03-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Standards Committee's Workgroup Meetings; Notice of... be open to the public via dial-in access only. Name of Committees: HIT Standards Committee's... implementation of the Federal Health IT Strategic Plan, and in accordance with policies developed by the HIT...
Development of pulsation technique for single ion hit system
Energy Technology Data Exchange (ETDEWEB)
Sakai, Takuro; Hamano, Tsuyoshi; Hirao, Toshio; Kamiya, Tomihiro [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment
1996-12-01
When a high-energy heavy ion enters a material, dense ionization and excitation occur along its track. In particular, when such an ion enters a semiconductor cell, a bit inversion called a single event can occur or, in the worst case, the element itself can be destroyed. Present semiconductor cells measure only a few microns square, depending on their degree of integration. To analyze in detail the single-event phenomena produced by an ion entering such a fine region, a technique is needed that can deliver a heavy-ion beam to each sample with a spatial resolution below 1 micron. To develop this technique, an electrostatic high-speed beam switch for controlling beam entry into a sample and a single-ion detector for registering the arrival of an ion at the sample were installed on the heavy-ion microbeam forming apparatus. The single ion hit system at the Takasaki Radiation Chemistry Research Establishment, JAERI, has so far succeeded in detecting and controlling single ions and in suppressing noise associated with pulsation, completing the development of the basic single ion hit technique. The next step is to actually hit samples with single ions and evaluate the accuracy of the system. (G.K.)
Effective progression of nuclear magnetic resonance-detected fragment hits.
Eaton, Hugh L; Wyss, Daniel F
2011-01-01
Fragment-based drug discovery (FBDD) has become increasingly popular over the last decade as an alternate lead generation tool to HTS approaches. Several compounds have now progressed into the clinic which originated from a fragment-based approach, demonstrating the utility of this emerging field. While fragment hit identification has become much more routine and may involve different screening approaches, the efficient progression of fragment hits into quality lead series may still present a major bottleneck for the broadly successful application of FBDD. In our laboratory, we have extensive experience in fragment-based NMR screening (SbN) and the subsequent iterative progression of fragment hits using structure-assisted chemistry. To maximize impact, we have applied this approach strategically to early- and high-priority targets, and those struggling for leads. Its application has yielded a clinical candidate for BACE1 and lead series in about one third of the SbN/FBDD projects. In this chapter, we will give an overview of our strategy and focus our discussion on NMR-based FBDD approaches. Copyright © 2011 Elsevier Inc. All rights reserved.
Hill, Benjamin David; Womble, Melissa N; Rohling, Martin L
2015-01-01
This study utilized logistic regression to determine whether performance patterns on Concussion Vital Signs (CVS) could differentiate known groups with either genuine or feigned performance. For the embedded measure development group (n = 174), clinical patients and undergraduate students categorized as feigning obtained significantly lower scores on the overall test battery mean for the CVS, Shipley-2 composite score, and California Verbal Learning Test-Second Edition subtests than did genuinely performing individuals. The final full model of 3 predictor variables (Verbal Memory immediate hits, Verbal Memory immediate correct passes, and Stroop Test complex reaction time correct) was significant and correctly classified individuals in their known group 83% of the time (sensitivity = .65; specificity = .97) in a mixed sample of young-adult clinical cases and simulators. The CVS logistic regression function was applied to a separate undergraduate college group (n = 378) that was asked to perform genuinely and identified 5% as having possibly feigned performance indicating a low false-positive rate. The failure rate was 11% and 16% at baseline cognitive testing in samples of high school and college athletes, respectively. These findings have particular relevance given the increasing use of computerized test batteries for baseline cognitive testing and return-to-play decisions after concussion.
Karnes, Jason H; Shaffer, Christian M; Cronin, Robert; Bastarache, Lisa; Gaudieri, Silvana; James, Ian; Pavlos, Rebecca; Steiner, Heidi E; Mosley, Jonathan D; Mallal, Simon; Denny, Joshua C; Phillips, Elizabeth J; Roden, Dan M
2017-09-01
Heparin-induced thrombocytopenia (HIT) is an unpredictable, life-threatening, immune-mediated reaction to heparin. Variation in human leukocyte antigen (HLA) genes is now used to prevent immune-mediated adverse drug reactions. Combinations of HLA alleles and killer cell immunoglobulin-like receptors (KIR) are associated with multiple autoimmune diseases and infections. The objective of this study is to evaluate the association of HLA alleles and KIR types, alone or in the presence of different HLA ligands, with HIT. HIT cases and heparin-exposed controls were identified in BioVU, an electronic health record coupled to a DNA biobank. HLA sequencing and KIR type imputation using Illumina OMNI-Quad data were performed. Odds ratios for HLA alleles and KIR types and HLA*KIR interactions using conditional logistic regressions were determined in the overall population and by race/ethnicity. Analysis was restricted to KIR types and HLA alleles with a frequency greater than 0.01. The p values for HLA and KIR associations were corrected using a false discovery rate q value. HIT cases and 350 matched controls were identified. No statistical differences in baseline characteristics were observed between cases and controls. The HLA-DRB3*01:01 allele was significantly associated with HIT in the overall population (odds ratio 2.81 [1.57-5.02], p = 2.1×10^-4, q = 0.02) and in individuals with European ancestry, independent of other alleles. No KIR types were associated with HIT, although a significant interaction was observed between KIR2DS5 and the HLA-C1 KIR binding group (p = 0.03). The HLA-DRB3*01:01 allele was identified as a potential risk factor for HIT. This class II HLA gene and allele represent biologically plausible candidates for influencing HIT pathogenesis. We found limited evidence of a role of KIR types in HIT pathogenesis. Replication and further study of the HLA-DRB3*01:01 association is necessary. © 2017 Pharmacotherapy Publications, Inc.
DEFF Research Database (Denmark)
Hansen, Henrik; Tarp, Finn
2001-01-01
This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on ‘good’ policy....... investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regression is used for policy purposes.
Al-Ghraibah, Amani
Solar flares release stored magnetic energy in the form of radiation and can have significant detrimental effects on Earth, including damage to technological infrastructure. Recent work has considered methods to predict future flare activity on the basis of quantitative measures of the solar magnetic field. Accurate advance warning of solar flare occurrence is an area of increasing concern, and much research is ongoing in this area. Our previous work [111] utilized standard pattern recognition and classification techniques to determine (classify) whether a region is expected to flare within a predictive time window, using a Relevance Vector Machine (RVM) classification method. We extracted 38 features describing the complexity of the photospheric magnetic field; the resulting classification metrics provide the baseline against which we compare our new work. We find a true positive rate (TPR) of 0.8, true negative rate (TNR) of 0.7, and true skill score (TSS) of 0.49. This dissertation proposes three basic topics. The first topic is an extension to our previous work [111], where we consider a feature selection method to determine an appropriate feature subset with cross-validation classification based on a histogram analysis of selected features. Classification using the top five features resulting from this analysis yields better classification accuracies across a large unbalanced dataset. In particular, the feature subsets provide better discrimination of the many regions that flare, where we find a TPR of 0.85, a TNR of 0.65 slightly lower than our previous work, and a TSS of 0.5, an improvement compared with our previous work. In the second topic, we study the prediction of solar flare size and time-to-flare using support vector regression (SVR). When we consider flaring regions only, we find an average error in estimating flare size of approximately half a GOES class. When we additionally consider non-flaring regions, we find an increased average
Srinivas, Nuggehally R; Syed, Muzeeb
2016-03-01
Linezolid, an oxazolidinone, was the first in its class to be approved for the treatment of bacterial infections arising from both susceptible and resistant strains of Gram-positive bacteria. Since overt exposure to linezolid may precipitate serious toxicity issues, therapeutic drug monitoring (TDM) may be required in certain situations, especially in patients who are prescribed other co-medications. Using appropriate oral pharmacokinetic data (single dose and steady state) for linezolid, both the maximum plasma drug concentration (Cmax) versus area under the plasma concentration-time curve (AUC) and the minimum plasma drug concentration (Cmin) versus AUC relationships were established by linear regression models. Predictions of the AUC values were performed using published mean/median Cmax or Cmin data and the appropriate regression lines. The quotient of observed and predicted values yielded the fold-difference calculation. The mean absolute error (MAE), root mean square error (RMSE), correlation coefficient (r), and the goodness of the AUC fold prediction were used to evaluate the two models. The Cmax versus AUC and trough plasma concentration (Ctrough) versus AUC models displayed excellent correlation, with r values of >0.9760. However, linezolid AUC values were predicted to be within the narrower boundary of 0.76- to 1.5-fold at a higher percentage by the Ctrough model (78.3%) than by the Cmax model (48.2%). The Ctrough model showed superior correlation of predicted versus observed values and RMSE (r = 0.9031; 28.54%, respectively) compared with the Cmax model (r = 0.5824; 61.34%, respectively). A single-time-point strategy of using the Ctrough level is possible as a prospective tool to measure the AUC of linezolid in the patient population.
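The single-time-point strategy described above is just a fitted line plus a fold-difference check, and can be sketched in a few lines. The concentration/AUC numbers below are synthetic placeholders, not the linezolid data from the paper.

```python
# Single-time-point AUC prediction: fit AUC = a + b*Ctrough by least squares
# on historical pairs, then predict AUC for a new trough level and express
# the check as an observed/predicted fold difference.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

ctrough = [2.0, 3.5, 4.0, 5.5, 6.0, 7.5]         # hypothetical mg/L
auc = [80.0, 135.0, 150.0, 210.0, 225.0, 285.0]  # hypothetical mg*h/L
a, b = fit_line(ctrough, auc)

observed_auc = 190.0
predicted_auc = a + b * 5.0           # AUC predicted from a trough of 5.0
fold = observed_auc / predicted_auc   # 0.76-1.5 is the paper's tight band
```

The fold quotient is the quantity the paper uses to score prediction quality: a value inside the 0.76- to 1.5-fold band counts as an acceptable single-time-point prediction.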
HIT and brain reward function: A case of mistaken identity (theory).
Wright, Cory; Colombo, Matteo; Beard, Alexander
2017-08-01
This paper employs a case study from the history of neuroscience-brain reward function-to scrutinize the inductive argument for the so-called 'Heuristic Identity Theory' (HIT). The case fails to support HIT, illustrating why other case studies previously thought to provide empirical support for HIT also fold under scrutiny. After distinguishing two different ways of understanding the types of identity claims presupposed by HIT and considering other conceptual problems, we conclude that HIT is not an alternative to the traditional identity theory so much as a relabeling of previously discussed strategies for mechanistic discovery. Copyright © 2017. Published by Elsevier Ltd.
Is There an Association Between Heparin-Induced Thrombocytopenia (HIT) and Autoimmune Disease?
Klinkhammer, Brent; Gruchalla, Michael
2018-03-01
Heparin-induced thrombocytopenia (HIT) is a drug-induced, immunoglobulin G mediated autoimmune disorder associated with several negative clinical outcomes including increased morbidity, mortality, and increased medical costs. Previous studies have shown associations among comorbid autoimmune diseases, but little is known about associations between HIT and autoimmunity. To provide clinical data to suggest an association between HIT and autoimmunity. Retrospective chart review of 59 cases with a diagnosis of HIT and 251 matched controls without a HIT diagnosis, comparing the prevalence of autoimmunity in each group. A single, large upper Midwest health care system. Patients with a diagnosis of HIT were significantly more likely to have a comorbid autoimmune disease than those without a HIT diagnosis (55.9% vs 10.8%), were significantly more likely to have a diagnosis of antiphospholipid syndrome (15.3% vs 0.0%), and were significantly older than controls. This study demonstrates an association between HIT and autoimmune disease and suggests a need for more research into the relationship between HIT and autoimmunity. These results could alter the anticoagulation management of venous thromboembolism and acute coronary syndrome in patients with a previously identified autoimmune disease. Copyright© Wisconsin Medical Society.
Modified Regression Correlation Coefficient for Poisson Regression Model
Kaengthong, Nattacha; Domthong, Uthumporn
2017-09-01
This study gives attention to indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often have restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power, defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables with multicollinearity among them. The result shows that the proposed regression correlation coefficient is better than the traditional regression correlation coefficient based on bias and the Root Mean Square Error (RMSE).
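The traditional coefficient the abstract refers to is the correlation between Y and the fitted mean E(Y|X); a minimal numpy-only sketch (the modified coefficient of the paper is not reproduced here, and the simulated data are illustrative):

```python
import numpy as np

def poisson_fit(X, y, iters=25):
    """Fit a Poisson GLM with log link by Newton-Raphson (IRLS)."""
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(iters):
        mu = np.exp(X1 @ beta)                   # current fitted mean E(Y|X)
        grad = X1.T @ (y - mu)                   # score vector
        hess = X1.T @ (X1 * mu[:, None])         # Fisher information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

def regression_corr(X, y, beta):
    """Traditional regression correlation coefficient: corr(Y, E(Y|X))."""
    X1 = np.column_stack([np.ones(len(y)), X])
    mu = np.exp(X1 @ beta)
    return np.corrcoef(y, mu)[0, 1]

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, 200)
y = rng.poisson(np.exp(0.3 + 0.8 * x))           # true model: log-mean 0.3 + 0.8x
beta = poisson_fit(x[:, None], y)
r = regression_corr(x[:, None], y, beta)
```

Because Var(Y|X) grows with the mean in a Poisson model, this correlation is bounded well below 1 even for a perfectly specified model, which is one of the restrictions motivating a modified coefficient.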
Vectors, a tool in statistical regression theory
Corsten, L.C.A.
1958-01-01
Using linear algebra, this thesis developed linear regression analysis including analysis of variance, covariance analysis, special experimental designs, linear and fertility adjustments, and the analysis of experiments at different places and times. The determination of the orthogonal projection, yielding
An Additive-Multiplicative Cox-Aalen Regression Model
DEFF Research Database (Denmark)
Scheike, Thomas H.; Zhang, Mei-Jie
2002-01-01
Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects
Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun
2016-07-01
In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
International Nuclear Information System (INIS)
Timmermann, Beate; Kortmann, Rolf-Dieter; Kuehl, Joachim; Meisner, Christoph; Slavc, Irene; Pietsch, Thorsten; Bamberg, Michael
2000-01-01
Purpose: To evaluate the outcome in children with anaplastic ependymomas after surgery, irradiation, and chemotherapy; and to identify prognostic factors for survival. Methods and Materials: Fifty-five children (n = 27 girls, 28 boys; median age at diagnosis, 6.2 years) with newly diagnosed anaplastic ependymomas were treated in the multicenter, prospective trials HIT 88/89 and HIT 91. Macroscopic complete resection was achieved in 28 patients; 27 patients underwent incomplete resection. All patients received chemotherapy before (n = 40) or after irradiation (n = 15). The irradiation volume encompassed either the neuraxis followed by a boost to the primary tumor site (n = 40) or the tumor region only (n = 13). No radiotherapy was administered in two patients. Results: Median follow-up was 38 months. The overall survival rate at 3 years after surgery was 75.6%. Disease progression occurred in 25 children with local progression occurring in 20. The median time to disease progression was 45 months. The only significant prognostic factor was the extent of resection (estimated progression-free survival [EPFS] after 3 years was 83.3% after complete resection and 38.5% after incomplete resection) and the presence of metastases at the time of diagnosis (0% vs. 65.8% 3-year EPFS in localized tumors). Age, sex, tumor site, mode of chemotherapy, and irradiation volume did not influence survival. Conclusions: Treatment centers should be meticulous about surgery and diagnostic workup. Because the primary tumor region is the predominant site of failure it is important to intensify local treatment. Dose escalation by hyperfractionation or stereotactic radiotherapy might be a promising approach in macroscopically residual disease. The role of adjuvant chemotherapy requires further study
Pelvic rotation torque during fast-pitch softball hitting under three ball height conditions.
Iino, Yoichi; Fukushima, Atsushi; Kojima, Takeji
2014-08-01
The purpose of this study was to investigate the relevance of hip joint angles to the production of the pelvic rotation torque in fast-pitch softball hitting and to examine the effect of ball height on this production. Thirteen advanced female softball players hit stationary balls at three different heights: high, middle, and low. The pelvic rotation torque, defined as the torque acting on the pelvis through the hip joints about the pelvic superior-inferior axis, was determined from the kinematic and force plate data using inverse dynamics. Irrespective of the ball heights, the rear hip extension, rear hip external rotation, front hip adduction, and front hip flexion torques contributed to the production of pelvic rotation torque. Although the contributions of the adduction and external rotation torques at each hip joint were significantly different among the ball heights, the contributions of the front and rear hip joint torques were similar among the three ball heights owing to cancelation of the two torque components. The timings of the peaks of the hip joint torque components were significantly different, suggesting that softball hitters may need to adjust the timings of the torque exertions fairly precisely to rotate the upper body effectively.
Selective histone deacetylase 6 inhibition prolongs survival in a lethal two-hit model.
Cheng, Xin; Liu, Zhengcai; Liu, Baoling; Zhao, Ting; Li, Yongqing; Alam, Hasan B
2015-07-01
Hemorrhagic shock (HS) followed by a subsequent insult ("second hit") often initiates an exaggerated systemic inflammatory response and multiple organ failure. We have previously demonstrated that valproic acid, a pan histone deacetylase inhibitor, could improve survival in a rodent "two-hit" model. In the present study, our goal was to determine whether selective inhibition of histone deacetylase 6 with Tubastatin A (Tub-A) could prolong survival in a two-hit model where HS was followed by sepsis from cecal ligation and puncture (CLP). C57Bl/6J mice were subjected to sublethal HS (30% blood loss) and then randomly divided into two groups (n = 13 per group): a Tub-A group (treatment) and a vehicle (VEH) group (control). The Tub-A group was given an intraperitoneal injection of Tub-A (70 mg/kg) dissolved in dimethyl sulfoxide (DMSO). The VEH group was injected with DMSO (1 μl/g body weight). After 24 h, all mice were subjected to CLP followed immediately by another dose of Tub-A or DMSO. Survival was monitored for 10 d. In a parallel study, peritoneal irrigation fluid and liver tissue from Tub-A- or DMSO-treated mice were collected 3 h after CLP. Enzyme-linked immunosorbent assay was performed to quantify activity of the myeloperoxidase and concentrations of tumor necrosis factor-alpha (TNF-α) and interleukin 6 (IL-6) in the peritoneal irrigation fluid. RNA was isolated from the liver tissue, and real-time polymerase chain reaction was performed to measure relative messenger RNA levels of TNF-α and IL-6. Treatment with Tub-A significantly improved survival compared with that of the control (69.2% versus 15.4%). In addition, Tub-A significantly suppressed myeloperoxidase activity (169.9 ± 8.4 ng/mL versus 70.4 ± 17.4 ng/mL) in this two-hit model. Copyright © 2015 Elsevier Inc. All rights reserved.
Current drive experiments in the HIT-II spherical tokamak
International Nuclear Information System (INIS)
Jarboe, T.R.; Gu, P.; Isso, V.A.; Jewell, P.E.; McCollam, K.J.; Nelson, B.A.; Ramon, R.; Redd, A.J.; Sieck, P.E.; Smith, R.J.; Nagata, M.; Uyama, T.
2001-01-01
The Helicity Injected Torus (HIT) program has made progress in understanding relaxation and helicity injection current drive. Helicity-conserving MHD activity during the inductive (Ohmic) current ramp demonstrates the profile flattening needed for coaxial helicity injection (CHI). Results from cathode and anode central column (CC) CHI pulses are consistent with the electron locking model of current drive from a pure n=1 mode. Finally, low density CHI, compatible with Ohmic operation, has been achieved. Some enhancement of CHI discharges with the application of Ohmic drive is shown. (author)
ANALYSIS OF MUSIC CONCERTS ADOPTING THE MATHEMATICAL MODEL OF HIT PHENOMENA
Kawahata Yasuko; Genda Etsuo; Ishii Akira
2013-01-01
A mathematical model for the hit phenomenon in entertainment within a society is presented as a stochastic process of interactions of human dynamics. In this paper, we analyzed music concerts. The cost of advertising a concert is difficult to know, but the media exposure of the artist can be observed. We attempted to analyze the music concerts themselves by predicting the reputation of artists during the concert tour from this exposure. In this paper, the world's most pop...
Polynomial regression analysis and significance test of the regression function
International Nuclear Information System (INIS)
Gao Zhengming; Zhao Juan; He Shengping
2012-01-01
In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrated the broad usage of the polynomial function and deduced its parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function was derived, considering the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function were applied to the decay heating power of the isotope per kilogram in accordance with the authors' real work. (authors)
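The significance test sketched in the abstract is the standard overall F test carried over from multivariable linear regression; a minimal sketch with synthetic quadratic data (the isotope data of the paper are not reproduced):

```python
import numpy as np

def poly_regression_ftest(x, y, degree):
    """Least-squares polynomial fit plus the overall F statistic of the regression."""
    coeffs = np.polyfit(x, y, degree)
    yhat = np.polyval(coeffs, x)
    n, p = len(y), degree                      # p regression dof (excluding intercept)
    ss_reg = np.sum((yhat - y.mean()) ** 2)    # regression sum of squares
    ss_res = np.sum((y - yhat) ** 2)           # residual sum of squares
    F = (ss_reg / p) / (ss_res / (n - p - 1))  # compare with an F(p, n-p-1) critical value
    return coeffs, F

# Synthetic data following a quadratic trend plus measurement noise
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 30)
y = 5.0 - 0.8 * x + 0.05 * x**2 + rng.normal(0, 0.1, x.size)
coeffs, F = poly_regression_ftest(x, y, degree=2)
```

A large F relative to the F(p, n-p-1) critical value rejects the hypothesis that all polynomial coefficients (beyond the intercept) are zero.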
Foster, Guy M.; Graham, Jennifer L.
2016-04-06
The Kansas River is a primary source of drinking water for about 800,000 people in northeastern Kansas. Source-water supplies are treated by a combination of chemical and physical processes to remove contaminants before distribution. Advanced notification of changing water-quality conditions and cyanobacteria and associated toxin and taste-and-odor compounds provides drinking-water treatment facilities time to develop and implement adequate treatment strategies. The U.S. Geological Survey (USGS), in cooperation with the Kansas Water Office (funded in part through the Kansas State Water Plan Fund), and the City of Lawrence, the City of Topeka, the City of Olathe, and Johnson County Water One, began a study in July 2012 to develop statistical models at two Kansas River sites located upstream from drinking-water intakes. Continuous water-quality monitors have been operated and discrete water-quality samples have been collected on the Kansas River at Wamego (USGS site number 06887500) and De Soto (USGS site number 06892350) since July 2012. Continuous and discrete water-quality data collected during July 2012 through June 2015 were used to develop statistical models for constituents of interest at the Wamego and De Soto sites. Logistic models to continuously estimate the probability of occurrence above selected thresholds were developed for cyanobacteria, microcystin, and geosmin. Linear regression models to continuously estimate constituent concentrations were developed for major ions, dissolved solids, alkalinity, nutrients (nitrogen and phosphorus species), suspended sediment, indicator bacteria (Escherichia coli, fecal coliform, and enterococci), and actinomycetes bacteria. These models will be used to provide real-time estimates of the probability that cyanobacteria and associated compounds exceed thresholds and of the concentrations of other water-quality constituents in the Kansas River. The models documented in this report are useful for characterizing changes
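The logistic exceedance models described above map a continuously monitored surrogate to the probability that a constituent exceeds its threshold; a minimal sketch with simulated data (the surrogate variable and coefficients are hypothetical, not the report's fitted models):

```python
import numpy as np

def fit_logistic(x, y, lr=0.1, iters=5000):
    """Logistic model P(exceedance) = sigmoid(a + b*x), fit by gradient ascent."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(a + b * x)))
        a += lr * np.mean(y - p)          # gradient of the Bernoulli log-likelihood
        b += lr * np.mean((y - p) * x)
    return a, b

# Hypothetical surrogate (e.g., a standardized continuous sensor reading) and
# binary indicator of whether the constituent exceeded its threshold that day
rng = np.random.default_rng(2)
x = rng.normal(0, 1, 500)
y = (rng.uniform(size=500) < 1 / (1 + np.exp(-(-1.0 + 2.0 * x)))).astype(float)

a, b = fit_logistic(x, y)
p_new = 1 / (1 + np.exp(-(a + b * 1.5)))  # estimated exceedance probability at x = 1.5
```

In operation, each new sensor reading is pushed through the fitted curve to give treatment facilities a real-time exceedance probability rather than a delayed lab result.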
Quantum algorithm for linear regression
Wang, Guoming
2017-07-01
We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Unlike previous algorithms, which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in classical form. So by running it once, one completely determines the fitted model and then can use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model, and can handle data sets with nonsparse design matrices. It runs in time poly(log₂(N), d, κ, 1/ɛ), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ɛ is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary. Thus, our algorithm cannot be significantly improved. Furthermore, we also give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding this fit, and can be used to check whether the given data set qualifies for linear regression in the first place.
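For reference, the classical problem the quantum algorithm addresses is plain least squares on an N×d design matrix, with the condition number κ appearing in the runtime bound; a minimal classical sketch (synthetic data, not the paper's algorithm):

```python
import numpy as np

# Synthetic least-squares instance: design matrix A (N x d) and targets b
rng = np.random.default_rng(3)
N, d = 100, 3
A = rng.normal(size=(N, d))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + rng.normal(0, 0.01, N)

# Classical output: the optimal parameters in explicit (classical) form
x_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
kappa = np.linalg.cond(A)        # the κ appearing in the quantum runtime bound

# Quality of fit (cf. the second algorithm in the abstract)
residual = np.linalg.norm(A @ x_fit - b) / np.linalg.norm(b)
```

The classical solve costs roughly O(N·d²); the quantum advantage claimed is the poly(log N) dependence on the data-set size, at the price of polynomial factors in d, κ, and 1/ɛ.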
Capture orbits around asteroids by hitting zero-velocity curves
Wang, Wei; Yang, Hongwei; Zhang, Wei; Ma, Guangfu
2017-12-01
The problem of capturing a spacecraft from a heliocentric orbit into a high parking orbit around binary asteroids is investigated in the current study. To reduce the braking ΔV, a new capture strategy takes advantage of the three-body gravity of the binary asteroid to lower the inertial energy before applying the ΔV. The framework of the circular restricted three-body problem (CR3BP) is employed for the binary asteroid system. The proposed capture strategy is based on the mechanism by which inertial energy can be decreased sharply near zero-velocity curves (ZVCs). The strategy has two steps, namely, hitting the target ZVC and raising the periapsis by a small ΔV at the apoapsis. By hitting the target ZVC, the positive inertial energy decreases and becomes negative. Using a small ΔV, the spacecraft inserts into a bounded orbit around the asteroid. In addition, a rotating mass dipole model is employed for elongated asteroids, which leads to dynamics similar to that of the CR3BP. With this approach, the proposed capture strategy can be applied to elongated asteroids. Numerical simulations validate that the proposed capture strategy is applicable for the binary asteroid 90 Antiope and the elongated asteroid 216 Kleopatra.
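Zero-velocity curves in the CR3BP are level sets of the Jacobi constant with the velocity terms removed; a minimal sketch in nondimensional rotating-frame units (the mass ratio is hypothetical, not that of 90 Antiope):

```python
import numpy as np

def jacobi_constant(state, mu):
    """Jacobi constant in the planar CR3BP (rotating frame, nondimensional units)."""
    x, y, vx, vy = state
    r1 = np.hypot(x + mu, y)           # distance to the primary at (-mu, 0)
    r2 = np.hypot(x - 1 + mu, y)       # distance to the secondary at (1-mu, 0)
    U = 0.5 * (x**2 + y**2) + (1 - mu) / r1 + mu / r2
    return 2 * U - (vx**2 + vy**2)

# On a zero-velocity curve the velocity vanishes, so C_J = 2*U(x, y); a moving
# spacecraft at the same position has a strictly smaller Jacobi constant.
mu = 0.2                               # hypothetical binary-asteroid mass ratio
C_moving = jacobi_constant((0.5, 0.1, 0.3, -0.2), mu)
C_zvc = jacobi_constant((0.5, 0.1, 0.0, 0.0), mu)
```

As the spacecraft approaches its ZVC its speed in the rotating frame tends to zero, which is why a small ΔV applied there (and at apoapsis) suffices to close the orbit.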
Promiscuous 2-aminothiazoles (PrATs): a frequent hitting scaffold.
Devine, Shane M; Mulcair, Mark D; Debono, Cael O; Leung, Eleanor W W; Nissink, J Willem M; Lim, San Sui; Chandrashekaran, Indu R; Vazirani, Mansha; Mohanty, Biswaranjan; Simpson, Jamie S; Baell, Jonathan B; Scammells, Peter J; Norton, Raymond S; Scanlon, Martin J
2015-02-12
We have identified a class of molecules, known as 2-aminothiazoles (2-ATs), as frequent-hitting fragments in biophysical binding assays. This was exemplified by 4-phenylthiazol-2-amine being identified as a hit in 14/14 screens against a diverse range of protein targets, suggesting that this scaffold is a poor starting point for fragment-based drug discovery. This prompted us to analyze this scaffold in the context of an academic fragment library used for fragment-based drug discovery (FBDD) and two larger compound libraries used for high-throughput screening (HTS). This analysis revealed that such "promiscuous 2-aminothiazoles" (PrATs) behaved as frequent hitters under both FBDD and HTS settings, although the problem was more pronounced in the fragment-based studies. As 2-ATs are present in known drugs, they cannot necessarily be deemed undesirable, but the combination of their promiscuity and difficulties associated with optimizing them into a lead compound makes them, in our opinion, poor scaffolds for fragment libraries.
Recursive Algorithm For Linear Regression
Varanasi, S. V.
1988-01-01
Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations, facilitates search for minimum order of linear-regression model fitting set of data satisfactorily.
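The idea of searching for the minimum satisfactory order can be sketched directly (this simple version refits at each order rather than using the recursive coefficient updates of the original algorithm, and the data and tolerance are illustrative):

```python
import numpy as np

def select_order(x, y, max_order=8, tol=1e-3):
    """Fit polynomial models of increasing order; stop when the residual RMS
    stops improving by more than tol, and return the minimum satisfactory order."""
    prev_rms = np.inf
    for order in range(1, max_order + 1):
        coeffs = np.polyfit(x, y, order)
        rms = np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))
        if prev_rms - rms < tol:        # no meaningful improvement: stop
            return order - 1, prev_coeffs
        prev_rms, prev_coeffs = rms, coeffs
    return max_order, coeffs

# Synthetic cubic data: the search should settle at order 3
rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 60)
y = 2.0 + 0.5 * x + 1.2 * x**2 - 1.5 * x**3 + rng.normal(0, 0.01, x.size)
order, coeffs = select_order(x, y)
```

The recursive formulation in the original work avoids the duplicated work of refitting from scratch at each order, but the stopping logic is the same.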
Post-processing through linear regression
van Schaeybroeck, B.; Vannitsem, S.
2011-03-01
Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times the regression schemes (EVMOS, TDTR) which yield the correct variability and the largest correlation between ensemble error and spread, should be preferred.
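The simplest member of this family, OLS post-processing, learns a linear map from raw forecasts to observations on a training period and applies it to new forecasts; a minimal sketch with synthetic, deliberately biased forecasts (illustrative, not the Lorenz '63 setup of the paper):

```python
import numpy as np

# Synthetic training period: observations and biased, over-dispersive forecasts
rng = np.random.default_rng(5)
obs = rng.normal(10, 3, 300)                     # verifying observations
fcst = 1.2 * obs + 2.0 + rng.normal(0, 1, 300)   # raw forecasts with bias and drift

# Learn obs ≈ a + b * fcst by ordinary least squares, then correct the forecasts
b, a = np.polyfit(fcst, obs, 1)
corrected = a + b * fcst

raw_rmse = np.sqrt(np.mean((fcst - obs) ** 2))
corr_rmse = np.sqrt(np.mean((corrected - obs) ** 2))
```

The paper's criticism of plain OLS is visible in this setup: minimizing squared error shrinks the corrected forecast's variance below that of the observations, which is what the variability-preserving schemes (GM, EVMOS, TDTR) are designed to fix.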
On Weighted Support Vector Regression
DEFF Research Database (Denmark)
Han, Xixuan; Clemmensen, Line Katrine Harder
2014-01-01
We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF-weights). This procedure directly shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF-weights) and call the combination of weights the doubly weighted SVR. We illustrate the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate…
SHIELD-HIT12A - a Monte Carlo particle transport program for ion therapy research
DEFF Research Database (Denmark)
Bassler, Niels; Hansen, David Christoffer; Lühr, Armin
2014-01-01
Purpose: The Monte Carlo (MC) code SHIELD-HIT simulates the transport of ions through matter. Since SHIELD-HIT08 we have added numerous features that improve speed, usability and underlying physics, and thereby the user experience. The "-A" fork of SHIELD-HIT also aims to attach SHIELD… It supports native formats compatible with the heavy ion treatment planning system TRiP. Stopping power files follow the ICRU standard and are generated using the libdEdx library, which allows the user to choose from a multitude of stopping power tables. Results: SHIELD-HIT12A runs on Linux and Windows platforms… We experienced that new users quickly learn to use SHIELD-HIT12A and set up new geometries. Contrary to previous versions of SHIELD-HIT, the 12A distribution comes with easy-to-use example files and an English manual. A new implementation of Vavilov straggling resulted in a massive reduction…
Hadyan, Fadhlil; Shaufiah; Arif Bijaksana, Moch.
2017-01-01
Automatic summarization is a system that can help someone take in the core information of a long text instantly by summarizing the text automatically. Many summarization systems have already been developed, but many problems remain. This final project proposes a summarization method using a document index graph. The method adapts the PageRank and HITS formulas, originally used to assess web pages, to assess the words in the sentences of a text document. The expected outcome of this final project is a system that can summarize a single document by utilizing a document index graph with TextRank and HITS to improve the quality of the automatic summary.
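The TextRank side of this adaptation can be sketched as PageRank-style power iteration over a sentence-similarity graph; a simplified version (word-overlap edge weights rather than the document-index-graph construction of the project, and toy sentences):

```python
import numpy as np

def textrank(sentences, d=0.85, iters=50):
    """Score sentences by power iteration over a word-overlap similarity graph
    (TextRank: PageRank adapted from web pages to sentences)."""
    sets = [set(s.lower().split()) for s in sentences]
    n = len(sentences)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                W[i, j] = len(sets[i] & sets[j])   # edge weight = shared words
    col = W.sum(axis=0)
    col[col == 0] = 1.0                            # avoid division by zero for isolated nodes
    P = W / col                                    # column-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * (P @ r)              # damped power iteration
    return r

sents = ["the cat sat on the mat",
         "the dog sat on the log",
         "quantum walks have hitting times"]
scores = textrank(sents)
summary = sents[int(np.argmax(scores))]
```

The highest-scoring sentences form the extractive summary; HITS can be applied to the same graph by iterating hub and authority scores instead of a single rank vector.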
Combining Alphas via Bounded Regression
Directory of Open Access Journals (Sweden)
Zura Kakushadze
2015-11-01
We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.
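Imposing bounds on regression weights turns least squares into a box-constrained problem; a minimal sketch via projected gradient descent (the alpha returns, bounds, and solver are illustrative, not the paper's algorithm):

```python
import numpy as np

def bounded_regression(X, y, lower, upper, lr=0.01, iters=2000):
    """Least-squares weights with box bounds, via projected gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / len(y)          # gradient of mean squared error
        w = np.clip(w - lr * grad, lower, upper)   # project back onto the bounds
    return w

# Hypothetical alpha streams: returns of 4 signals and a combined target
rng = np.random.default_rng(6)
X = rng.normal(size=(500, 4))
y = X @ np.array([0.6, 0.3, 0.1, 0.0]) + rng.normal(0, 0.05, 500)

# Diversification bounds: every alpha weight confined to [0.05, 0.5]
w = bounded_regression(X, y, lower=0.05, upper=0.5)
```

The bounds bind exactly where unconstrained regression would concentrate or zero out weights: the dominant alpha is capped at the upper bound and the useless one is floored at the lower bound, which is the diversification effect the abstract describes.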
Regression in autistic spectrum disorders.
Stefanatos, Gerry A
2008-12-01
A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously-acquired skills. This may involve a loss of speech or social responsivity, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.
Plasma response to sustainment with imposed-dynamo current drive in HIT-SI and HIT-SI3
Hossack, A. C.; Jarboe, T. R.; Chandra, R. N.; Morgan, K. D.; Sutherland, D. A.; Penna, J. M.; Everson, C. J.; Nelson, B. A.
2017-07-01
The helicity injected torus—steady inductive (HIT-SI) program studies efficient, steady-state current drive for magnetic confinement plasmas using a novel experimental method. Stable, high-beta spheromaks have been sustained using steady, inductive current drive. Externally induced loop voltage and magnetic flux are oscillated together so that helicity and power injection are always positive, sustaining the edge plasma current indefinitely. Imposed-dynamo current drive (IDCD) theory further shows that the entire plasma current is sustained. The method is ideal for low aspect ratio, toroidal geometries with closed flux surfaces. Experimental studies of spheromak plasmas sustained with IDCD have shown stable magnetic profiles with evidence of pressure confinement. New measurements show coherent motion of a stable spheromak in response to the imposed perturbations. On the original device two helicity injectors were mounted on either side of the spheromak and the injected mode spectrum was predominantly n = 1. Coherent, rigid motion indicates that the spheromak is stable and a lack of plasma-generated n = 1 energy indicates that the maximum q is maintained below 1 during sustainment. Results from the HIT-SI3 device are also presented. Three inductive helicity injectors are mounted on one side of the spheromak flux conserver. Varying the relative injector phasing changes the injected mode spectrum which includes n = 2, 3, and higher modes.
Linear regression in astronomy. I
Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh
1990-01-01
Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
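The five slope estimators compared in this paper have closed forms in terms of the sample sums of squares; a sketch using the formulas of Isobe et al. (1990), applied to synthetic data (uncertainties on the slopes are omitted here):

```python
import numpy as np

def regression_slopes(x, y):
    """The five line fits of Isobe et al. (1990), returned as slopes of Y vs X."""
    xd, yd = x - x.mean(), y - y.mean()
    sxx, syy, sxy = (xd**2).sum(), (yd**2).sum(), (xd * yd).sum()
    b1 = sxy / sxx                       # OLS(Y|X)
    b2 = syy / sxy                       # OLS(X|Y), expressed as a Y-vs-X slope
    b3 = (b1 * b2 - 1 + np.sqrt((1 + b1**2) * (1 + b2**2))) / (b1 + b2)  # OLS bisector
    b4 = 0.5 * ((b2 - 1 / b1) + np.sign(sxy) * np.sqrt(4 + (b2 - 1 / b1)**2))  # orthogonal
    b5 = np.sign(sxy) * np.sqrt(b1 * b2)  # reduced major axis
    return b1, b2, b3, b4, b5

# Synthetic bivariate data with true slope 2 and scatter only in y
rng = np.random.default_rng(7)
x = rng.normal(0, 1, 400)
y = 2.0 * x + rng.normal(0, 0.5, 400)
slopes = regression_slopes(x, y)
```

With scatter in y only, OLS(Y|X) is unbiased while OLS(X|Y) is steepened; the bisector, orthogonal, and reduced major-axis slopes fall between the two OLS lines, which is why the symmetric estimators are preferred when neither variable is privileged.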
Warkentin, Theodore E; Pai, Menaka; Linkins, Lori-Ann
2017-08-31
Direct oral anticoagulants (DOACs) are attractive options for treatment of heparin-induced thrombocytopenia (HIT). We report our continuing experience in Hamilton, ON, Canada, since January 1, 2015 (when we completed our prospective study of rivaroxaban for HIT), using rivaroxaban for serologically confirmed HIT (4Ts score ≥4 points; positive platelet factor 4 [PF4]/heparin immunoassay, positive serotonin-release assay). We also performed a literature review of HIT treatment using DOACs (rivaroxaban, apixaban, dabigatran, edoxaban). We focused on patients who received DOAC therapy for acute HIT as either primary therapy (group A) or secondary therapy (group B; initial treatment using a non-DOAC/non-heparin anticoagulant with transition to a DOAC during HIT-associated thrombocytopenia). Our primary end point was occurrence of objectively documented thrombosis during DOAC therapy for acute HIT. We found that recovery without new, progressive, or recurrent thrombosis occurred in all 10 Hamilton patients with acute HIT treated with rivaroxaban. Data from the literature review plus these new data identified a thrombosis rate of 1 of 46 patients (2.2%; 95% CI, 0.4%-11.3%) in patients treated with rivaroxaban during acute HIT (group A, n = 25; group B, n = 21); major hemorrhage was seen in 0 of 46 patients. Similar outcomes in smaller numbers of patients were observed with apixaban (n = 12) and dabigatran (n = 11). DOACs offer simplified management of selected patients, as illustrated by a case of persisting (autoimmune) HIT (>2-month platelet recovery with inversely parallel waning of serum-induced heparin-independent serotonin release) with successful outpatient rivaroxaban management of HIT-associated thrombosis. Evidence supporting efficacy and safety of DOACs for acute HIT is increasing, with the most experience reported for rivaroxaban. © 2017 by The American Society of Hematology.
Prenatal cannabis exposure - The "first hit" to the endocannabinoid system.
Richardson, Kimberlei A; Hester, Allison K; McLemore, Gabrielle L
As more states and countries legalize medical and/or adult recreational marijuana use, the incidences of prenatal cannabis exposure (PCE) will likely increase. While young people increasingly view marijuana as innocuous, marijuana preparations have been growing in potency in recent years, potentially creating global clinical, public health, and workforce concerns. Unlike fetal alcohol spectrum disorder, there is no phenotypic syndrome associated with PCE. There is also no preponderance of evidence that PCE causes lifelong cognitive, behavioral, or functional abnormalities, and/or susceptibility to subsequent addiction. However, there is compelling circumstantial evidence, based on the principles of teratology and fetal malprogramming, suggesting that pregnant women should refrain from smoking marijuana. The usage of marijuana during pregnancy perturbs the fetal endogenous cannabinoid signaling system (ECSS), which is present and active from the early embryonic stage, modulating neurodevelopment and continuing this role into adulthood. The ECSS is present in virtually every brain structure and organ system, and there is also evidence that this system is important in the regulation of cardiovascular processes. Endocannabinoids (eCBs) undergird a broad spectrum of processes, including the early stages of fetal neurodevelopment and uterine implantation. Delta-9-tetrahydrocannabinol (THC), the psychoactive chemical in cannabis, enters maternal circulation, and readily crosses the placental membrane. THC binds to CB receptors of the fetal ECSS, altering neurodevelopment and possibly rewiring ECSS circuitry. In this review, we discuss the Double-Hit Hypothesis as it relates to PCE. We contend that PCE, similar to a neurodevelopmental teratogen, delivers the first hit to the ECSS, which is compromised in such a way that a second hit (i.e., postnatal stressors) will precipitate the emergence of a specific phenotype. In summary, we conclude that perturbations of the
Advanced statistics: linear regression, part I: simple linear regression.
Marill, Keith A
2004-01-01
Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
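The method of least squares described here amounts to solving the normal equations; a minimal sketch on an invented clinical-style dataset (all numbers illustrative, not from the article), including a dummy variable of the kind the series discusses:

```python
import numpy as np

# Toy dataset: systolic blood pressure vs. age, plus a dummy
# variable (0 = control, 1 = treated).  All numbers are invented.
age     = np.array([30, 40, 50, 60, 30, 40, 50, 60], float)
treated = np.array([ 0,  0,  0,  0,  1,  1,  1,  1], float)
bp = 100 + 0.5 * age - 5.0 * treated   # noise-free, so the fit is exact

# Method of least squares: solve the normal equations X'X b = X'y.
X = np.column_stack([np.ones_like(age), age, treated])
beta, *_ = np.linalg.lstsq(X, bp, rcond=None)
# beta recovers the intercept, the age slope, and the treatment effect
```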
Dykes, Patricia C; Hurley, Ann C; Brown, Suzanne; Carr, Robyn; Cashen, Margaret; Collins, Rita; Cook, Robyn; Currie, Leanne; Docherty, Charles; Ensio, Anneli; Foster, Joanne; Hardiker, Nicholas R; Honey, Michelle L L; Killalea, Rosaleen; Murphy, Judy; Saranto, Kaija; Sensmeier, Joyce; Weaver, Charlotte
2009-01-01
In 2005, the Healthcare Information and Management Systems Society (HIMSS) Nursing Informatics Community developed a survey, the I-HIT Scale, to measure the impact of health information technology (HIT) on the role of nurses and on interdisciplinary communication in hospital settings. In 2007, nursing informatics colleagues from Australia, England, Finland, Ireland, New Zealand, Scotland, and the United States formed a research collaborative to validate the I-HIT across countries. All teams have completed construct and face validation in their countries. Five of the six teams have initiated reliability testing by practicing nurses. This paper reports the international collaborative's validation of the I-HIT Scale completed to date.

Tracking, aiming, and hitting the UAV with ordinary assault rifle
Racek, František; Baláž, Teodor; Krejčí, Jaroslav; Procházka, Stanislav; Macko, Martin
2017-10-01
The use of small unmanned aerial vehicles (UAVs) is increasing significantly. They are used as carriers of military spy and reconnaissance devices (taking photos, streaming live video, and so on) or as carriers of potentially dangerous cargo (intended for destruction and killing). Both uses create the need to disable the UAV. From the military point of view, disabling a UAV means bringing it down with the weapon of an ordinary soldier, the assault rifle. This task can be challenging for the soldier, who must visually detect and identify the target, track it, and aim at it. The success of the soldier's mission depends not only on these visual tasks but also on the properties of the weapon and ammunition. The paper deals with possible methods of predicting the probability of hitting UAV targets.
Cohen, Jerome D; Aspry, Karen E; Brown, Alan S; Foody, Joanne M; Furman, Roy; Jacobson, Terry A; Karalis, Dean G; Kris-Etherton, Penny M; Laforge, Ralph; O'Toole, Michael F; Scott, Ronald D; Underberg, James A; Valuck, Thomas B; Willard, Kaye-Eileen; Ziajka, Paul E; Ito, Matthew K
2013-01-01
The workshop discussions focused on how low-density lipoprotein cholesterol (LDL-C) goal attainment can be enhanced with the use of health information technology (HIT) in different clinical settings. A gap in LDL-C goal attainment is acknowledged, but because of the passage of the American Recovery & Reinvestment Act and the Health Information Technology for Economic and Clinical Health (HITECH) Act, there is now reason for optimism that this gap can be narrowed. For HIT to be effectively used to achieve treatment goals, it must be implemented in a setting in which the health care team is fully committed to achieving these goals. Implementation of HIT alone has not reduced the gap. It is critical to build an effective management strategy into the HIT platform without increasing the overall work/time burden on staff. By enhancing communication between the health care team and the patient, more timely adjustments to treatment plans can be made, with greater opportunity for LDL-C goal attainment and improved efficiency in the long run. Patients would be encouraged to take a more active role, and support tools are available: the National Lipid Association has developed a toolkit designed to improve patient compliance that could be modified for use in an HIT system. A collaborative approach among nongovernmental organizations such as the National Lipid Association, the National Quality Forum, HIT partners, and other members of the health care industry offers the best opportunity for long-term success and the real possibility that such efforts could be applied to other chronic conditions, for example, diabetes and hypertension. Copyright © 2013 National Lipid Association. Published by Elsevier Inc. All rights reserved.
Kehimkar, Benjamin; Hoggard, Jamin C; Marney, Luke C; Billingsley, Matthew C; Fraga, Carlos G; Bruno, Thomas J; Synovec, Robert E
2014-01-31
There is an increased need to more fully assess and control the composition of kerosene-based rocket propulsion fuels such as RP-1. In particular, it is critical to make better quantitative connections among the following three attributes: fuel performance (thermal stability, sooting propensity, engine specific impulse, etc.), fuel properties (such as flash point, density, kinematic viscosity, net heat of combustion, and hydrogen content), and the chemical composition of a given fuel, i.e., amounts of specific chemical compounds and compound classes present in a fuel as a result of feedstock blending and/or processing. Recent efforts in predicting fuel chemical and physical behavior through modeling put greater emphasis on attaining detailed and accurate fuel properties and fuel composition information. Often, one-dimensional gas chromatography (GC) combined with mass spectrometry (MS) is employed to provide chemical composition information. Building on approaches that used GC-MS, but to glean substantially more chemical information from these complex fuels, we recently studied the use of comprehensive two dimensional (2D) gas chromatography combined with time-of-flight mass spectrometry (GC×GC-TOFMS) using a "reversed column" format: RTX-wax column for the first dimension, and a RTX-1 column for the second dimension. In this report, by applying chemometric data analysis, specifically partial least-squares (PLS) regression analysis, we are able to readily model (and correlate) the chemical compositional information provided by use of GC×GC-TOFMS to RP-1 fuel property information such as density, kinematic viscosity, net heat of combustion, and so on. Furthermore, we readily identified compounds that contribute significantly to measured differences in fuel properties based on results from the PLS models. We anticipate this new chemical analysis strategy will have broad implications for the development of high fidelity composition-property models, leading to an
Jung, Mary E; Bourne, Jessica E; Little, Jonathan P
2014-01-01
Affect experienced during an exercise session is purported to predict future exercise behaviour. Compared to continuous moderate-intensity exercise (CMI), the affective response to continuous vigorous-intensity exercise (CVI) has consistently been shown to be more aversive. The affective response to, and overall tolerability of, high-intensity interval training (HIT) is less studied. To date, there has yet to be a comparison between HIT, CVI, and CMI. The purpose of this study was to compare the tolerability and affective responses during HIT to those during CVI and CMI. This study utilized a repeated-measures, randomized, counter-balanced design. Forty-four participants visited the laboratory on four occasions. Baseline fitness testing was conducted to establish peak power output in watts (W peak). Three subsequent visits involved a single bout of a) HIT, corresponding to 1 minute at ∼100% W peak alternating with 1 minute at ∼20% W peak for 20 minutes, b) CMI, corresponding to ∼40% W peak for 40 minutes, and c) CVI, corresponding to ∼80% W peak for 20 minutes. The order of the sessions was randomized. Affective responses were measured before, during, and after each session. Task self-efficacy, intentions, enjoyment, and preference were measured after sessions. Participants reported greater enjoyment of HIT than of CMI and CVI, with over 50% of participants reporting a preference to engage in HIT as opposed to either CMI or CVI. HIT was considered more pleasurable than CVI after exercise, but less pleasurable than CMI at these times. Despite this, participants reported being just as confident to engage in HIT as they were to engage in CMI, but less confident to engage in CVI. This study highlights the utility of HIT in inactive individuals and suggests that it may be a viable alternative to traditionally prescribed continuous modalities of exercise for promoting self-efficacy and enjoyment of exercise.
Linear regression in astronomy. II
Feigelson, Eric D.; Babu, Gutti J.
1992-01-01
A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.
Introduction to the use of regression models in epidemiology.
Bender, Ralf
2009-01-01
Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed in dependence on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.
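Of the models listed, Poisson regression for rates is perhaps the least familiar; a minimal hand-rolled sketch on invented counts and person-years (Newton's method written out directly rather than any particular package, so the mechanics are visible):

```python
import numpy as np

# Synthetic rates example: cancer cases and person-years by dose level.
dose   = np.array([0, 0, 1, 1, 2, 2], float)
pyears = np.array([1000, 1200, 900, 1100, 800, 950], float)
cases  = np.array([10, 13, 18, 21, 30, 37], float)

X = np.column_stack([np.ones_like(dose), dose])
offset = np.log(pyears)        # offset so the model describes the *rate*
beta = np.zeros(2)
for _ in range(25):            # Newton-Raphson on the Poisson log-likelihood
    mu = np.exp(X @ beta + offset)     # expected counts
    grad = X.T @ (cases - mu)          # score vector
    hess = X.T @ (mu[:, None] * X)     # Fisher information
    beta += np.linalg.solve(hess, grad)

rate_ratio = np.exp(beta[1])   # multiplicative rate change per dose level
```

Exponentiating a coefficient gives a rate ratio, the Poisson analogue of the odds ratio from logistic regression.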
Retro-regression--another important multivariate regression improvement.
Randić, M
2001-01-01
We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.
Energy Technology Data Exchange (ETDEWEB)
Hansen, C., E-mail: hansec@uw.edu [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Columbia University, New York, New York 10027 (United States); Marklin, G. [PSI-Center, University of Washington, Seattle, Washington 98195 (United States); Victor, B. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); Akcay, C. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Jarboe, T. [HIT-SI Group, University of Washington, Seattle, Washington 98195 (United States); PSI-Center, University of Washington, Seattle, Washington 98195 (United States)
2015-04-15
We present simulations of inductive helicity injection in the Helicity Injected Torus with Steady Inductive helicity injection (HIT-SI) device that treats the entire plasma volume in a single dynamic MHD model. A new fully 3D numerical tool, the PSI-center TETrahedral mesh code, was developed that provides the geometric flexibility required for this investigation. Implementation of a zero-β Hall MHD model using PSI-TET will be presented including formulation of a new self-consistent magnetic boundary condition for the wall of the HIT-SI device. Results from simulations of HIT-SI are presented focusing on injector dynamics that are investigated numerically for the first time. Asymmetries in the plasma loading between the two helicity injectors and progression of field reversal in each injector are observed. Analysis indicates cross-coupling between injectors through confinement volume structures. Injector impedance is found to scale with toroidal current at fixed density, consistent with experimental observation. Comparison to experimental data with an injector drive frequency of 14.5 kHz shows good agreement with magnetic diagnostics. Global mode structures from Bi-Orthogonal decomposition agree well with experimental data for the first four modes.
EASY-HIT: HIV full-replication technology for broad discovery of multiple classes of HIV inhibitors.
Kremb, Stephan; Helfer, Markus; Heller, Werner; Hoffmann, Dieter; Wolff, Horst; Kleinschmidt, Andrea; Cepok, Sabine; Hemmer, Bernhard; Durner, Jörg; Brack-Werner, Ruth
2010-12-01
HIV replication assays are important tools for HIV drug discovery efforts. Here, we present a full HIV replication system (EASY-HIT) for the identification and analysis of HIV inhibitors. This technology is based on adherently growing HIV-susceptible cells, with a stable fluorescent reporter gene activated by HIV Tat and Rev. A fluorescence-based assay was designed that measures HIV infection by two parameters relating to the early and the late phases of HIV replication, respectively. Validation of the assay with a panel of nine reference inhibitors yielded effective inhibitory concentrations consistent with published data and allowed discrimination between inhibitors of early and late phases of HIV replication. Finer resolution of the effects of reference drugs on different steps of HIV replication was achieved in secondary time-of-addition assays. The EASY-HIT assay yielded high Z' scores (>0.9) and signal stabilities, confirming its robustness. Screening of the LOPAC(1280) library identified 10 compounds (0.8%), of which eight were known to inhibit HIV, validating the suitability of this assay for screening applications. Studies evaluating anti-HIV activities of natural products with the EASY-HIT technology led to the identification of three novel inhibitory compounds that apparently act at different steps of HIV-1 replication. Furthermore, we demonstrate successful evaluation of plant extracts for HIV-inhibitory activities, suggesting application of this technology for the surveillance of biological extracts with anti-HIV activities. We conclude that the EASY-HIT technology is a versatile tool for the discovery and characterization of HIV inhibitors.
Quantile regression theory and applications
Davino, Cristina; Vistocco, Domenico
2013-01-01
A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues on the validity of the model and diagnostic tools. Each methodological aspect is explored and
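Quantile regression minimizes the pinball (check) loss rather than squared error, which lets different conditional quantiles have different fitted lines. A small sketch on synthetic heteroscedastic data, using SciPy's general-purpose optimizer rather than a dedicated linear-programming solver (all numbers illustrative):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
# Heteroscedastic toy data: spread grows with x, so quantile lines fan out.
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5 + 0.3 * x)

def pinball(params, tau):
    """Check (pinball) loss for the tau-th conditional quantile."""
    a, b = params
    r = y - (a + b * x)
    return np.where(r >= 0, tau * r, (tau - 1) * r).sum()

# Fit the 10th, 50th, and 90th percentile lines.
fits = {tau: minimize(pinball, x0=[0.0, 1.0], args=(tau,),
                      method="Nelder-Mead").x
        for tau in (0.1, 0.5, 0.9)}
```

Because the noise spread increases with x, the fitted 90th-percentile slope exceeds the 10th-percentile slope, which a single least-squares line would hide.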
Lead generation and examples: opinion regarding how to follow up hits.
Orita, Masaya; Ohno, Kazuki; Warizaya, Masaichi; Amano, Yasushi; Niimi, Tatsuya
2011-01-01
In fragment-based drug discovery (FBDD), not only identifying the starting fragment hit to be developed but also generating a drug lead from that starting fragment hit is important. Converting fragment hits to leads is generally similar to a high-throughput screening (HTS) hits-to-leads approach in that properties beyond activity against the target protein, such as selectivity against other targets, absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox), and physicochemical properties, should be taken into account. However, enhancing the potency of the fragment hit is a key requirement in FBDD, unlike in HTS, because initial fragment hits are generally weak. This enhancement is presently achieved by adding chemical groups that bind to additional parts of the target protein or by joining or combining two or more hit fragments; however, strategies for effecting greater improvements in activity are needed. X-ray analysis is a key technology for converting fragments to drug leads. This method makes it clear whether a fragment hit can act as an anchor and provides insight regarding the introduction of functional groups to improve fragment activity. Data on follow-up chemical synthesis of fragment hits have allowed for the differentiation of four different strategies: fragment optimization, fragment linking, fragment self-assembly, and fragment evolution. Here, we discuss our opinion regarding how to follow up on fragment hits, with a focus on the importance of fragment hits as an anchor moiety to so-called hot spots in the target protein, using crystallographic data. Copyright © 2011 Elsevier Inc. All rights reserved.
He, Lu; De Groot, Anne S; Bailey-Kellogg, Chris
2015-11-27
Different types of bacteria face different pressures from the immune system, with those that persist ("hit-and-stay") potentially having to adapt more in order to escape than those prone to short-lived infection ("hit-and-run"), and with commensal bacteria potentially different from both due to additional physical mechanisms for avoiding immune detection. The Janus Immunogenicity Score (JIS) was recently developed to assess the likelihood of T cell recognition of an antigen, using an analysis that considers both binding of a peptide within the antigen by major histocompatibility complex (MHC) and recognition of the peptide:MHC complex by cognate T cell receptor (TCR). This score was shown to be predictive of T effector vs. T regulatory or null responses in experimental data, as well as to distinguish viruses representative of the hit-and-stay vs. hit-and-run phenotypes. Here, JIS-based analyses were conducted in order to characterize the extent to which the pressure to avoid T cell recognition is manifested in genomic differences among representative hit-and-run, hit-and-stay, and commensal bacteria. Overall, extracellular proteins were found to have different JIS profiles from cytoplasmic ones. Contrasting the bacterial groups, extracellular proteins were shown to be quite different across the groups, much more so than intracellular proteins. The differences were evident even at the level of corresponding peptides in homologous protein pairs from hit-and-run and hit-and-stay bacteria. The multi-level analysis of patterns of immunogenicity across different groups of bacteria provides a new way to approach questions of bacterial immune camouflage or escape, as well as to approach the selection and optimization of candidates for vaccine design. Copyright © 2015 Elsevier Ltd. All rights reserved.
Morgan, K. D.; Jarboe, T. R.; Hossack, A. C.; Chandra, R. N.; Everson, C. J.
2017-12-01
The HIT-SI3 experiment uses a set of inductively driven helicity injectors to apply a non-axisymmetric current drive on the edge of the plasma, driving an axisymmetric spheromak equilibrium in a central confinement volume. These helicity injectors drive a non-axisymmetric perturbation that oscillates in time, with the relative temporal phasing of the injectors modifying the mode structure of the applied perturbation. A set of three experimental discharges with different perturbation spectra are modelled using the NIMROD extended magnetohydrodynamics code, and comparisons are made to both magnetic and fluid measurements. These models successfully capture the bulk dynamics of both the perturbation and the equilibrium, though disagreements remain concerning the experimentally measured pressure gradients.
Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.
2013-01-01
Cheney Reservoir, located in south-central Kansas, is the primary water supply for the city of Wichita. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station since 1998 on the North Fork Ninnescah River, the main source of inflow to Cheney Reservoir. Continuously measured water-quality physical properties include streamflow, specific conductance, pH, water temperature, dissolved oxygen, and turbidity. Discrete water-quality samples were collected during 1999 through 2009 and analyzed for sediment, nutrients, bacteria, and other water-quality constituents. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physical properties to compute concentrations of those constituents of interest that are not easily measured in real time because of limitations in sensor technology and fiscal constraints. Regression models were published in 2006 that were based on data collected during 1997 through 2003. This report updates those models using discrete and continuous data collected during January 1999 through December 2009. Models also were developed for four new constituents, including additional nutrient species and indicator bacteria. In addition, a conversion factor of 0.68 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI model 6136 sensor at the North Fork Ninnescah River upstream from Cheney Reservoir site. Newly developed models and 14 years of hourly continuously measured data were used to calculate selected constituent concentrations and loads during January 1999 through December 2012. The water-quality information in this report is important to the city of Wichita because it allows the concentrations of many potential pollutants of interest to Cheney Reservoir, including nutrients and sediment, to be estimated in real time and characterized over conditions and time scales that
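The two steps described above, converting legacy sensor readings and then applying a surrogate regression, can be sketched as follows. The 0.68 conversion factor is from the report; the regression form and its coefficients below are purely illustrative stand-ins:

```python
import math

YSI_6026_TO_6136 = 0.68   # conversion factor reported for this site

def convert_turbidity(t6026):
    """Convert a YSI model 6026 turbidity reading to a model 6136 equivalent."""
    return YSI_6026_TO_6136 * t6026

# Surrogate regression of the general log-log form used for real-time
# computation: log10(concentration) = b0 + b1 * log10(turbidity).
# The coefficients b0 and b1 here are ILLUSTRATIVE, not from the report.
def suspended_sediment(turb_6136, b0=0.8, b1=1.0):
    return 10 ** (b0 + b1 * math.log10(turb_6136))
```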
Heidelberg Ion Therapy Center (HIT): Initial clinical experience in the first 80 patients
Energy Technology Data Exchange (ETDEWEB)
Combs, Stephanie E. (Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany)), E-mail: Stephanie.Combs@med.uni-heidelberg.de; Ellerbrock, Malte; Haberer, Thomas (Heidelberger Ionenstrahl Therapiezentrum (HIT), Im Neuenheimer Feld 450, 69120 Heidelberg (Germany)) (and others)
2010-10-15
The Heidelberg Ion Therapy Center (HIT) started clinical operation in November 2009. In this report we present the first 80 patients treated with proton and carbon ion radiotherapy and describe patient selection, treatment planning and daily treatment for different indications. Patients and methods. Between November 15, 2009 and April 15, 2010, 80 patients were treated at the Heidelberg Ion Therapy Center (HIT) with carbon ion and proton radiotherapy. The main treated indications consisted of skull base chordoma (n = 9) and chondrosarcoma (n = 18), malignant salivary gland tumors (n = 29), chordomas of the sacrum (n = 5), low-grade glioma (n = 3), primary and recurrent malignant astrocytoma and glioblastoma (n = 7), as well as osteosarcoma (n = 3). Of these patients, four pediatric patients aged under 18 years were treated. Results. All patients were treated using the intensity-modulated rasterscanning technique. Seventy-six patients were treated with carbon ions (95%), and four patients were treated with protons. In all patients x-ray imaging was performed prior to each fraction. Treatment concepts were based on the initial experiences with carbon ion therapy at the Gesellschaft fuer Schwerionenforschung (GSI), including carbon-only treatments and carbon-boost treatments with photon-IMRT. The average time per fraction in the treatment room per patient was 29 minutes; for irradiation only, the mean time including all patients was 16 minutes. Position verification was performed prior to every treatment fraction with orthogonal x-ray imaging. Conclusion. Particle therapy was included successfully into the clinical routine at the Dept. of Radiation Oncology in Heidelberg. Numerous clinical trials will subsequently be initiated to precisely define the role of proton and carbon ion radiotherapy in radiation oncology.
Testing discontinuities in nonparametric regression
Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun
2017-01-19
In nonparametric regression, it is often necessary to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method of H.-G. Müller and U. Stadtmüller [Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100].
Logistic Regression: Concept and Application
Cokluk, Omay
2010-01-01
The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…
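A minimal sketch of binary logistic regression used as a group-membership classifier, on synthetic two-group data (scikit-learn assumed; all numbers are illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# Two groups separated along a single predictor (synthetic scores).
x_group0 = rng.normal(0.0, 1.0, 100)
x_group1 = rng.normal(2.5, 1.0, 100)
X = np.concatenate([x_group0, x_group1]).reshape(-1, 1)
y = np.repeat([0, 1], 100)        # dichotomous group membership

clf = LogisticRegression().fit(X, y)
accuracy = clf.score(X, y)        # fraction classified correctly
p_group1 = clf.predict_proba([[2.5]])[0, 1]   # P(group 1 | x = 2.5)
```

The model outputs a probability of membership in each group, which is then thresholded (by default at 0.5) to classify individuals.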
77 FR 57567 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-09-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 50690 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-08-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 15760 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-03-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
76 FR 39108 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-07-05
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS ACTION: Notice of... the public. Name of Committee: HIT Policy Committee. General Function of the Committee: to provide...
77 FR 37407 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-06-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
76 FR 22397 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-04-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 22787 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-04-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 27459 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-05-10
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: to provide recommendations to...
76 FR 28783 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-05-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
76 FR 79684 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-12-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 73660 - HIT Policy Committee Advisory Meetings; Notice of Meetings
2012-12-11
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meetings; Notice of Meetings AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 28881 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-05-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: to provide recommendations to...
76 FR 46298 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-08-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
77 FR 65691 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-10-30
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
76 FR 70455 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-11-14
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: to provide recommendations to...
77 FR 2727 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-01-19
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
75 FR 21629 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2010-04-26
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Office of the National Coordinator for Health Information Technology HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Policy Committee...
76 FR 14975 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-03-18
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: to provide recommendations to...
76 FR 50734 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-08-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
76 FR 55912 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2011-09-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... the public. Name of Committee: HIT Policy Committee. General Function of the Committee: To provide...
2010-02-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Office of the National Coordinator for Health Information Technology; HIT Policy Committee's Adoption/Certification Workgroup Meeting; Notice of Meeting AGENCY: Office... of Committee: HIT Policy Committee's Adoption/Certification Workgroup. General Function of the...
77 FR 41788 - HIT Policy Committee Advisory Meeting; Notice of Meeting
2012-07-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of... of Committee: HIT Policy Committee. General Function of the Committee: To provide recommendations to...
2011-01-10
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of Meeting. This notice announces a... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards...
2010-02-26
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS ACTION: Notice of meeting. This notice announces a... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards...
2010-11-19
... Technology; HIT Policy Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS ACTION: Notice of meeting. This notice announces a... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Policy Committee...
2010-11-19
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of meeting. This notice announces a... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards...
2010-05-27
... Technology: HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of meeting. This notice announces a... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards...
Tulu, Bengisu; Daniels, Susan; Feldman, Sue; Horan, Thomas A
2008-11-06
This exploratory study investigated the impact of incomplete medical evidence on the SSA disability determination process and the role of HIT as a solution. We collected qualitative data from nineteen expert interviews. Findings indicate that HIT can lead to innovative solutions that can significantly improve the determination process.
Dykes, Patricia C; Hurley, Ann; Cashen, Margaret; Bakken, Suzanne; Duffy, Mary E
2007-01-01
The use of health information technology (HIT) for the support of communication processes and data and information access in acute care settings is a relatively new phenomenon. A means of evaluating the impact of HIT in hospital settings is needed. The purpose of this research was to design and psychometrically evaluate the Impact of Health Information Technology scale (I-HIT). I-HIT was designed to measure nurses' perceptions of the ways in which HIT influences interdisciplinary communication and workflow patterns, and nurses' satisfaction with HIT applications and tools. Content for a 43-item tool was derived from the literature, and supported theoretically by the Coiera model and by nurse informaticists. Internal consistency reliability analysis using Cronbach's alpha was conducted on the 43-item scale to initiate the item reduction process. Items with an item-total correlation of less than 0.35 were removed, leaving a total of 29 items. Item analysis, exploratory principal component analysis and internal consistency reliability using Cronbach's alpha were used to confirm the 29-item scale. Principal components analysis with Varimax rotation produced a four-factor solution that explained 58.5% of total variance (general advantages, information tools to support information needs, information tools to support communication needs, and workflow implications). Internal consistency of the total scale was 0.95 and ranged from 0.80 to 0.89 for the four subscales. I-HIT demonstrated psychometric adequacy and is recommended to measure the impact of HIT on nursing practice in acute care settings.
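The item-reduction and reliability steps described in this abstract rest on Cronbach's alpha. Below is a small pure-Python sketch of the standard formula (alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)); the toy item responses are invented for illustration.

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of scores per item, same respondents in each list."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Toy responses from 5 respondents to 3 fairly consistent items.
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
alpha = cronbach_alpha(items)
print(round(alpha, 3))
```

Values near 0.9, as in the scale above, indicate high internal consistency among items.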
2011-01-25
... Technology; HIT Standards Committee Advisory Meeting; Notice of Meeting AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of meeting. This notice announces a... Information Technology (ONC). The meeting will be open to the public. Name of Committee: HIT Standards...
Visual Illusions and the Control of Ball Placement in Goal-Directed Hitting
Caljouw, Simone R.; Van der Kamp, John; Savelsbergh, Geert J. P.
2010-01-01
When hitting, kicking, or throwing balls at targets, online control in the target area is impossible. We assumed this lack of late corrections in the target area would induce an effect of a single-winged Müller-Lyer illusion on ball placement. After extensive practice in hitting balls to different landing locations, participants (N = 9) had to hit…
Fungible weights in logistic regression.
Jones, Jeff A; Waller, Niels G
2016-06-01
In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
International Nuclear Information System (INIS)
Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei
2007-01-01
Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age
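The alternatives discussed in this abstract can be contrasted numerically. The sketch below compares the ordinary least squares slope with the geometric mean regression slope and the orthogonal (Deming-type, equal error variances) slope, using the standard textbook formulas; the toy data are invented for illustration.

```python
import math

def slopes(xs, ys):
    """Return (OLS, geometric-mean, orthogonal) slope estimates."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b_ols = sxy / sxx                                # assumes error in y only
    b_gm = math.copysign(math.sqrt(syy / sxx), sxy)  # treats x and y symmetrically
    b_orth = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return b_ols, b_gm, b_orth

# Toy data with noise in both variables.
xs = [1.0, 2.1, 2.9, 4.2, 5.0]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]
b_ols, b_gm, b_orth = slopes(xs, ys)
print(round(b_ols, 3), round(b_gm, 3), round(b_orth, 3))
```

With measurement error in x, the OLS slope is attenuated toward zero, while the symmetric estimators give steeper slopes; that is the motivation for the comparison made in the paper.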
Tumor regression patterns in retinoblastoma
International Nuclear Information System (INIS)
Zafar, S.N.; Siddique, S.N.; Zaheer, N.
2016-01-01
To observe the types of tumor regression after treatment, and identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, dilated fundus examination under anesthesia was performed to record response to the treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Out of 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 tumors belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of the focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)
PageRank, HITS and a unified framework for link analysis
Energy Technology Data Exchange (ETDEWEB)
Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst
2001-10-01
Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first-order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking, indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
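The claim that both rankings correlate strongly with indegree is easy to check on a toy graph. The following pure-Python sketch runs textbook power-iteration versions of HITS and PageRank; the graph, damping factor, and iteration counts are illustrative assumptions, not the authors' generalized framework.

```python
def hits(adj, iters=50):
    """Power iteration for HITS hub/authority scores on a directed graph."""
    nodes = sorted(adj)
    auth = {u: 1.0 for u in nodes}
    hub = {u: 1.0 for u in nodes}
    for _ in range(iters):
        auth = {u: sum(hub[v] for v in nodes if u in adj[v]) for u in nodes}
        hub = {u: sum(auth[v] for v in adj[u]) for u in nodes}
        na = sum(a * a for a in auth.values()) ** 0.5
        nh = sum(h * h for h in hub.values()) ** 0.5
        auth = {u: a / na for u, a in auth.items()}
        hub = {u: h / nh for u, h in hub.items()}
    return auth, hub

def pagerank(adj, d=0.85, iters=100):
    """Power iteration for PageRank with damping factor d (no dangling nodes)."""
    nodes = sorted(adj)
    n = len(nodes)
    pr = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        pr = {u: (1 - d) / n
                 + d * sum(pr[v] / len(adj[v]) for v in nodes if u in adj[v])
              for u in nodes}
    return pr

# Toy web graph: node 'c' has the highest indegree (a, b, d all link to it).
adj = {'a': ['c'], 'b': ['a', 'c'], 'c': ['a'], 'd': ['c']}
auth, _ = hits(adj)
pr = pagerank(adj)
top_auth = max(auth, key=auth.get)
top_pr = max(pr, key=pr.get)
print(top_auth, top_pr)
```

On this graph both algorithms rank the highest-indegree node first, consistent with the paper's observation that the rankings are dominated by indegree to first order.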
Regularized Label Relaxation Linear Regression.
Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu
2018-04-01
Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs the class compactness graph based on manifold learning and uses it as the regularization item to avoid the problem of overfitting. The class compactness graph is used to ensure that the samples sharing the same labels can be kept close after they are transformed. Two different algorithms, which are, respectively, based on -norm and -norm loss functions are devised. These two algorithms have compact closed-form solutions in each iteration so that they are easily implemented. Extensive experiments show that these two algorithms outperform the state-of-the-art algorithms in terms of the classification accuracy and running time.
Regression to Causality : Regression-style presentation influences causal attribution
DEFF Research Database (Denmark)
Bordacconi, Mats Joe; Larsen, Martin Vinæs
2014-01-01
Our experiment implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally.
Regression analysis with categorized regression calibrated exposure: some interesting findings
Directory of Open Access Journals (Sweden)
Hjartåker Anette
2006-07-01
Background: Regression calibration as a method for handling measurement error is becoming increasingly well-known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g., quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods: We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on the quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results: In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution; thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is, however, vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion: Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a
Creating a safe place for pediatric care: A no hit zone.
Frazier, Erin R; Liu, Gilbert C; Dauk, Kelly L
2014-07-01
Our goal was to create and implement a program, Kosair Children's Hospital's No Hit Zone, which trains health care workers in de-escalation techniques to address parental disruptive behaviors and physical discipline of children commonly encountered in the hospital environment. The Child Abuse Task Force, a multidisciplinary group, along with key hospital administrators developed specific content for the policy, as well as marketing and educational materials. The No Hit Zone policy designates Kosair Children's Hospital as "an environment in which no adult shall hit a child, no adult shall hit another adult, no child shall hit an adult, and no child shall hit another child. When hitting is observed, it is everyone's responsibility to interrupt the behavior as well as communicate system policy to those present." Via a multidisciplinary, collaborative approach, the No Hit Zone was successfully implemented at Kosair Children's Hospital in 2012. Cost was nominal, and the support of key hospital administrators was critical to the program's success. Education of health professionals on de-escalation techniques and intervention with families at the early signs of parental stress occurred via live sessions and online training via case-based scenarios. The No Hit Zone is an important program used to provide a safe and caring environment for all families and staff of Kosair Children's Hospital. Demand for the program continues, demonstrated by the establishment of No Hit Zones at other local hospitals and multiple outpatient clinics. This article offers information for other organizations planning to conduct similar initiatives. Copyright © 2014 by the American Academy of Pediatrics.
Advanced statistics: linear regression, part II: multiple linear regression.
Marill, Keith A
2004-01-01
The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
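As a minimal numerical companion to this abstract, the sketch below fits a two-predictor multiple linear regression by solving the normal equations with Gaussian elimination. The code is pure Python, and the toy data are constructed so the fit recovers the generating coefficients exactly.

```python
def fit_multiple(X, y):
    """Solve the normal equations (X'X)b = X'y by Gauss-Jordan elimination.
    X is a list of predictor rows; an intercept column of 1s is prepended."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    a = [row[:] + [b] for row, b in zip(xtx, xty)]  # augmented matrix
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(a[r][col]))  # partial pivoting
        a[col], a[piv] = a[piv], a[col]
        for r in range(k):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * z for x, z in zip(a[r], a[col])]
    return [a[i][k] / a[i][i] for i in range(k)]

# Toy outcome generated exactly as y = 2 + 3*x1 - x2, so the fit is exact.
X = [(1, 2), (2, 1), (3, 5), (4, 3), (5, 4)]
y = [2 + 3 * x1 - x2 for x1, x2 in X]
b = fit_multiple(X, y)
print([round(v, 6) for v in b])
```

The returned coefficients are the intercept followed by one slope per predictor, mirroring the "multiple independent predictor variables, single dependent outcome" setup the article describes.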
Regression analysis using dependent Polya trees.
Schörgendorfer, Angela; Branscum, Adam J
2013-11-30
Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.
Fast multi-output relevance vector regression
Ha, Youngmin
2017-01-01
This paper aims to decrease the time complexity of multi-output relevance vector regression from O(VM^3) to O(V^3+M^3), where V is the number of output dimensions, M is the number of basis functions, and V
Stochastic, weighted hit size theory of cellular radiobiological action
International Nuclear Information System (INIS)
Bond, V.P.; Varma, M.N.
1982-01-01
A stochastic theory that appears to account well for the observed responses of cell populations exposed in radiation fields of different qualities and for different durations of exposure is described. The theory appears to explain well most cellular radiobiological phenomena observed in at least autonomous cell systems, and argues for the use of fluence rate (phi) instead of absorbed dose for quantification of the amount of radiation involved in low-level radiation exposure. With or without invoking the cell sensitivity function, the conceptual improvement would be substantial. The approach suggested also shows that the absorbed dose-cell response functions currently employed do not reflect the spectrum of cell sensitivities to increasing cell doses of a single agent, nor can RBE represent the potency ratio for different agents that can produce similar quantal responses. Thus, for accurate comparison of cell sensitivities among different cells in the same individual, or between the cells in different kinds of individuals, it is necessary to quantify cell sensitivity in terms of the hit size weighting or cell sensitivity function introduced here. Similarly, this function should be employed to evaluate the relative potency of radiation and other radiomimetic chemical or physical agents.
The MIDAS Touch: Mixed Data Sampling Regression Models
Ghysels, Eric; Santa-Clara, Pedro; Valkanov, Rossen
2004-01-01
We introduce Mixed Data Sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Technically speaking, MIDAS models specify conditional expectations as a distributed lag of regressors recorded at some higher sampling frequencies. We examine the asymptotic properties of MIDAS regression estimation and compare it with traditional distributed lag models. MIDAS regressions have wide applicability in macroeconomics and finance.
Abstract Expression Grammar Symbolic Regression
Korns, Michael F.
This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine which has absolutely no bloat, which allows total user control of the search space and output formulas, and which is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, aged layered populations, plus discrete and continuous differential evolution are used to produce an improved symbolic regression system. Nine base test cases, from the literature, are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial-strength symbolic regression systems.
Quantile Regression With Measurement Error
Wei, Ying; Carroll, Raymond J.
2009-01-01
... The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a
From Rasch scores to regression
DEFF Research Database (Denmark)
Christensen, Karl Bang
2006-01-01
Rasch models provide a framework for measurement and modelling latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties. This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables and latent regression models based on the distribution of the score.
Testing Heteroscedasticity in Robust Regression
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2011-01-01
Vol. 1, No. 4 (2011), pp. 25-28, ISSN 2045-3345. Grant - others: GA ČR(CZ) GA402/09/0557. Institutional research plan: CEZ:AV0Z10300504. Keywords: robust regression * heteroscedasticity * regression quantiles * diagnostics. Subject RIV: BB - Applied Statistics, Operational Research. http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf
Regression methods for medical research
Tai, Bee Choo
2013-01-01
Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demand an understanding of more complex and sophisticated analytic procedures. The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the
Kempe, P T; van Oppen, P; de Haan, E; Twisk, J W R; Sluis, A; Smit, J H; van Dyck, R; van Balkom, A J L M
2007-09-01
Two methods for predicting remissions in obsessive-compulsive disorder (OCD) treatment are evaluated. Y-BOCS measurements of 88 patients with a primary OCD (DSM-III-R) diagnosis were performed over a 16-week treatment period, and during three follow-ups. Remission at any measurement was defined as a Y-BOCS score lower than thirteen combined with a reduction of seven points when compared with baseline. Logistic regression models were compared with a Cox regression for recurrent events model. Logistic regression yielded different models at different evaluation times. The recurrent events model remained stable when fewer measurements were used. Higher baseline levels of neuroticism and more severe OCD symptoms were associated with a lower chance of remission, early age of onset and more depressive symptoms with a higher chance. Choice of outcome time affects logistic regression prediction models. Recurrent events analysis uses all information on remissions and relapses. Short- and long-term predictors for OCD remission show overlap.
Warkentin, Theodore E; Sheppard, Jo-Ann I; Chu, F Victor; Kapoor, Anil; Crowther, Mark A; Gangji, Azim
2015-01-01
Repeated therapeutic plasma exchange (TPE) has been advocated to remove heparin-induced thrombocytopenia (HIT) IgG antibodies before cardiac/vascular surgery in patients who have serologically-confirmed acute or subacute HIT; for this situation, a negative platelet activation assay (eg, platelet serotonin-release assay [SRA]) has been recommended as the target serological end point to permit safe surgery. We compared reactivities in the SRA and an anti-PF4/heparin IgG-specific enzyme immunoassay (EIA), testing serial serum samples in a patient with recent (subacute) HIT who underwent serial TPE precardiac surgery, as well as for 15 other serially-diluted HIT sera. We observed that post-TPE/diluted HIT sera-when first testing SRA-negative-continue to test strongly positive by EIA-IgG. This dissociation between the platelet activation assay and a PF4-dependent immunoassay for HIT antibodies indicates that patients with subacute HIT undergoing repeated TPE before heparin reexposure should be tested by serial platelet activation assays even when their EIAs remain strongly positive. © 2015 by The American Society of Hematology.
Throat hit in users of the electronic cigarette: An exploratory study.
Etter, Jean-François
2016-02-01
A cross-sectional survey on the Internet in 2012-2014 was used to study the "throat hit," the specific sensation in the throat felt by users of e-cigarettes. Participants were 1672 current users of e-cigarettes, visitors of Websites dedicated to e-cigarettes and to smoking cessation. It was assessed whether the strength of the throat hit was associated with the characteristics of e-cigarettes and e-liquids, modifications of the devices, patterns of use, reasons for use, satisfaction with e-cigarettes, dependence on e-cigarettes, smoking behavior, and perceived effects on smoking. The strongest throat hit was obtained by using better-quality models and liquids with high nicotine content. Those who reported a "very strong" throat hit used liquids with 17.3 mg/mL nicotine, versus 7.1 mg/mL for those reporting a "very weak" hit (p < 0.001). There is a trade-off between e-cigarette models that provide high levels of nicotine, a strong throat hit, high satisfaction, and more effects on smoking, but may also be addictive, and models that contain less nicotine and are less addictive, but produce a weaker throat hit, are less satisfactory, and are possibly less efficient at helping people quit smoking. This trade-off must be kept in mind when regulating e-cigarettes. (c) 2016 APA, all rights reserved).
International Nuclear Information System (INIS)
Fritsch, P.
2006-01-01
in each region calculated with the model was considered as a deposition probability). The total number of the aerosol particles increases when the specific activity of the oxide decreases (3 × 10³, 1 × 10⁶ and 2 × 10¹¹). The number of particles deposited was 6 × 10⁶, 37 and 0.1 in E.T.(seq) and 1 × 10⁷, 50 and 0.1 in B.B.(seq), for ²³⁸U, ²³⁹Pu and ²³⁸Pu, respectively. Uncertainties in these values were negligible for ²³⁸U, with a standard error of 15-18% for ²³⁹Pu; for ²³⁸Pu, no deposition occurred for most of the 1000 simulations performed. Second, the residence time of each particle was calculated assuming that the rate constant of the clearance model for particle transport corresponded to a probability to leave the sequestered region. Third, the dosimetric model of ICRP was applied to characterize the α hit distribution, assuming that, each day during the presence of 1 particle, it was randomly located within the sequestered layer, and it irradiated cells which were not previously exposed. Finally, dose calculation was performed for each simulated exposure. After U exposure, most of the basal cells received only 1 α hit, whereas, for Pu, when a significant irradiation occurred, most of the dose was delivered to cells exposed to several α hits. Thus, it appears that the modification of the effective dose calculation recommended by ICRP when the maximal equivalent dose is delivered to E.T. cannot be applied to actinide oxides for a realistic risk assessment. Further studies are in progress to characterize the α hit distribution within the target cells of the other respiratory tract regions. (author)
On Solving Lq-Penalized Regressions
Directory of Open Access Journals (Sweden)
Tracy Zhou Wu
2007-01-01
Full Text Available Lq-penalized regression arises in multidimensional statistical modelling where all or part of the regression coefficients are penalized to achieve both accuracy and parsimony of statistical models. There is often substantial computational difficulty except in the quadratic penalty case, partly due to the nonsmoothness of the objective function inherited from the use of the absolute value. We propose a new solution method for the general Lq-penalized regression problem based on a space transformation, which enables efficient optimization algorithms. The new method has immediate applications in statistics, notably in penalized spline smoothing problems. In particular, the LASSO problem is shown to be polynomial-time solvable. Numerical studies show the promise of our approach.
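The L1 special case mentioned above (LASSO) is the easiest place to see the nonsmoothness issue. The following is a generic textbook proximal-gradient (ISTA) sketch, not the space-transformation method the paper proposes; the step-size rule is a deliberately conservative assumption:

```python
def soft_threshold(z, t):
    """Proximal operator of t*|.|: shrink z toward zero by t."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_ista(X, y, lam, step=None, iters=2000):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient (ISTA).
    X is a list of rows; a crude step size 1/trace(X'X) is used, which is
    always <= 1/L for the smooth part's Lipschitz constant L."""
    n, p = len(X), len(X[0])
    if step is None:
        step = 1.0 / sum(X[i][j] ** 2 for i in range(n) for j in range(p))
    b = [0.0] * p
    for _ in range(iters):
        # gradient of the smooth part: -X'(y - Xb)
        r = [y[i] - sum(X[i][j] * b[j] for j in range(p)) for i in range(n)]
        g = [-sum(X[i][j] * r[i] for i in range(n)) for j in range(p)]
        # gradient step on the smooth part, then soft-threshold for the L1 part
        b = [soft_threshold(b[j] - step * g[j], step * lam) for j in range(p)]
    return b
```

With a small penalty the fit approaches ordinary least squares; with a very large penalty every coefficient is shrunk exactly to zero, which is the behavior the absolute-value penalty is used for.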
Vaeth, Michael; Skovlund, Eva
2004-06-15
For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
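A sketch of the shortcut described above, for the logistic case: the equivalent two-sample problem uses log-odds that differ by slope × 2 × SD(x) while keeping the overall event probability roughly unchanged, then applies the usual normal-approximation formula for two proportions. The exact constants and any corrections used in the paper may differ:

```python
import math

def expit(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_n_per_group(slope, sd_x, p_overall,
                         z_alpha=1.959964, z_beta=0.841621):
    """Approximate per-group n for detecting `slope` in logistic regression by
    mapping to a two-sample comparison of proportions (a sketch of the idea
    described above; defaults give two-sided alpha=0.05, power=0.80)."""
    delta = slope * 2.0 * sd_x                        # log-odds difference
    logit = math.log(p_overall / (1.0 - p_overall))   # overall event logit
    # split the overall logit so the two groups straddle it by delta/2,
    # keeping the expected overall event probability roughly unchanged
    p1 = expit(logit - delta / 2.0)
    p2 = expit(logit + delta / 2.0)
    var = p1 * (1.0 - p1) + p2 * (1.0 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * var / (p1 - p2) ** 2)
```

As expected, a larger slope (stronger effect) yields a smaller required sample size.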
Lower bounds on the periodic Hamming correlations of frequency hopping sequences with low hit zone
Institute of Scientific and Technical Information of China (English)
无
2006-01-01
In this paper, several periodic Hamming correlation lower bounds for frequency hopping sequences with low hit zone, with respect to the size p of the frequency slot set, the sequence length L, the family size M, low hit zone LH ( or no hit zone NH ), the maximum periodic Hamming autocorrelation sidelobe Ha and the maximum periodic Hamming crosscorrelation Hc, are established. It is shown that the new bounds include the known Lempel-Greenberger bounds, T.S. Seay bounds and Peng-Fan bounds for the conventional frequency hopping sequences as special cases.
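The correlation quantities the bounds refer to are straightforward to compute. A small sketch (the bound formulas themselves are not reproduced here):

```python
def periodic_hamming_correlation(u, v, tau):
    """Number of coincidences between u and v cyclically shifted by tau."""
    L = len(u)
    return sum(1 for i in range(L) if u[i] == v[(i + tau) % L])

def max_autocorrelation_sidelobe(u, zone=None):
    """H_a over nonzero shifts; restricting to shifts <= zone gives a
    low-hit-zone check."""
    L = len(u)
    shifts = range(1, L) if zone is None else range(1, zone + 1)
    return max(periodic_hamming_correlation(u, u, t) for t in shifts)
```

For a sequence that uses each frequency slot exactly once, every nonzero shift has zero hits, while a shift of zero (or a full cyclic alignment between two shifted copies) gives the sequence length.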
Cohen, K Bretonnel; Fort, Karën; Adda, Gilles; Zhou, Sophia; Farri, Dimeji
2016-05-01
Ethical issues reported with paid crowdsourcing include unfairly low wages. It is assumed that such issues are under the control of the task requester. Can one control the amount that a worker earns by controlling the amount that one pays? 412 linguistic data development tasks were submitted to Amazon Mechanical Turk. The pay per HIT was manipulated through a range of values. We examined the relationship between the pay that is offered per HIT and the effective pay rate. There is no such relationship. Paying more per HIT does not cause workers to earn more: the higher the pay per HIT, the more time workers spend on them (R = 0.92), so the effective hourly rate stays roughly the same. The finding has clear implications for language resource builders who want to behave ethically: other means must be found in order to compensate workers fairly. The findings of this paper should not be taken as an endorsement of unfairly low pay rates for crowdsourcing workers. Rather, the intention is to point out that additional measures, such as pre-calculating and communicating to the workers an average hourly rate rather than a per-task rate, must be found in order to ensure an ethical rate of pay.
Logistic regression for dichotomized counts.
Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W
2016-12-01
Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
Demonstration of a Fiber Optic Regression Probe
Korman, Valentin; Polzin, Kurt A.
2010-01-01
The capability to provide localized, real-time monitoring of material regression rates in various applications has the potential to provide a new stream of data for development testing of various components and systems, as well as serving as a monitoring tool in flight applications. These applications include, but are not limited to, the regression of a combusting solid fuel surface, the ablation of the throat in a chemical rocket or the heat shield of an aeroshell, and the monitoring of erosion in long-life plasma thrusters. The rate of regression in the first application is very fast, while the second and third are increasingly slower. A recent fundamental sensor development effort has led to a novel regression, erosion, and ablation sensor technology (REAST). The REAST sensor allows for measurement of real-time surface erosion rates at a discrete surface location. The sensor is optical, using two different, co-located fiber-optics to perform the regression measurement. The disparate optical transmission properties of the two fiber-optics make it possible to measure the regression rate by monitoring the relative light attenuation through the fibers. As the fibers regress along with the parent material in which they are embedded, the relative light intensities through the two fibers change, providing a measure of the regression rate. The optical nature of the system makes it relatively easy to use in a variety of harsh, high temperature environments, and it is also unaffected by the presence of electric and magnetic fields. In addition, the sensor could be used to perform optical spectroscopy on the light emitted by a process and collected by fibers, giving localized measurements of various properties. The capability to perform an in-situ measurement of material regression rates is useful in addressing a variety of physical issues in various applications. An in-situ measurement allows for real-time data regarding the erosion rates, providing a quick method for
Producing The New Regressive Left
DEFF Research Database (Denmark)
Crone, Christine
members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace… the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media- and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired… becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political, strategic resistance…
Nooren, G.
2004-01-01
The T10 beam produces a few hits per event. In ALICE the SSD will have to cope with many hits per strip. In the three centimeters of aluminium the beam will produce many secondary particles. This increases the chance of multiple hits per strip, although not to the level in ALICE.
A Matlab program for stepwise regression
Directory of Open Access Journals (Sweden)
Yanhong Qi
2016-03-01
Full Text Available Stepwise linear regression is a multi-variable regression technique for identifying statistically significant variables in the linear regression equation. In the present study, we present a Matlab program for stepwise regression.
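The record does not reproduce the Matlab program itself. As a rough, hedged Python analogue (greedy forward selection by residual sum of squares; a statistics package would normally use F-tests or an information criterion instead):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def sse_for(X, y, cols):
    """Residual sum of squares of OLS on the given columns (plus intercept),
    fitted via the normal equations."""
    Z = [[1.0] + [X[i][j] for j in cols] for i in range(len(X))]
    p = len(Z[0])
    XtX = [[sum(Z[i][a] * Z[i][b] for i in range(len(Z))) for b in range(p)]
           for a in range(p)]
    Xty = [sum(Z[i][a] * y[i] for i in range(len(Z))) for a in range(p)]
    beta = solve(XtX, Xty)
    return sum((y[i] - sum(Z[i][a] * beta[a] for a in range(p))) ** 2
               for i in range(len(Z)))

def forward_stepwise(X, y, min_improvement=1e-6):
    """Greedy forward selection: repeatedly add the column that most
    reduces SSE, stopping when the improvement becomes negligible."""
    selected, remaining = [], list(range(len(X[0])))
    best_sse = sse_for(X, y, [])
    while remaining:
        cand = min(remaining, key=lambda j: sse_for(X, y, selected + [j]))
        new_sse = sse_for(X, y, selected + [cand])
        if best_sse - new_sse < min_improvement:
            break
        selected.append(cand)
        remaining.remove(cand)
        best_sse = new_sse
    return selected
```

On data generated from two of three candidate columns, the procedure recovers exactly those two columns.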
Correlation and simple linear regression.
Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G
2003-06-01
In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
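The two coefficients and the regression fit discussed above can be illustrated in a few lines of plain Python (a sketch; the rank computation below ignores ties, which a full implementation such as `scipy.stats.spearmanr` handles properly):

```python
def pearson(x, y):
    """Pearson correlation coefficient: linear association."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def ranks(v):
    """Ranks 1..n (no tie handling in this sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation of the ranks (monotone association)."""
    return pearson(ranks(x), ranks(y))

def simple_linear_regression(x, y):
    """Least-squares intercept and slope for y ≈ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return my - b * mx, b
```

A cubic relationship illustrates the distinction: Spearman rho is exactly 1 for any increasing relationship, while Pearson is 1 only for a linear one.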
Regression filter for signal resolution
International Nuclear Information System (INIS)
Matthes, W.
1975-01-01
The problem considered is that of resolving a measured pulse height spectrum of a material mixture, e.g. a gamma ray spectrum or Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to that of a multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by transforming the procedure into a computer programme, which was used to unfold test spectra obtained by mixing some spectra from a library of arbitrarily chosen spectra and adding a noise component. (U.K.)
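A toy version of the unfolding step described above, assuming only two library spectra and a plain (non-stepwise) least-squares fit via the normal equations, solved in closed form:

```python
def unfold_two_components(mixture, s1, s2):
    """Least-squares weights (w1, w2) with mixture ≈ w1*s1 + w2*s2.
    The 2x2 normal equations are solved in closed form via Cramer's rule."""
    a11 = sum(a * a for a in s1)
    a22 = sum(b * b for b in s2)
    a12 = sum(a * b for a, b in zip(s1, s2))
    b1 = sum(a * m for a, m in zip(s1, mixture))
    b2 = sum(b * m for b, m in zip(s2, mixture))
    det = a11 * a22 - a12 * a12
    return (b1 * a22 - b2 * a12) / det, (b2 * a11 - b1 * a12) / det
```

With a noise-free synthetic mixture the true weights are recovered exactly; the stepwise procedure in the paper additionally decides which library spectra to include at all.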
Nonparametric Mixture of Regression Models.
Huang, Mian; Li, Runze; Wang, Shaoli
2013-07-01
Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.
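The estimation idea can be illustrated with a toy EM algorithm for a two-component mixture of simple linear regressions with equal variances, using random restarts as is standard for EM. This plain-parametric sketch does not reproduce the paper's nonparametric, kernel-based procedure:

```python
import math
import random

def em_mixture_of_lines(x, y, iters=200, seed=0):
    """One EM run; returns ((a0, b0), (a1, b1), pi, log_likelihood)."""
    rng = random.Random(seed)
    r = [rng.random() for _ in x]        # responsibilities for component 0
    pi, s2 = 0.5, 1.0
    a, b = [0.0, 0.0], [0.0, 0.0]
    for _ in range(iters):
        # M-step: weighted least-squares line for each component
        for k, w in ((0, r), (1, [1.0 - ri for ri in r])):
            W = sum(w)
            if W < 1e-9:
                continue                 # degenerate component; keep old line
            mx = sum(wi * xi for wi, xi in zip(w, x)) / W
            my = sum(wi * yi for wi, yi in zip(w, y)) / W
            sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
            if sxx < 1e-12:
                continue
            sxy = sum(wi * (xi - mx) * (yi - my)
                      for wi, xi, yi in zip(w, x, y))
            b[k] = sxy / sxx
            a[k] = my - b[k] * mx
        pi = sum(r) / len(r)
        s2 = max(sum(ri * (yi - a[0] - b[0] * xi) ** 2
                     + (1.0 - ri) * (yi - a[1] - b[1] * xi) ** 2
                     for ri, xi, yi in zip(r, x, y)) / len(x), 1e-4)
        # E-step: posterior probability of component 0 for each point
        for i, (xi, yi) in enumerate(zip(x, y)):
            d0 = pi * math.exp(-(yi - a[0] - b[0] * xi) ** 2 / (2.0 * s2))
            d1 = (1.0 - pi) * math.exp(-(yi - a[1] - b[1] * xi) ** 2 / (2.0 * s2))
            r[i] = d0 / (d0 + d1) if d0 + d1 > 0.0 else 0.5
    c = 1.0 / math.sqrt(2.0 * math.pi * s2)
    ll = sum(math.log(max(
        pi * c * math.exp(-(yi - a[0] - b[0] * xi) ** 2 / (2.0 * s2))
        + (1.0 - pi) * c * math.exp(-(yi - a[1] - b[1] * xi) ** 2 / (2.0 * s2)),
        1e-300)) for xi, yi in zip(x, y))
    return (a[0], b[0]), (a[1], b[1]), pi, ll

def fit_mixture(x, y, restarts=5):
    """Keep the best of several random restarts (standard EM practice)."""
    return max((em_mixture_of_lines(x, y, seed=s) for s in range(restarts)),
               key=lambda fit: fit[3])
```

On well-separated synthetic data from two crossing lines, the recovered slopes approach the true values; the paper's contribution is to let the component regression functions be nonparametric and to show the modified EM retains the ascent property asymptotically.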
Efficiency Improvement of HIT Solar Cells on p-Type Si Wafers.
Wei, Chun-You; Lin, Chu-Hsuan; Hsiao, Hao-Tse; Yang, Po-Chuan; Wang, Chih-Ming; Pan, Yen-Chih
2013-11-22
Single crystal silicon solar cells are still predominant in the market due to the abundance of silicon on earth and their acceptable efficiency. Different solar-cell structures of single crystalline Si have been investigated to boost efficiency; the heterojunction with intrinsic thin layer (HIT) structure is currently the leading technology. The record efficiency values of state-of-the-art HIT solar cells have always been based on n-type single-crystalline Si wafers. Improving the efficiency of cells based on p-type single-crystalline Si wafers could provide broader options for the development of HIT solar cells. In this study, we varied the thickness of the intrinsic hydrogenated amorphous Si layer to improve the efficiency of HIT solar cells on p-type Si wafers.
High-Throughput Screening and Hit Validation of Extracellular-Related Kinase 5 (ERK5) Inhibitors.
Myers, Stephanie M; Bawn, Ruth H; Bisset, Louise C; Blackburn, Timothy J; Cottyn, Betty; Molyneux, Lauren; Wong, Ai-Ching; Cano, Celine; Clegg, William; Harrington, Ross W; Leung, Hing; Rigoreau, Laurent; Vidot, Sandrine; Golding, Bernard T; Griffin, Roger J; Hammonds, Tim; Newell, David R; Hardcastle, Ian R
2016-08-08
The extracellular-related kinase 5 (ERK5) is a promising target for cancer therapy. A high-throughput screen was developed for ERK5, based on the IMAP FP progressive binding system, and used to identify hits from a library of 57 617 compounds. Four distinct chemical series were evident within the screening hits. Resynthesis and reassay of the hits demonstrated that one series did not return active compounds, whereas three series returned active hits. Structure-activity studies demonstrated that the 4-benzoylpyrrole-2-carboxamide pharmacophore had excellent potential for further development. The minimum kinase binding pharmacophore was identified, and key examples demonstrated good selectivity for ERK5 over p38α kinase.
Optimal Fixed-Interval Integrated Guidance-Control Laws for Hit-to-Kill Missiles
National Research Council Canada - National Science Library
Menon, P. K; Sweriduk, G. D; Ohlmeyer, E. J
2003-01-01
Due to their potential for reducing weapon size while improving effectiveness, design methods for realizing hit-to-kill capabilities in missile systems are of significant research interest in the missile flight control community...
2010-06-08
... Coordinator for Health Information Technology AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of committee recommendations and invitation for public input... Coordinator for Health Information Technology (ONC). Name of Committee: HIT Standards Committee. General...
Cactus: An Introduction to Regression
Hyde, Hartley
2008-01-01
When the author first used "VisiCalc," the author thought it a very useful tool when he had the formulas. But how could he design a spreadsheet if there was no known formula for the quantities he was trying to predict? A few months later, the author relates he learned to use multiple linear regression software and suddenly it all clicked into…
Regression Models for Repairable Systems
Czech Academy of Sciences Publication Activity Database
Novák, Petr
2015-01-01
Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf
Survival analysis II: Cox regression
Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.
2011-01-01
In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the