WorldWideScience

Sample records for sampling time intervals

  1. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
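
    As a rough illustration of what such a simulation involves, the sketch below scores a synthetic event stream with momentary time sampling (MTS), partial-interval recording (PIR), and whole-interval recording (WIR) and compares each score with the true proportion of time occupied by the behavior. It is a minimal stand-in, not the authors' program; the observation length, event count, event duration, and interval duration are all arbitrary values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_events(obs_len_s=600, n_events=20, event_dur_s=5, res=0.1):
        """Occupancy vector (res-second bins) for events at random start times."""
        starts = rng.uniform(0, obs_len_s - event_dur_s, n_events)
        occ = np.zeros(int(obs_len_s / res))
        for s in starts:
            occ[int(s / res):int((s + event_dur_s) / res)] = 1
        return occ

    def score_intervals(occ, interval_s=10, res=0.1):
        """Score each observation interval with the MTS, PIR and WIR rules."""
        per = int(interval_s / res)
        chunks = occ[:(len(occ) // per) * per].reshape(-1, per)
        mts = chunks[:, -1].mean()               # behavior at the interval's last moment
        pir = (chunks.max(axis=1) > 0).mean()    # behavior at any moment in the interval
        wir = (chunks.min(axis=1) > 0).mean()    # behavior throughout the whole interval
        return mts, pir, wir

    occ = simulate_events()
    mts, pir, wir = score_intervals(occ)
    print(f"true {occ.mean():.3f}  MTS {mts:.3f}  PIR {pir:.3f}  WIR {wir:.3f}")
    ```

    Repeating the run many times while sweeping the interval and event durations would reproduce the kind of error tables the abstract describes.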

  2. Continuous time modelling with individually varying time intervals for oscillating and non-oscillating processes.

    Science.gov (United States)

    Voelkle, Manuel C; Oud, Johan H L

    2013-02-01

    When designing longitudinal studies, researchers often aim at equal intervals. In practice, however, this goal is hardly ever met, with different time intervals between assessment waves and different time intervals between individuals being more the rule than the exception. One of the reasons for the introduction of continuous time models by means of structural equation modelling has been to deal with irregularly spaced assessment waves (e.g., Oud & Delsing, 2010). In the present paper we extend the approach to individually varying time intervals for oscillating and non-oscillating processes. In addition, we show not only that equal intervals are unnecessary but also that it can be advantageous to use unequal sampling intervals, in particular when the sampling rate is low. Two examples are provided to support our arguments. In the first example we compare a continuous time model of a bivariate coupled process with varying time intervals to a standard discrete time model to illustrate the importance of accounting for the exact time intervals. In the second example the effect of different sampling intervals on estimating a damped linear oscillator is investigated by means of a Monte Carlo simulation. We conclude that it is important to account for individually varying time intervals, and encourage researchers to conceive of longitudinal studies with different time intervals within and between individuals as an opportunity rather than a problem. © 2012 The British Psychological Society.
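
    The core idea that makes unequal intervals unproblematic can be shown in a few lines: for a continuous-time drift matrix A, the discrete-time autoregressive matrix implied by an interval Δt is the matrix exponential exp(A·Δt), so every person or assessment wave can be given its exact interval. The sketch below assumes an illustrative bivariate drift matrix; it is not the authors' model or data.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Illustrative continuous-time drift matrix for a bivariate coupled process
    # (diagonal: auto-effects, off-diagonal: cross-effects); values are made up.
    A = np.array([[-0.30, 0.10],
                  [0.05, -0.20]])

    def discrete_autoregression(A, dt):
        """Discrete-time lag matrix implied by drift A for a time interval dt."""
        return expm(A * dt)

    # With individually varying intervals, each (person, wave) pair gets its own
    # transition matrix instead of one shared lag-1 matrix.
    for dt in (0.5, 1.0, 1.7, 3.0):
        print(f"dt = {dt}:\n{np.round(discrete_autoregression(A, dt), 3)}")
    ```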

  3. Relativistic rise measurements with very fine sampling intervals

    International Nuclear Information System (INIS)

    Ludlam, T.; Platner, E.D.; Polychronakos, V.A.; Lindenbaum, S.J.; Kramer, M.A.; Teramoto, Y.

    1980-01-01

    The motivation of this work was to determine whether the technique of charged particle identification via the relativistic rise in the ionization loss can be significantly improved by virtue of very small sampling intervals. A fast-sampling ADC and a longitudinal drift geometry were used to provide a large number of samples from a single drift chamber gap, achieving sampling intervals roughly 10 times smaller than any previous study. A single layer drift chamber was used, and tracks of 1 meter length were simulated by combining together samples from many identified particles in this detector. These data were used to study the resolving power for particle identification as a function of sample size, averaging technique, and the number of discrimination levels (ADC bits) used for pulse height measurements

  4. Estimating fluvial wood discharge from timelapse photography with varying sampling intervals

    Science.gov (United States)

    Anderson, N. K.

    2013-12-01

    There is recent focus on calculating wood budgets for streams and rivers to help inform management decisions, ecological studies and carbon/nutrient cycling models. Most work has measured in situ wood in temporary storage along stream banks or estimated wood inputs from banks. Little effort has been devoted to monitoring and quantifying wood in transport during high flows. This paper outlines a procedure for estimating total seasonal wood loads using non-continuous coarse interval sampling and examines differences in estimation between sampling at 1, 5, 10 and 15 minutes. Analysis is performed on wood transport for the Slave River in Northwest Territories, Canada. Relative to the 1 minute dataset, precision decreased by 23%, 46% and 60% for the 5, 10 and 15 minute datasets, respectively. Five and 10 minute sampling intervals provided unbiased equal variance estimates of 1 minute sampling, whereas 15 minute intervals were biased towards underestimation by 6%. Stratifying estimates by day and by discharge increased precision over non-stratification by 4% and 3%, respectively. Not including wood transported during ice break-up, the total minimum wood load estimated at this site is 3300 ± 800 m³ for the 2012 runoff season. The vast majority of the imprecision in total wood volumes came from variance in estimating average volume per log. Proportions and variance were compared across sample intervals using bootstrap sampling to achieve equal n: each trial was sampled at n = 100 and repeated 10,000 times, and the trial averages were then combined to obtain an estimate for each sample interval.
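
    A hedged sketch of the subsampling and bootstrap comparison described above: a 1-minute count series is thinned to 5, 10 and 15 minute intervals, the total load is scaled up from each subsample, and bootstrapping gives a variability estimate per interval. The data here are synthetic Poisson counts, not the Slave River record, and all parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic 1-minute wood counts for one day of monitoring (illustrative only).
    counts_1min = rng.poisson(lam=0.8, size=14 * 60)

    def estimate_total(counts, step):
        """Scale the mean of every `step`-th one-minute sample up to a daily total."""
        return counts[::step].mean() * len(counts)

    def bootstrap_sd(counts, step, n_boot=2000):
        sub = counts[::step]
        totals = [rng.choice(sub, size=sub.size, replace=True).mean() * len(counts)
                  for _ in range(n_boot)]
        return np.std(totals)

    for step in (1, 5, 10, 15):
        print(f"{step:>2}-min sampling: total ~ {estimate_total(counts_1min, step):6.0f} "
              f"logs, bootstrap SD ~ {bootstrap_sd(counts_1min, step):5.0f}")
    ```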

  5. Efficient Estimation for Diffusions Sampled at High Frequency Over a Fixed Time Interval

    DEFF Research Database (Denmark)

    Jakobsen, Nina Munkholt; Sørensen, Michael

    Parametric estimation for diffusion processes is considered for high frequency observations over a fixed time interval. The processes solve stochastic differential equations with an unknown parameter in the diffusion coefficient. We find easily verified conditions on approximate martingale...

  6. Technical note: Instantaneous sampling intervals validated from continuous video observation for behavioral recording of feedlot lambs.

    Science.gov (United States)

    Pullin, A N; Pairis-Garcia, M D; Campbell, B J; Campler, M R; Proudfoot, K L

    2017-11-01

    When considering methodologies for collecting behavioral data, continuous sampling provides the most complete and accurate data set whereas instantaneous sampling can provide similar results and also increase the efficiency of data collection. However, instantaneous time intervals require validation to ensure accurate estimation of the data. Therefore, the objective of this study was to validate scan sampling intervals for lambs housed in a feedlot environment. Feeding, lying, standing, drinking, locomotion, and oral manipulation were measured on 18 crossbred lambs housed in an indoor feedlot facility for 14 h (0600-2000 h). Data from continuous sampling were compared with data from instantaneous scan sampling intervals of 5, 10, 15, and 20 min using a linear regression analysis. Three criteria determined if a time interval accurately estimated behaviors: 1) R² ≥ 0.90, 2) slope not statistically different from 1 (P > 0.05), and 3) intercept not statistically different from 0 (P > 0.05). Estimations for lying behavior were accurate up to 20-min intervals, whereas feeding and standing behaviors were accurate only at 5-min intervals (i.e., met all 3 regression criteria). Drinking, locomotion, and oral manipulation demonstrated poor associations for all tested intervals. The results from this study suggest that a 5-min instantaneous sampling interval will accurately estimate lying, feeding, and standing behaviors for lambs housed in a feedlot, whereas continuous sampling is recommended for the remaining behaviors. This methodology will contribute toward the efficiency, accuracy, and transparency of future behavioral data collection in lamb behavior research.
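
    The validation logic lends itself to a short sketch: derive scan-sample estimates from a continuous record at each candidate interval, regress the scan estimates on the continuous values across animals, and apply the three criteria (assuming the first criterion is the regression R²). The behaviour record below is synthetic and the per-minute lying probability is arbitrary; this is not the study's dataset.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Synthetic per-minute record (True = lying) for 18 animals over 14 h.
    behaviour = rng.random((18, 14 * 60)) < 0.55

    def validate_interval(behaviour, interval_min):
        """Regress scan-sample estimates on continuous estimates across animals."""
        continuous = behaviour.mean(axis=1) * 100                # % of time observed
        scans = behaviour[:, ::interval_min].mean(axis=1) * 100  # instantaneous scans
        res = stats.linregress(continuous, scans)
        df = len(continuous) - 2
        p_slope = 2 * stats.t.sf(abs((res.slope - 1.0) / res.stderr), df)
        p_icpt = 2 * stats.t.sf(abs(res.intercept / res.intercept_stderr), df)
        ok = res.rvalue**2 >= 0.90 and p_slope > 0.05 and p_icpt > 0.05
        return res.rvalue**2, p_slope, p_icpt, ok

    for interval in (5, 10, 15, 20):
        r2, p_s, p_i, ok = validate_interval(behaviour, interval)
        print(f"{interval:>2}-min scans: R2={r2:.2f}, p(slope=1)={p_s:.2f}, "
              f"p(intercept=0)={p_i:.2f}, valid={ok}")
    ```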

  7. Comparing interval estimates for small sample ordinal CFA models.

    Science.gov (United States)

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively than negatively biased, that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the nature of statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading.

  8. Estimation of individual reference intervals in small sample sizes

    DEFF Research Database (Denmark)

    Hansen, Ase Marie; Garde, Anne Helene; Eller, Nanna Hurwitz

    2007-01-01

    In occupational health studies, the study groups most often comprise healthy subjects performing their work. Sampling is often planned in the most practical way, e.g., sampling of blood in the morning at the work site just after the work starts. Optimal use of reference intervals requires...... from various variables such as gender, age, BMI, alcohol, smoking, and menopause. The reference intervals were compared to reference intervals calculated using IFCC recommendations. Where comparable, the IFCC calculated reference intervals had a wider range compared to the variance component models...

  9. Timing intervals using population synchrony and spike timing dependent plasticity

    Directory of Open Access Journals (Sweden)

    Wei Xu

    2016-12-01

    Full Text Available We present a computational model by which ensembles of regularly spiking neurons can encode different time intervals through synchronous firing. We show that a neuron responding to a large population of convergent inputs has the potential to learn to produce an appropriately-timed output via spike-time dependent plasticity. We explain why temporal variability of this population synchrony increases with increasing time intervals. We also show that the scalar property of timing and its violation at short intervals can be explained by the spike-wise accumulation of jitter in the inter-spike intervals of timing neurons. We explore how the challenge of encoding longer time intervals can be overcome and conclude that this may involve a switch to a different population of neurons with lower firing rate, with the added effect of producing an earlier bias in response. Experimental data on human timing performance show features in agreement with the model’s output.

  10. Intact interval timing in circadian CLOCK mutants.

    Science.gov (United States)

    Cordes, Sara; Gallistel, C R

    2008-08-28

    While progress has been made in determining the molecular basis for the circadian clock, the mechanism by which mammalian brains time intervals measured in seconds to minutes remains a mystery. An obvious question is whether the interval-timing mechanism shares molecular machinery with the circadian timing mechanism. In the current study, we trained circadian CLOCK +/- and -/- mutant male mice in a peak-interval procedure with 10 and 20-s criteria. The mutant mice were more active than their wild-type littermates, but there were no reliable deficits in the accuracy or precision of their timing as compared with wild-type littermates. This suggests that expression of the CLOCK protein is not necessary for normal interval timing.

  11. Department of Defense Precise Time and Time Interval program improvement plan

    Science.gov (United States)

    Bowser, J. R.

    1981-01-01

    The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users are presented.

  12. Time-to-code converter with selection of time intervals on duration

    International Nuclear Information System (INIS)

    Atanasov, I.Kh.; Rusanov, I.R.

    2001-01-01

    Identification of elementary particles on the basis of time-of-flight is an important approach in the preliminary selection procedure. The paper describes a time-to-code converter with preliminary selection of the measured time intervals by duration. It consists of a time-to-amplitude converter, an analog-to-digital converter, a unit for selecting time intervals by duration, a total-reset unit and a CAMAC command decoder. The time-to-code converter makes it possible to measure time intervals with 100 ns accuracy within a 0-100 ns range. The output code capacity is 10 bits and the selection time is 50 ns.

  13. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    Science.gov (United States)

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  14. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Directory of Open Access Journals (Sweden)

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA systems and the establishment of district meter areas (DMAs. Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
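
    A minimal sketch of the general idea, not the authors' algorithm: a scalar random-walk Kalman filter tracks DMA inflow, the normalized residual flags possible bursts, and the next sampling interval is shortened when the residual is large and lengthened when the signal is quiet. The thresholds, noise variances and the synthetic flow series below are all assumptions.

    ```python
    import numpy as np

    def adaptive_kf_monitor(flow, q=0.5, r=4.0, z_alarm=3.0, z_quiet=1.0,
                            dt_min=1, dt_max=15):
        """Scalar random-walk Kalman filter over a per-minute flow series.
        The sampling interval shrinks when the normalized residual is large
        (possible burst) and grows when the signal is quiet."""
        x, p = flow[0], 1.0                      # state estimate and its variance
        t, dt, alarms = 0, dt_min, []
        while t < len(flow):
            p += q * dt                          # predict (random-walk state)
            innov = flow[t] - x
            s = p + r                            # innovation variance
            z = abs(innov) / np.sqrt(s)          # normalized residual
            k = p / s                            # Kalman gain
            x, p = x + k * innov, (1 - k) * p    # update
            if z > z_alarm:
                alarms.append(t)
                dt = dt_min                      # sample densely near a suspected burst
            elif z < z_quiet:
                dt = min(dt * 2, dt_max)         # relax when quiet
            t += dt
        return alarms

    # Synthetic sinusoidal DMA flow with a step increase (burst) at minute 900.
    rng = np.random.default_rng(3)
    minutes = np.arange(1440)
    flow = 30 + 10 * np.sin(2 * np.pi * minutes / 1440) + rng.normal(0, 1, 1440)
    flow[900:] += 12
    print("alarm minutes:", adaptive_kf_monitor(flow)[:5])
    ```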

  15. Practical reporting times for environmental samples

    International Nuclear Information System (INIS)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4 degrees C. The American Society for Testing Materials (ASTM) uses a more technical definition that the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time regardless of sample matrices and storage conditions.

  16. Practical reporting times for environmental samples

    Energy Technology Data Exchange (ETDEWEB)

    Bayne, C.K.; Schmoyer, D.D.; Jenkins, R.A.

    1993-02-01

    Preanalytical holding times for environmental samples are specified because chemical and physical characteristics may change between sampling and chemical analysis. For example, the Federal Register prescribes a preanalytical holding time of 14 days for volatile organic compounds in soil stored at 4°C. The American Society for Testing Materials (ASTM) uses a more technical definition that the preanalytical holding time is the day when the analyte concentration for an environmental sample falls below the lower 99% confidence interval on the analyte concentration at day zero. This study reviews various holding time definitions and suggests a new preanalytical holding time approach using acceptable error rates for measuring an environmental analyte. This practical reporting time (PRT) approach has been applied to nineteen volatile organic compounds and four explosives in three environmental soil samples. A PRT nomograph of error rates has been developed to estimate the consequences of missing a preanalytical holding time. This nomograph can be applied to a large class of analytes with concentrations that decay linearly or exponentially with time regardless of sample matrices and storage conditions.

  17. Across-province standardization and comparative analysis of time-to-care intervals for cancer

    Directory of Open Access Journals (Sweden)

    Nugent Zoann

    2007-10-01

    and appropriate inclusion criteria that were robust across the agencies that did not result in an overly selective sample of patients to be compared. Comparisons of data across three provinces of the selected time-to-care intervals identified several important differences related to treatment and access that require further attention. Expanding this collaboration across Canada would facilitate improvement of and equitable access to quality cancer care at a national level.

  18. An Improvement to Interval Estimation for Small Samples

    Directory of Open Access Journals (Sweden)

    SUN Hui-Ling

    2017-02-01

    Full Text Available Because it is difficult and complex to determine the probability distribution of small samples, it is improper to use traditional probability theory for parameter estimation with small samples. The Bayes Bootstrap method is commonly used in engineering projects, but it has its own limitations. In this article an improvement to the Bayes Bootstrap method is given: the method extends the sample size by numerical simulation without changing the circumstances reflected in the original small sample, and it can give accurate interval estimates for small samples. Finally, Monte Carlo simulation is used to model specific small-sample problems, and the effectiveness and practicability of the improved Bootstrap method are demonstrated.
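
    For context, a plain Bayesian bootstrap (Dirichlet-weighted resampling) percentile interval for a small sample looks like the sketch below; it is the baseline that such improvements build on, not the improved method described in the article, and the sample values are invented.

    ```python
    import numpy as np

    def bayesian_bootstrap_interval(sample, n_rep=10_000, level=0.95, seed=0):
        """Bayesian bootstrap percentile interval for the mean of a small sample."""
        rng = np.random.default_rng(seed)
        x = np.asarray(sample, dtype=float)
        # One Dirichlet(1,...,1) weight vector over the observations per replicate.
        weights = rng.dirichlet(np.ones(x.size), size=n_rep)
        means = weights @ x
        tail = (1 - level) / 2 * 100
        return tuple(np.percentile(means, [tail, 100 - tail]))

    small_sample = [9.8, 10.4, 10.1, 9.6, 10.9, 10.2, 9.9]
    print(bayesian_bootstrap_interval(small_sample))
    ```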

  19. Impact of sampling interval in training data acquisition on intrafractional predictive accuracy of indirect dynamic tumor-tracking radiotherapy.

    Science.gov (United States)

    Mukumoto, Nobutaka; Nakamura, Mitsuhiro; Akimoto, Mami; Miyabe, Yuki; Yokota, Kenji; Matsuo, Yukinori; Mizowaki, Takashi; Hiraoka, Masahiro

    2017-08-01

    To explore the effect of sampling interval of training data acquisition on the intrafractional prediction error of surrogate signal-based dynamic tumor-tracking using a gimbal-mounted linac. Twenty pairs of respiratory motions were acquired from 20 patients (ten lung, five liver, and five pancreatic cancer patients) who underwent dynamic tumor-tracking with the Vero4DRT. First, respiratory motions were acquired as training data for an initial construction of the prediction model before the irradiation. Next, additional respiratory motions were acquired for an update of the prediction model due to the change of the respiratory pattern during the irradiation. The time elapsed prior to the second acquisition of the respiratory motion was 12.6 ± 3.1 min. A four-axis moving phantom reproduced patients' three dimensional (3D) target motions and one dimensional surrogate motions. To predict the future internal target motion from the external surrogate motion, prediction models were constructed by minimizing residual prediction errors for training data acquired at 80 and 320 ms sampling intervals for 20 s, and at 500, 1,000, and 2,000 ms sampling intervals for 60 s using orthogonal kV x-ray imaging systems. The accuracies of prediction models trained with various sampling intervals were estimated based on training data with each sampling interval during the training process. The intrafractional prediction errors for various prediction models were then calculated on intrafractional monitoring images taken for 30 s at a constant sampling interval of 500 ms to fairly evaluate the prediction accuracy for the same motion pattern. In addition, the first respiratory motion was used for the training and the second respiratory motion was used for the evaluation of the intrafractional prediction errors for the changed respiratory motion to evaluate the robustness of the prediction models. The training error of the prediction model was 1.7 ± 0.7 mm in 3D for all sampling

  20. Delay-Dependent Guaranteed Cost Control of an Interval System with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Xiao Min

    2009-01-01

    Full Text Available This paper concerns the problem of the delay-dependent robust stability and guaranteed cost control for an interval system with time-varying delay. The interval system with matrix factorization is provided and leads to less conservative conclusions than solving a square root. The time-varying delay is assumed to belong to an interval and the derivative of the interval time-varying delay is not a restriction, which allows a fast time-varying delay; also its applicability is broad. Based on the Lyapunov-Krasovskii approach, a delay-dependent criterion for the existence of a state feedback controller, which guarantees the closed-loop system stability, the upper bound of the cost function, and the disturbance attenuation level for all admissible uncertainties as well as external perturbations, is proposed in terms of linear matrix inequalities (LMIs). The criterion is derived by free weighting matrices that can reduce the conservatism. The effectiveness has been verified in a numerical example and the computed results are presented to validate the proposed design method.

  1. Effects of Spatial Sampling Interval on Roughness Parameters and Microwave Backscatter over Agricultural Soil Surfaces

    Directory of Open Access Journals (Sweden)

    Matías Ernesto Barber

    2016-06-01

    Full Text Available The spatial sampling interval, as related to the ability to digitize a soil profile with a certain number of features per unit length, depends on the profiling technique itself. From a variety of profiling techniques, roughness parameters are estimated at different sampling intervals. Since soil profiles have continuous spectral components, it is clear that roughness parameters are influenced by the sampling interval of the measurement device employed. In this work, we address the question of which sampling interval profiles need to be measured at to accurately account for the microwave response of agricultural surfaces. For this purpose, a 2-D laser profiler was built and used to measure surface soil roughness at field scale over agricultural sites in Argentina. Sampling intervals ranged from large (50 mm) to small ones (1 mm), with several intermediate values. Large- and intermediate-sampling-interval profiles were synthetically derived from nominal, 1 mm ones. With these data, the effect of sampling-interval-dependent roughness parameters on backscatter response was assessed using the theoretical backscatter model IEM2M. Simulations demonstrated that variations of roughness parameters depended on the working wavelength and were less important at L-band than at C- or X-band. In any case, an underestimation of the backscattering coefficient of about 1-4 dB was observed at larger sampling intervals. As a general rule, a sampling interval of 15 mm can be recommended for L-band and 5 mm for C-band.
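
    As a rough illustration of why roughness parameters drift with the sampling interval, the sketch below resamples a synthetic 1 mm height profile at coarser steps and recomputes the RMS height s and the correlation length l (first lag where the autocorrelation falls below 1/e). The profile is simulated, not field data, and the numbers are only indicative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic height profile sampled at 1 mm: smoothed Gaussian noise, 2 m long.
    dx0 = 1.0                                              # mm
    z = 10 * np.convolve(rng.normal(0, 1, 2000), np.ones(25) / 25, mode="same")

    def roughness(z, dx):
        """RMS height and correlation length (first lag with autocorrelation < 1/e)."""
        z = z - z.mean()
        s = z.std()
        ac = np.correlate(z, z, mode="full")[z.size - 1:]
        ac /= ac[0]
        below = np.where(ac < 1 / np.e)[0]
        corr_len = below[0] * dx if below.size else np.nan
        return s, corr_len

    for step in (1, 5, 15, 50):                            # resample the 1 mm profile
        s, l = roughness(z[::step], dx0 * step)
        print(f"sampling interval {step:>2} mm: s = {s:4.2f} mm, l = {l:5.1f} mm")
    ```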

  2. The Gas Sampling Interval Effect on V˙O2peak Is Independent of Exercise Protocol.

    Science.gov (United States)

    Scheadler, Cory M; Garver, Matthew J; Hanson, Nicholas J

    2017-09-01

    There is a plethora of gas sampling intervals available during cardiopulmonary exercise testing to measure peak oxygen consumption (V˙O2peak). Different intervals can lead to altered V˙O2peak. Whether differences are affected by the exercise protocol or subject sample is not clear. The purpose of this investigation was to determine whether V˙O2peak differed because of the manipulation of sampling intervals and whether differences were independent of the protocol and subject sample. The first subject sample (24 ± 3 yr; V˙O2peak via 15-breath moving averages: 56.2 ± 6.8 mL·kg⁻¹·min⁻¹) completed the Bruce and the self-paced V˙O2max protocols. The second subject sample (21.9 ± 2.7 yr; V˙O2peak via 15-breath moving averages: 54.2 ± 8.0 mL·kg⁻¹·min⁻¹) completed the Bruce and the modified Astrand protocols. V˙O2peak was identified using five sampling intervals: 15-s block averages, 30-s block averages, 15-breath block averages, 15-breath moving averages, and 30-s block averages aligned to the end of exercise. Differences in V˙O2peak between intervals were determined using repeated-measures ANOVAs. The influence of subject sample on the sampling effect was determined using independent t-tests. There was a significant main effect of sampling interval on V˙O2peak in both subject samples (first sample: Bruce and self-paced V˙O2max protocols; second sample: Bruce and modified Astrand protocols). V˙O2peak across sampling intervals followed a similar pattern for each protocol and subject sample, with the 15-breath moving average presenting the highest V˙O2peak. The effect of manipulating gas sampling intervals on V˙O2peak appears to be protocol and sample independent. These findings highlight our recommendation that the clinical and scientific community request and report the sampling interval whenever metabolic data are presented. The standardization of reporting would assist in the comparison of V˙O2peak.
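
    The sampling strategies compared above are easy to express on breath-by-breath data; the sketch below computes four of them (the end-aligned 30-s blocks are omitted for brevity) on a synthetic ramp test and takes the maximum of each series as V˙O2peak. The data and parameter values are illustrative only, not the study's measurements.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)

    # Synthetic breath-by-breath data for a 10-min ramp test: one value per breath.
    t = np.cumsum(rng.uniform(1.5, 3.0, 300))                    # breath times (s)
    vo2 = np.linspace(15, 55, 300) + rng.normal(0, 3, 300)       # mL/kg/min
    breaths = pd.Series(vo2, index=pd.to_timedelta(t, unit="s"))

    vo2peak = {
        "15-s blocks":      breaths.resample("15s").mean().max(),
        "30-s blocks":      breaths.resample("30s").mean().max(),
        "15-breath blocks": vo2[: len(vo2) // 15 * 15].reshape(-1, 15).mean(axis=1).max(),
        "15-breath moving": pd.Series(vo2).rolling(15).mean().max(),
    }
    for name, peak in vo2peak.items():
        print(f"{name:>16}: {peak:5.1f} mL·kg⁻¹·min⁻¹")
    ```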

  3. Reviewing interval cancers: Time well spent?

    International Nuclear Information System (INIS)

    Gower-Thomas, Kate; Fielder, Hilary M.P.; Branston, Lucy; Greening, Sarah; Beer, Helen; Rogers, Cerilan

    2002-01-01

    OBJECTIVES: To categorize interval cancers, and thus identify false-negatives, following prevalent and incident screens in the Welsh breast screening programme. SETTING: Breast Test Wales (BTW) Llandudno, Cardiff and Swansea breast screening units. METHODS: Five hundred and sixty interval breast cancers identified following negative mammographic screening between 1989 and 1997 were reviewed by eight screening radiologists. The blind review was achieved by mixing the screening films of women who subsequently developed an interval cancer with screen negative films of women who did not develop cancer, in a ratio of 4:1. Another radiologist used patients' symptomatic films to record a reference against which the reviewers' reports of the screening films were compared. Interval cancers were categorized as 'true', 'occult', 'false-negative' or 'unclassified' interval cancers or interval cancers with minimal signs, based on the National Health Service breast screening programme (NHSBSP) guidelines. RESULTS: Of the classifiable interval films, 32% were false-negatives, 55% were true intervals and 12% occult. The proportion of false-negatives following incident screens was half that following prevalent screens (P = 0.004). Forty percent of the seed films were recalled by the panel. CONCLUSIONS: Low false-negative interval cancer rates following incident screens (18%) versus prevalent screens (36%) suggest that lower cancer detection rates at incident screens may have resulted from fewer cancers than expected being present, rather than from a failure to detect tumours. The panel method for categorizing interval cancers has significant flaws, as the results vary markedly with different protocols, and it is no more accurate than other, quicker and more timely methods. Gower-Thomas, K. et al. (2002)

  4. Does the time interval between antimüllerian hormone serum sampling and initiation of ovarian stimulation affect its predictive ability in in vitro fertilization-intracytoplasmic sperm injection cycles with a gonadotropin-releasing hormone antagonist?

    DEFF Research Database (Denmark)

    Polyzos, Nikolaos P; Nelson, Scott M; Stoop, Dominic

    2013-01-01

    To investigate whether the time interval between serum antimüllerian hormone (AMH) sampling and initiation of ovarian stimulation for in vitro fertilization-intracytoplasmic sperm injection (IVF-ICSI) may affect the predictive ability of the marker for low and excessive ovarian response.

  5. Interval timing in genetically modified mice: a simple paradigm

    OpenAIRE

    Balci, F.; Papachristos, E. B.; Gallistel, C. R.; Brunner, D.; Gibson, J.; Shumyatsky, G. P.

    2007-01-01

    We describe a behavioral screen for the quantitative study of interval timing and interval memory in mice. Mice learn to switch from a short-latency feeding station to a long-latency station when the short latency has passed without a feeding. The psychometric function is the cumulative distribution of switch latencies. Its median measures timing accuracy and its interquartile interval measures timing precision. Next, using this behavioral paradigm, we have examined mice with a gene knockout ...

  6. A model of interval timing by neural integration.

    Science.gov (United States)

    Simen, Patrick; Balci, Fuat; de Souza, Laura; Cohen, Jonathan D; Holmes, Philip

    2011-06-22

    We show that simple assumptions about neural processing lead to a model of interval timing as a temporal integration process, in which a noisy firing-rate representation of time rises linearly on average toward a response threshold over the course of an interval. Our assumptions include: that neural spike trains are approximately independent Poisson processes, that correlations among them can be largely cancelled by balancing excitation and inhibition, that neural populations can act as integrators, and that the objective of timed behavior is maximal accuracy and minimal variance. The model accounts for a variety of physiological and behavioral findings in rodents, monkeys, and humans, including ramping firing rates between the onset of reward-predicting cues and the receipt of delayed rewards, and universally scale-invariant response time distributions in interval timing tasks. It furthermore makes specific, well-supported predictions about the skewness of these distributions, a feature of timing data that is usually ignored. The model also incorporates a rapid (potentially one-shot) duration-learning procedure. Human behavioral data support the learning rule's predictions regarding learning speed in sequences of timed responses. These results suggest that simple, integration-based models should play as prominent a role in interval timing theory as they do in theories of perceptual decision making, and that a common neural mechanism may underlie both types of behavior.
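
    A hedged sketch of the temporal-integration mechanism the model rests on: counts from balanced excitatory and inhibitory Poisson inputs are accumulated, a small rate imbalance sets the ramp slope, and the first threshold crossing is the timed response. The rates, weight, and threshold below are arbitrary, and the sketch does not reproduce the full model (for example, the one-shot duration-learning rule).

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def timed_responses(target_s, n_trials=300, r0=2000.0, w=0.001, dt=0.002, theta=1.0):
        """First-passage times of a spike-count accumulator: balanced Poisson
        excitation/inhibition plus a small rate difference chosen so that the
        threshold `theta` is reached at `target_s` on average."""
        delta = theta / (target_s * w)                        # required net rate (Hz)
        n_steps = int(2 * target_s / dt)
        exc = rng.poisson((r0 + delta / 2) * dt, (n_trials, n_steps))
        inh = rng.poisson((r0 - delta / 2) * dt, (n_trials, n_steps))
        x = np.cumsum(w * (exc - inh), axis=1)                # noisy linear ramp
        crossed = x >= theta
        hit = crossed.any(axis=1)
        return (np.argmax(crossed, axis=1)[hit] + 1) * dt

    for target in (2.0, 8.0):
        rts = timed_responses(target)
        print(f"target {target} s: mean response {rts.mean():.2f} s, SD {rts.std():.2f} s")
    ```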

  7. A new variable interval schedule with constant hazard rate and finite time range.

    Science.gov (United States)

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2 T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2 T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
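
    The schedule can be checked empirically in a few lines. The sketch below assumes one reinforcement rule consistent with the description (reinforce a trial of duration d with probability 1 - d/(2T), i.e., linear in d) and estimates the cumulative hazard of reinforcement with a Nelson-Aalen estimator, treating unreinforced trial endings as censoring; a constant hazard appears as a straight line of slope 1/(2T). The specific rule is an assumption, not necessarily the authors' exact formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    T, n = 30.0, 100_000

    # Trial durations and a reinforcement probability linear in duration (assumed form).
    d = rng.uniform(0, 2 * T, n)
    reinforced = rng.random(n) < (1 - d / (2 * T))

    # Nelson-Aalen cumulative hazard of reinforcement; unreinforced endings are censoring.
    order = np.argsort(d)
    d_sorted, r_sorted = d[order], reinforced[order]
    at_risk = n - np.arange(n)            # trials still running just before each ending
    cum_hazard = np.cumsum(r_sorted / at_risk)

    for t in (10, 20, 30, 40, 50):
        i = np.searchsorted(d_sorted, t)
        print(f"t = {t:>2} s: H(t) = {cum_hazard[i - 1]:.3f}, "
              f"constant-hazard value {t / (2 * T):.3f}")
    ```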

  8. Time interval approach to the pulsed neutron logging method

    International Nuclear Information System (INIS)

    Zhao Jingwu; Su Weining

    1994-01-01

    The time interval of neighbouring neutrons emitted from a steady state neutron source can be treated as that from a time-dependent neutron source. In the rock space, the neutron flux is given by the neutron diffusion equation and is composed of an infinite number of terms, each consisting of two die-away curves. The delay action is discussed and used to measure the time interval with only one detector in the experiment. Nuclear reactions with the time distribution due to different types of radiations observed in the neutron well-logging methods are presented with a view to obtaining the rock nuclear parameters from the time interval technique.

  9. Delay-Dependent Guaranteed Cost H∞ Control of an Interval System with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Zhongke Shi

    2009-01-01

    Full Text Available This paper concerns the problem of the delay-dependent robust stability and guaranteed cost H∞ control for an interval system with time-varying delay. The interval system with matrix factorization is provided and leads to less conservative conclusions than solving a square root. The time-varying delay is assumed to belong to an interval and the derivative of the interval time-varying delay is not a restriction, which allows a fast time-varying delay; also its applicability is broad. Based on the Lyapunov-Krasovskii approach, a delay-dependent criterion for the existence of a state feedback controller, which guarantees the closed-loop system stability, the upper bound of the cost function, and the disturbance attenuation level for all admissible uncertainties as well as external perturbations, is proposed in terms of linear matrix inequalities (LMIs). The criterion is derived by free weighting matrices that can reduce the conservatism. The effectiveness has been verified in a numerical example and the computed results are presented to validate the proposed design method.

  10. Discrete-time optimal control and games on large intervals

    CERN Document Server

    Zaslavski, Alexander J

    2017-01-01

    Devoted to the structure of approximate solutions of discrete-time optimal control problems and approximate solutions of dynamic discrete-time two-player zero-sum games, this book presents results on properties of approximate solutions in an interval that are independent of the interval's length, for all sufficiently large intervals. Results concerning the so-called turnpike property of optimal control problems and zero-sum games in the regions close to the endpoints of the time intervals are the main focus of this book. The description of the structure of approximate solutions on sufficiently large intervals and its stability will interest graduate students and mathematicians in optimal control and game theory, engineering, and economics. This book begins with a brief overview and moves on to analyze the structure of approximate solutions of autonomous nonconcave discrete-time optimal control Lagrange problems. Next, the structures of approximate solutions of autonomous discrete-time optimal control problems that are discret...

  11. Specifying real-time systems with interval logic

    Science.gov (United States)

    Rushby, John

    1988-01-01

    Pure temporal logic makes no reference to time. An interval temporal logic and an extension to that logic which includes real time constraints are described. The application of this logic is demonstrated by giving a specification for the well-known lift (elevator) example. It is shown how interval logic can be extended to include a notion of process. How the specification language and verification environment of EHDM could be enhanced to support this logic is described. A specification of the alternating bit protocol in this extended version of the specification language of EHDM is given.

  12. Discrete time interval measurement system: fundamentals, resolution and errors in the measurement of angular vibrations

    International Nuclear Information System (INIS)

    Gómez de León, F C; Meroño Pérez, P A

    2010-01-01

    The traditional method for measuring the velocity and the angular vibration in the shaft of rotating machines using incremental encoders is based on counting the pulses at given time intervals. This method is generically called the time interval measurement system (TIMS). A variant of this method that we have developed in this work consists of measuring the corresponding time of each pulse from the encoder and sampling the signal by means of an A/D converter as if it were an analog signal, that is to say, in discrete time. For this reason, we have named this method the discrete time interval measurement system (DTIMS). This measurement system provides a substantial improvement in the precision and frequency resolution compared with the traditional method of counting pulses. In addition, this method permits modification of the width of some pulses in order to obtain a mark-phase on every revolution. This paper explains the theoretical fundamentals of the DTIMS and its application for measuring the angular vibrations of rotating machines. It also presents the required relationship between the sampling rate of the signal, the number of pulses of the encoder and the rotating velocity in order to obtain the required resolution and to delimit the methodological errors in the measurement.
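
    A hedged sketch of the underlying computation (not the authors' hardware or code): given the recorded arrival time of every encoder pulse, the instantaneous angular velocity follows from the fixed angular step divided by successive pulse intervals, and interpolating onto a uniform time grid yields the angular-vibration waveform. The encoder resolution, shaft speed and vibration parameters below are invented.

    ```python
    import numpy as np

    N, f0, a, fv = 1024, 25.0, 0.002, 180.0   # pulses/rev, speed (Hz), vibration amplitude (rad) and frequency (Hz)

    # Simulated pulse arrival times for theta(t) = 2*pi*f0*t + a*sin(2*pi*fv*t),
    # inverted to first order around the constant-speed solution.
    pulse_angles = np.arange(0, 50 * 2 * np.pi, 2 * np.pi / N)   # 50 revolutions
    t0 = pulse_angles / (2 * np.pi * f0)
    pulse_times = t0 - a * np.sin(2 * np.pi * fv * t0) / (2 * np.pi * f0)

    # DTIMS-style processing: fixed angular step over the measured pulse interval.
    dt = np.diff(pulse_times)
    omega = (2 * np.pi / N) / dt                                 # rad/s between pulses
    t_mid = pulse_times[:-1] + dt / 2

    # Resample on a uniform grid to obtain the angular-vibration signal.
    t_uni = np.arange(t_mid[0], t_mid[-1], 1 / (N * f0))
    omega_uni = np.interp(t_uni, t_mid, omega)
    print(f"mean speed {omega_uni.mean() / (2 * np.pi):.2f} Hz, "
          f"vibration peak-to-peak {np.ptp(omega_uni):.2f} rad/s")
    ```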

  13. Unpacking a time interval lengthens its perceived temporal distance

    Directory of Open Access Journals (Sweden)

    Yang eLiu

    2014-11-01

    Full Text Available In quantity estimation, people often perceive that the whole is less than the sum of its parts. The current study investigated such an unpacking effect in temporal distance judgment. Our results showed that participants in the unpacked condition judged a given time interval as longer than did those in the packed condition, even though the time interval was kept constant between the two conditions. Furthermore, this unpacking effect persists regardless of the unpacking method employed. Results suggest that unpacking a time interval may be a good strategy for lengthening its perceived temporal distance.

  14. Changes in crash risk following re-timing of traffic signal change intervals.

    Science.gov (United States)

    Retting, Richard A; Chapline, Janella F; Williams, Allan F

    2002-03-01

    More than 1 million motor vehicle crashes occur annually at signalized intersections in the USA. The principal method used to prevent crashes associated with routine changes in signal indications is employment of a traffic signal change interval--a brief yellow and all-red period that follows the green indication. No universal practice exists for selecting the duration of change intervals, and little is known about the influence of the duration of the change interval on crash risk. The purpose of this study was to estimate potential crash effects of modifying the duration of traffic signal change intervals to conform with values associated with a proposed recommended practice published by the Institute of Transportation Engineers. A sample of 122 intersections was identified and randomly assigned to experimental and control groups. Of 51 eligible experimental sites, 40 (78%) needed signal timing changes. For the 3-year period following implementation of signal timing changes, there was an 8% reduction in reportable crashes at experimental sites relative to those occurring at control sites (P = 0.08). For injury crashes, a 12% reduction at experimental sites relative to those occurring at control sites was found (P = 0.03). Pedestrian and bicycle crashes at experimental sites decreased 37% (P = 0.03) relative to controls. Given these results and the relatively low cost of re-timing traffic signals, modifying the duration of traffic signal change intervals to conform with values associated with the Institute of Transportation Engineers' proposed recommended practice should be strongly considered by transportation agencies to reduce the frequency of urban motor vehicle crashes.

  15. Traces of times past : Representations of temporal intervals in memory

    NARCIS (Netherlands)

    Taatgen, Niels; van Rijn, Hedderik

    2011-01-01

    Theories of time perception typically assume that some sort of memory represents time intervals. This memory component is typically underdeveloped in theories of time perception. Following earlier work that suggested that representations of different time intervals contaminate each other (Grondin,

  16. Learned Interval Time Facilitates Associate Memory Retrieval

    Science.gov (United States)

    van de Ven, Vincent; Kochs, Sarah; Smulders, Fren; De Weerd, Peter

    2017-01-01

    The extent to which time is represented in memory remains underinvestigated. We designed a time paired associate task (TPAT) in which participants implicitly learned cue-time-target associations between cue-target pairs and specific cue-target intervals. During subsequent memory testing, participants showed increased accuracy of identifying…

  17. Interval-Censored Time-to-Event Data Methods and Applications

    CERN Document Server

    Chen, Ding-Geng

    2012-01-01

    Interval-Censored Time-to-Event Data: Methods and Applications collects the most recent techniques, models, and computational tools for interval-censored time-to-event data. Top biostatisticians from academia, biopharmaceutical industries, and government agencies discuss how these advances are impacting clinical trials and biomedical research. Divided into three parts, the book begins with an overview of interval-censored data modeling, including nonparametric estimation, survival functions, regression analysis, multivariate data analysis, competing risks analysis, and other models for interva

  18. Early diastolic time intervals during hypertensive pregnancy.

    Science.gov (United States)

    Spinelli, L; Ferro, G; Nappi, C; Farace, M J; Talarico, G; Cinquegrana, G; Condorelli, M

    1987-10-01

    Early diastolic time intervals have been assessed by means of the echopolycardiographic method in 17 pregnant women who developed hypertension during pregnancy (HP) and in 14 normal pregnant women (N). Systolic time intervals (STI), stroke volume (SV), ejection fraction (EF), and mean velocity of myocardial fiber shortening (VCF) were also evaluated. Recordings were performed in the left lateral decubitus (LLD) and then in the supine decubitus (SD). In LLD, isovolumic relaxation period (IRP) was prolonged in the hypertensive pregnant women compared with normal pregnant women (HP 51 +/- 12.5 ms, N 32.4 +/- 15 ms, p less than 0.05), whereas time of the mitral valve maximum opening (DE) was not different between the groups. There was no difference in SV, EF, and mean VCF, whereas STI showed only a significant (p less than 0.05) lengthening of pre-ejection period (PEP) in HP. When the subjects shifted from the left lateral to the supine decubitus position, left ventricular ejection time index (LVETi) and SV decreased significantly (p less than 0.05) in both normotensive and hypertensive pregnant women. IRP and PEP lengthened significantly (p less than 0.05) only in normals, whereas they were unchanged in HP. DE time did not vary in either group. In conclusion, hypertension superimposed on pregnancy induces lengthening of IRP, as well as of PEP, and minimizes the effects of the postural changes in preload on the above-mentioned time intervals.

  19. Timing of multiple overlapping intervals : How many clocks do we have?

    NARCIS (Netherlands)

    van Rijn, Hedderik; Taatgen, Niels A.

    2008-01-01

    Humans perceive and reproduce short intervals of time (e.g. 1-60 s) relatively accurately, and are capable of timing multiple overlapping intervals if these intervals are presented in different modalities [e.g., Rousseau, L., & Rousseau, RL (1996). Stop-reaction time and the internal clock.

  20. The Time Is Up: Compression of Visual Time Interval Estimations of Bimodal Aperiodic Patterns

    Science.gov (United States)

    Duarte, Fabiola; Lemus, Luis

    2017-01-01

    The ability to estimate time intervals subserves many of our behaviors and perceptual experiences. However, it is not clear how aperiodic (AP) stimuli affect our perception of time intervals across sensory modalities. To address this question, we evaluated the human capacity to discriminate between two acoustic (A), visual (V) or audiovisual (AV) time intervals of trains of scattered pulses. We first measured the periodicity of those stimuli and then sought for correlations with the accuracy and reaction times (RTs) of the subjects. We found that, for all time intervals tested in our experiment, the visual system consistently perceived AP stimuli as being shorter than the periodic (P) ones. In contrast, such a compression phenomenon was not apparent during auditory trials. Our conclusions are: first, the subjects exposed to P stimuli are more likely to measure their durations accurately. Second, perceptual time compression occurs for AP visual stimuli. Lastly, AV discriminations are determined by A dominance rather than by AV enhancement. PMID:28848406

  1. Monitoring molecular interactions using photon arrival-time interval distribution analysis

    Science.gov (United States)

    Laurence, Ted A [Livermore, CA]; Weiss, Shimon [Los Angeles, CA]

    2009-10-06

    A method for analyzing/monitoring the properties of species that are labeled with fluorophores. A detector is used to detect photons emitted from species that are labeled with one or more fluorophores and located in a confocal detection volume. The arrival time of each of the photons is determined. The interval of time between various photon pairs is then determined to provide photon pair intervals. The number of photons that have arrival times within the photon pair intervals is also determined. The photon pair intervals are then used in combination with the corresponding counts of intervening photons to analyze properties and interactions of the molecules including brightness, concentration, coincidence and transit time. The method can be used for analyzing single photon streams and multiple photon streams.

  2. The effects of varying sampling intervals on the growth and survival ...

    African Journals Online (AJOL)

    Four different sampling intervals were investigated during a six-week outdoor nursery management of Heterobranchus longifilis (Valenciennes, 1840) fry in outdoor concrete tanks in order to determine the most suitable sampling regime for maximum productivity in terms of optimum growth and survival of hatchlings and ...

  3. Reference Intervals for Urinary Cotinine Levels and the Influence of Sampling Time and Other Predictors on Its Excretion Among Italian Schoolchildren

    Directory of Open Access Journals (Sweden)

    Carmela Protano

    2018-04-01

    Full Text Available (1) Background: Environmental Tobacco Smoke (ETS) exposure remains a public health problem worldwide. The aims are to establish urinary (u-) cotinine reference values for healthy Italian children and to evaluate the role of the sampling time and of other factors on children’s u-cotinine excretion. (2) Methods: A cross-sectional study was performed on 330 children. Information on participants was gathered by a questionnaire and u-cotinine was determined in two samples for each child, collected during the evening and the next morning. (3) Results: Reference intervals (as the 2.5th and 97.5th percentiles of the distribution) in evening and morning samples were respectively equal to 0.98–4.29 and 0.91–4.50 µg L⁻¹ (ETS unexposed) and 1.39–16.34 and 1.49–20.95 µg L⁻¹ (ETS exposed). No statistical differences were found between median values in evening and morning samples, in either the ETS unexposed or the exposed. Significant predictors of u-cotinine excretion were ponderal status according to the body mass index of children (β = 0.202, p-value = 0.041 for evening samples; β = 0.169, p-value = 0.039 for morning samples) and paternal educational level (β = −0.258, p-value = 0.010 for evening samples; β = −0.013, p-value = 0.003 for morning samples). (4) Conclusions: The results evidenced the need for further studies assessing the role of confounding factors on ETS exposure, and the necessity of educational interventions on smokers to raise their awareness about ETS.

  4. Interval timing in genetically modified mice: a simple paradigm.

    Science.gov (United States)

    Balci, F; Papachristos, E B; Gallistel, C R; Brunner, D; Gibson, J; Shumyatsky, G P

    2008-04-01

    We describe a behavioral screen for the quantitative study of interval timing and interval memory in mice. Mice learn to switch from a short-latency feeding station to a long-latency station when the short latency has passed without a feeding. The psychometric function is the cumulative distribution of switch latencies. Its median measures timing accuracy and its interquartile interval measures timing precision. Next, using this behavioral paradigm, we have examined mice with a gene knockout of the receptor for gastrin-releasing peptide that show enhanced (i.e. prolonged) freezing in fear conditioning. We have tested the hypothesis that the mutants freeze longer because they are more uncertain than wild types about when to expect the electric shock. The knockouts however show normal accuracy and precision in timing, so we have rejected this alternative hypothesis. Last, we conduct the pharmacological validation of our behavioral screen using d-amphetamine and methamphetamine. We suggest including the analysis of interval timing and temporal memory in tests of genetically modified mice for learning and memory and argue that our paradigm allows this to be done simply and efficiently.

  5. Binomial Distribution Sample Confidence Intervals Estimation 1. Sampling and Medical Key Parameters Calculation

    Directory of Open Access Journals (Sweden)

    Tudor DRUGAN

    2003-08-01

    Full Text Available The aim of the paper was to present the usefulness of the binomial distribution in studying contingency tables and the problems of approximating the binomial distribution to normality (its limits, advantages, and disadvantages). Classifying the medical key parameters reported in the medical literature and expressing them in contingency table units based on their mathematical expressions restricts the discussion of confidence intervals from 34 parameters to 9 mathematical expressions. The problem of obtaining different information starting from the computed confidence interval for a specified method (confidence interval boundaries, percentages of experimental errors, the standard deviation of the experimental errors, and the deviation relative to the significance level) was solved through the implementation of original algorithms in the PHP programming language. The cases of expressions containing two binomial variables were treated separately. An original method of computing the confidence interval for a two-variable expression was proposed and implemented. The graphical representation of expressions of two binomial variables, for which the variation domain of one variable depends on the other, was a real problem because most software uses interpolation in graphical representation and the resulting surface maps are quadratic instead of triangular. Based on an original algorithm, a module was implemented in PHP to represent the triangular surface plots graphically. All the implementations described above were used in computing the confidence intervals and estimating their performance for binomial distribution sample sizes and variables.
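
    The kind of computation the paper implements in PHP can be sketched briefly (in Python here) for a single proportion: a Wald (normal-approximation) interval alongside an exact Clopper-Pearson interval obtained from beta quantiles. The sample values are arbitrary, and this is not the authors' algorithm for two-variable expressions.

    ```python
    import numpy as np
    from scipy import stats

    def binomial_ci(x, n, level=0.95):
        """Wald and exact Clopper-Pearson confidence intervals for a proportion x/n."""
        p_hat = x / n
        z = stats.norm.ppf(1 - (1 - level) / 2)
        half = z * np.sqrt(p_hat * (1 - p_hat) / n)
        wald = (max(0.0, p_hat - half), min(1.0, p_hat + half))
        alpha = 1 - level
        lo = stats.beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
        hi = stats.beta.ppf(1 - alpha / 2, x + 1, n - x) if x < n else 1.0
        return wald, (lo, hi)

    wald, exact = binomial_ci(x=7, n=34)
    print("Wald:", np.round(wald, 3), " Clopper-Pearson:", np.round(exact, 3))
    ```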

  6. Ratio-based lengths of intervals to improve fuzzy time series forecasting.

    Science.gov (United States)

    Huarng, Kunhuang; Yu, Tiffany Hui-Kuang

    2006-04-01

    The objective of this study is to explore ways of determining the useful lengths of intervals in fuzzy time series. It is suggested that ratios, instead of equal lengths of intervals, can more properly represent the intervals among observations. Ratio-based lengths of intervals are, therefore, proposed to improve fuzzy time series forecasting. Algebraic growth data, such as enrollments and the stock index, and exponential growth data, such as inventory demand, are chosen as the forecasting targets, before forecasting based on the various lengths of intervals is performed. Furthermore, sensitivity analyses are also carried out for various percentiles. The ratio-based lengths of intervals are found to outperform the effective lengths of intervals, as well as the arbitrary ones in regard to the different statistical measures. The empirical analysis suggests that the ratio-based lengths of intervals can also be used to improve fuzzy time series forecasting.

  7. Time interval measurement between two emissions: a systematics

    International Nuclear Information System (INIS)

    Bizard, G.; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Kerambrun, A.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lopez, O.; Louvel, M.; Mahi, M.; Meslin, C.; Steckmeyer, J.C.; Tamain, B.; Wieloch, A.

    1998-01-01

    A systematic study of the evolution of fragment emission time intervals as a function of the energy deposited in the compound system was performed. Several measurements are presented: Ne at 60 MeV/u, Ar at 30 and 60 MeV/u, and two measurements for Kr at 60 MeV/u (central and semi-peripheral collisions). In all the experiments the target was Au and the mass of the compound system was around A = 200. The excitation energies per nucleon reached for these heavy systems cover the range of 3 to 5.5 MeV/u. The method used to determine the emission time intervals is based on correlation functions associated with the relative angle distributions; the gaps between the data and the simulations allow the emission times to be evaluated. A rapid decrease of these time intervals was observed as the excitation energy increased. This variation starts at 500 fm/c, which corresponds to a sequential emission. This relatively long time, which indicates a weak interaction between fragments, corresponds practically to the measurement threshold. The shortest intervals (about 50 fm/c) are associated with spontaneous multifragmentation and were observed for central Ar+Au and Kr+Au collisions at 60 MeV/u. Two interpretations are possible: the multifragmentation process might be viewed as a sequential process with very short time separations, or one can distinguish two regimes, keeping in mind that multifragmentation is predominant from 4.5 MeV/u excitation energy upwards. This question is still open and its study is under way at LPC. An answer could come from the study of the break-up process of an excited nucleus, notably through the determination of its lifetime.

  8. Time interval measurement between two emissions: Ar + Au

    International Nuclear Information System (INIS)

    Bizard, G.; Bougault, R.; Brou, R.; Buta, A.; Durand, D.; Genoux-Lubain, A.; Hamdani, T.; Horn, D.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Louvel, M.; Peter, J.; Regimbart, R.; Steckmeyer, J.C.; Tamain, B.

    1998-01-01

    The Ar + Au system was studied at two bombarding energies, 30 and 60 A.MeV. The distributions of fragment emission angles in central collisions were compared by means of a simulation in which the emission time interval was varied. It was found that this interval depends on the bombarding energy (i.e. on the deposited excitation energy). For 30 A.MeV this interval is 500 fm/c (0.33 · 10⁻²³ s), while for 60 A.MeV it is so short that the multifragmentation concept can be used

  9. Nonparametric Estimation of Interval Reliability for Discrete-Time Semi-Markov Systems

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Limnios, Nikolaos

    2016-01-01

    In this article, we consider a repairable discrete-time semi-Markov system with finite state space. The measure of the interval reliability is given as the probability of the system being operational over a given finite-length time interval. A nonparametric estimator is proposed for the interval...

  10. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    Science.gov (United States)

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

    According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often are impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean +/- 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.
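
    For orientation, the sketch below contrasts the kinds of estimates compared in this record: a nonparametric reference interval from a large sample versus the min/max, mean +/- 2 SD on native values, and mean +/- 2 SD on Box-Cox-transformed values from a 27-value subsample. The simulated skewed data merely stand in for the creatinine results and are an assumption, not the study's data.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.special import inv_boxcox

    rng = np.random.default_rng(42)

    # Right-skewed "reference population" standing in for the 1439 creatinine results
    population = rng.lognormal(mean=4.6, sigma=0.25, size=1439)

    # Nonparametric reference interval from the full sample: 2.5th and 97.5th percentiles
    ri_full = np.percentile(population, [2.5, 97.5])

    # Small subsample of 27 values, mimicking the study design
    sub = rng.choice(population, size=27, replace=False)

    # 1) Crude estimate: minimum and maximum of the subsample
    ri_minmax = (sub.min(), sub.max())

    # 2) Parametric estimate on native values: mean +/- 2 SD
    ri_native = (sub.mean() - 2 * sub.std(ddof=1), sub.mean() + 2 * sub.std(ddof=1))

    # 3) Parametric estimate after Box-Cox transformation, back-transformed afterwards
    y, lmbda = stats.boxcox(sub)
    lo, hi = y.mean() - 2 * y.std(ddof=1), y.mean() + 2 * y.std(ddof=1)
    ri_boxcox = (inv_boxcox(lo, lmbda), inv_boxcox(hi, lmbda))

    print("full-sample nonparametric:", ri_full)
    print("27-sample min/max:        ", ri_minmax)
    print("27-sample mean +/- 2 SD:  ", ri_native)
    print("27-sample Box-Cox:        ", ri_boxcox)
    ```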

  11. Foundation for a Time Interval Access Control Model

    National Research Council Canada - National Science Library

    Afinidad, Francis B; Levin, Timothy E; Irvine, Cynthia E; Nguyen, Thuy D

    2005-01-01

    A new model for representing temporal access control policies is introduced. In this model, temporal authorizations are represented by time attributes associated with both subjects and objects, and a time interval access graph...

  12. Finite-Time Stability of Large-Scale Systems with Interval Time-Varying Delay in Interconnection

    Directory of Open Access Journals (Sweden)

    T. La-inchua

    2017-01-01

    Full Text Available We investigate finite-time stability of a class of nonlinear large-scale systems with interval time-varying delays in the interconnection. Time-delay functions are continuous but not necessarily differentiable. Based on Lyapunov stability theory and a new integral bounding technique, finite-time stability of large-scale systems with interval time-varying delays in the interconnection is derived. The finite-time stability criteria are delay-dependent and are given in terms of linear matrix inequalities, which can be solved by various available algorithms. Numerical examples are given to illustrate the effectiveness of the proposed method.

  13. Optimal Selection of the Sampling Interval for Estimation of Modal Parameters by an ARMA- Model

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    1993-01-01

    Optimal selection of the sampling interval for estimation of the modal parameters by an ARMA-model for a white-noise-loaded structure modelled as a single-degree-of-freedom linear mechanical system is considered. An analytical solution for an optimal uniform sampling interval, which is optimal...

  14. Observer-based output feedback control of networked control systems with non-uniform sampling and time-varying delay

    Science.gov (United States)

    Meng, Su; Chen, Jie; Sun, Jian

    2017-10-01

    This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.

  15. Interval estimation methods for the mean in small-sample situations and a comparison of the results

    International Nuclear Information System (INIS)

    Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen

    2009-01-01

    The methods for interval estimation of the sample mean, namely the classical method, the Bootstrap method, the Bayesian Bootstrap method, the Jackknife method and the spread method of the empirical characteristic distribution function, are described. Numerical calculations of the confidence intervals for the mean are carried out for sample sizes of 4, 5 and 6. The results indicate that the Bootstrap method and the Bayesian Bootstrap method are much more appropriate than the others in small-sample situations. (authors)
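
    As a small illustration of why resampling helps here, the sketch below compares the classical t-based interval with a percentile bootstrap interval for an arbitrary sample of size 5; the data values and the 95% level are assumptions, not taken from the paper, and the Bayesian bootstrap and jackknife variants are not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sample = np.array([4.1, 5.3, 4.8, 6.0, 5.1])   # arbitrary small sample (n = 5)
    alpha = 0.05

    # Classical interval: mean +/- t * s / sqrt(n)
    n, mean, sem = len(sample), sample.mean(), stats.sem(sample)
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    classical = (mean - t_crit * sem, mean + t_crit * sem)

    # Percentile bootstrap interval: resample with replacement many times
    boot_means = np.array([rng.choice(sample, size=n, replace=True).mean()
                           for _ in range(10_000)])
    bootstrap = tuple(np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

    print("classical t interval:", classical)
    print("percentile bootstrap:", bootstrap)
    ```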

  16. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    Science.gov (United States)

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.
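
    A rough Monte Carlo sketch of the first strategy (grow n until the expected confidence interval width drops below a target W) is given below; it uses a percentile bootstrap interval for Cronbach's alpha on simulated one-factor items rather than the authors' analytic accuracy-in-parameter-estimation machinery, and the loadings, item count and target width are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def cronbach_alpha(data):
        # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
        k = data.shape[1]
        return k / (k - 1) * (1 - data.var(axis=0, ddof=1).sum()
                              / data.sum(axis=1).var(ddof=1))

    def simulate_items(n, k=6, loading=0.7):
        # One common factor plus unique noise -> roughly parallel items
        f = rng.standard_normal((n, 1))
        return loading * f + np.sqrt(1 - loading ** 2) * rng.standard_normal((n, k))

    def expected_ci_width(n, reps=100, boot=200):
        widths = []
        for _ in range(reps):
            data = simulate_items(n)
            alphas = [cronbach_alpha(data[rng.integers(0, n, n)]) for _ in range(boot)]
            lo, hi = np.percentile(alphas, [2.5, 97.5])
            widths.append(hi - lo)
        return float(np.mean(widths))

    target_width = 0.10        # desired 95% CI width for the reliability coefficient
    n = 50
    while expected_ci_width(n) > target_width:
        n += 25                # increase the planned sample size until the target is met
    print("planned sample size:", n)
    ```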

  17. Inactivation of the Medial-Prefrontal Cortex Impairs Interval Timing Precision, but Not Timing Accuracy or Scalar Timing in a Peak-Interval Procedure in Rats

    Directory of Open Access Journals (Sweden)

    Catalin V. Buhusi

    2018-06-01

    Full Text Available Motor sequence learning, planning and execution of goal-directed behaviors, and decision making rely on accurate time estimation and production of durations in the seconds-to-minutes range. The pathways involved in planning and execution of goal-directed behaviors include cortico-striato-thalamo-cortical circuitry modulated by dopaminergic inputs. A critical feature of interval timing is its scalar property, by which the precision of timing is proportional to the timed duration. We examined the role of medial prefrontal cortex (mPFC in timing by evaluating the effect of its reversible inactivation on timing accuracy, timing precision and scalar timing. Rats were trained to time two durations in a peak-interval (PI procedure. Reversible mPFC inactivation using GABA agonist muscimol resulted in decreased timing precision, with no effect on timing accuracy and scalar timing. These results are partly at odds with studies suggesting that ramping prefrontal activity is crucial to timing but closely match simulations with the Striatal Beat Frequency (SBF model proposing that timing is coded by the coincidental activation of striatal neurons by cortical inputs. Computer simulations indicate that in SBF, gradual inactivation of cortical inputs results in a gradual decrease in timing precision with preservation of timing accuracy and scalar timing. Further studies are needed to differentiate between timing models based on coincidence detection and timing models based on ramping mPFC activity, and clarify whether mPFC is specifically involved in timing, or more generally involved in attention, working memory, or response selection/inhibition.

  18. Hybrid integrated circuit for charge-to-time interval conversion

    Energy Technology Data Exchange (ETDEWEB)

    Basiladze, S.G.; Dotsenko, Yu.Yu.; Man' yakov, P.K.; Fedorchenko, S.N. (Joint Inst. for Nuclear Research, Dubna (USSR))

    A hybrid integrated circuit for charge-to-time interval conversion with a fast (nanosecond) input response is described. The circuit can be used in energy-measuring channels, in time-to-digital converters and, in a modified variant, in amplitude-to-digital converters. The converter described consists of a buffer amplifier, a linear transmission circuit, a direct-current source and a time interval separation unit. The buffer amplifier is a current follower providing low input and high output resistance through current feedback. It is concluded that the described converter excels its analogue, the QT100B circuit, in a number of parameters, especially thermostability.

  19. Perception of short time scale intervals in a hypnotic virtuoso

    NARCIS (Netherlands)

    Noreika, Valdas; Falter, Christine M.; Arstila, Valtteri; Wearden, John H.; Kallio, Sakari

    2012-01-01

    Previous studies showed that hypnotized individuals underestimate temporal intervals in the range of several seconds to tens of minutes. However, no previous work has investigated whether duration perception is equally disorderly when shorter time intervals are probed. In this study, duration

  20. Cardiac time intervals by tissue Doppler imaging M-mode echocardiography

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor

    2016-01-01

    for myocardial myocytes to achieve an LV pressure equal to that of the aorta increases, resulting in a prolongation of the isovolumic contraction time (IVCT). Furthermore, the ability of myocardial myocytes to maintain the LV pressure decreases, resulting in a reduction in the ejection time (ET). As LV diastolic...... of whether the LV is suffering from impaired systolic or diastolic function. A novel method of evaluating the cardiac time intervals has recently evolved. Using tissue Doppler imaging (TDI) M-mode through the mitral valve (MV) to estimate the cardiac time intervals may be an improved method reflecting global...

  1. Optimal time points sampling in pathway modelling.

    Science.gov (United States)

    Hu, Shiyan

    2004-01-01

    Modelling cellular dynamics based on experimental data is at the heart of systems biology. Considerable progress has been made in dynamic pathway modelling as well as in the related parameter estimation. However, few of these efforts give consideration to the issue of optimal sampling time selection for parameter estimation. Time-course experiments in molecular biology rarely produce large and accurate data sets, and the experiments involved are usually time consuming and expensive. Therefore, approximating parameters for models from only a few available sampling points is of significant practical value. For signal transduction, the sampling intervals are usually not evenly distributed and are based on heuristics. In this paper, we investigate an approach to guide the process of selecting time points in an optimal way so as to minimize the variance of the parameter estimates. In the method, we first formulate the problem as a nonlinear constrained optimization problem via maximum likelihood estimation. We then modify and apply a quantum-inspired evolutionary algorithm, which combines the advantages of both quantum computing and evolutionary computing, to solve the optimization problem. The new algorithm does not suffer from the difficulties of selecting good initial values and getting stuck in local optima that usually accompany conventional numerical optimization techniques. The simulation results indicate the soundness of the new method.
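
    As a much-simplified illustration of the underlying idea (picking sampling times that minimize the variance of the parameter estimates), the sketch below exhaustively searches for the best pair of time points for a one-parameter exponential-decay model using the Fisher information; the model, the parameter value and the candidate grid are assumptions, and the paper's quantum-inspired evolutionary algorithm is not reproduced here.

    ```python
    import itertools
    import numpy as np

    # Toy pathway model y(t) = exp(-k * t) with a single unknown rate constant k.
    # For Gaussian noise of variance sigma^2, the Fisher information from samples at
    # times t_i is I(k) = (1/sigma^2) * sum_i (dy/dk at t_i)^2, and the variance of
    # the maximum likelihood estimate of k is approximately 1 / I(k).
    k_true, sigma = 0.5, 0.1

    def dy_dk(t, k=k_true):
        return -t * np.exp(-k * t)              # sensitivity of the output to k

    def estimate_variance(times):
        info = np.sum(dy_dk(np.asarray(times)) ** 2) / sigma ** 2
        return 1.0 / info

    candidate_times = np.linspace(0.5, 10.0, 20)    # admissible sampling grid
    best = min(itertools.combinations(candidate_times, 2), key=estimate_variance)

    print("best pair of sampling times:", best)
    print("approximate estimator variance:", estimate_variance(best))
    ```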

  2. Interval timing under a behavioral microscope: Dissociating motivational and timing processes in fixed-interval performance.

    Science.gov (United States)

    Daniels, Carter W; Sanabria, Federico

    2017-03-01

    The distribution of latencies and interresponse times (IRTs) of rats was compared between two fixed-interval (FI) schedules of food reinforcement (FI 30 s and FI 90 s), and between two levels of food deprivation. Computational modeling revealed that latencies and IRTs were well described by mixture probability distributions embodying two-state Markov chains. Analysis of these models revealed that only a subset of latencies is sensitive to the periodicity of reinforcement, and prefeeding only reduces the size of this subset. The distribution of IRTs suggests that behavior in FI schedules is organized in bouts that lengthen and ramp up in frequency with proximity to reinforcement. Prefeeding slowed down the lengthening of bouts and increased the time between bouts. When concatenated, latency and IRT models adequately reproduced sigmoidal FI response functions. These findings suggest that behavior in FI schedules fluctuates in and out of schedule control; an account of such fluctuation suggests that timing and motivation are dissociable components of FI performance. These mixture-distribution models also provide novel insights on the motivational, associative, and timing processes expressed in FI performance. These processes may be obscured, however, when performance in timing tasks is analyzed in terms of mean response rates.

  3. Comparative evaluation of nickel discharge from brackets in artificial saliva at different time intervals.

    Science.gov (United States)

    Jithesh, C; Venkataramana, V; Penumatsa, Narendravarma; Reddy, S N; Poornima, K Y; Rajasigamani, K

    2015-08-01

    To determine and compare the difference in nickel release from three different orthodontic brackets, in artificial saliva of different pH, at different time intervals. Twenty-seven samples of three different orthodontic brackets were selected and grouped as 1, 2, and 3. Each group was divided into three subgroups depending on the type of orthodontic bracket, the salivary pH and the time interval. The nickel release from each subgroup was analyzed using an inductively coupled plasma-atomic emission spectrophotometer (Perkin Elmer, Optima 2100 DV, USA). Quantitative analysis of nickel was performed three times, and the mean value was used as the result. ANOVA (F-test) was used to test the significance of differences among the groups at the 0.05 level of significance (P < 0.05). Recycled stainless steel brackets showed the highest release at pH 4.2 at all time intervals except 120 h. The study result shows that the nickel release from the recycled stainless steel brackets is highest. Metal-slot ceramic brackets release significantly less nickel. Therefore, recycled stainless steel brackets should not be used for nickel-allergic patients. Metal-slot ceramic brackets are advisable.

  4. Optimal time interval for induction of immunologic adaptive response

    International Nuclear Information System (INIS)

    Ju Guizhi; Song Chunhua; Liu Shuzheng

    1994-01-01

    The optimal time interval between prior dose (D1) and challenge dose (D2) for the induction of immunologic adaptive response was investigated. Kunming mice were exposed to 75 mGy X-rays at a dose rate of 12.5 mGy/min. 3, 6, 12, 24 or 60 h after the prior irradiation the mice were challenged with a dose of 1.5 Gy at a dose rate of 0.33 Gy/min. 18 h after D2, the mice were sacrificed for examination of immunological parameters. The results showed that with an interval of 6 h between D1 and D2, the adaptive response of the reaction of splenocytes to LPS was induced, and with an interval of 12 h the adaptive responses of spontaneous incorporation of ³H-TdR into thymocytes and the reaction of splenocytes to Con A and LPS were induced with 75 mGy prior irradiation. The data suggested that the optimal time intervals between D1 and D2 for the induction of immunologic adaptive response were 6 h and 12 h with a D1 of 75 mGy and a D2 of 1.5 Gy. The mechanism of immunologic adaptation following low dose radiation is discussed

  5. Influence of sampling interval and number of projections on the quality of SR-XFMT reconstruction

    International Nuclear Information System (INIS)

    Deng Biao; Yu Xiaohan; Xu Hongjie

    2007-01-01

    Synchrotron Radiation based X-ray Fluorescent Microtomography (SR-XFMT) is a nondestructive technique for detecting elemental composition and distribution inside a specimen with high spatial resolution and sensitivity. In this paper, a computer simulation of the SR-XFMT experiment is performed. The influence of the sampling interval and the number of projections on the quality of SR-XFMT image reconstruction is analyzed. It is found that the sampling interval has a greater effect on the quality of the reconstruction than the number of projections. (authors)

  6. Cross-sample entropy of foreign exchange time series

    Science.gov (United States)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for a period from 1995 to 2002. The cross-SampEn (cross-sample entropy) method is used to compare the returns of each pair of exchange rate time series to assess their degree of asynchrony. The method for calculating the confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for each pair of the exchange rate time series in the periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of each pair of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior for describing the correlation between time series.
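
    A minimal sketch of the cross-SampEn statistic described here is given below, with synthetic series standing in for the exchange-rate returns; the embedding dimension m, the tolerance r and the simulated data are assumptions, and the confidence interval extension is not reproduced.

    ```python
    import numpy as np

    def cross_sampen(u, v, m=2, r=0.2):
        """Cross-sample entropy of two equal-length series.

        Both series are z-scored, so r is a tolerance in units of standard deviation.
        Returns -ln(A/B), where B counts template matches of length m between the two
        series and A counts matches of length m+1; higher values mean more asynchrony.
        """
        u = (u - u.mean()) / u.std()
        v = (v - v.mean()) / v.std()
        n = len(u)

        def count_matches(length):
            # Use the same number of templates (n - m) for lengths m and m + 1
            xu = np.array([u[i:i + length] for i in range(n - m)])
            xv = np.array([v[i:i + length] for i in range(n - m)])
            d = np.max(np.abs(xu[:, None, :] - xv[None, :, :]), axis=2)  # Chebyshev distance
            return np.sum(d <= r)

        return -np.log(count_matches(m + 1) / count_matches(m))

    rng = np.random.default_rng(1)
    x = rng.standard_normal(500)
    y = 0.7 * x + 0.3 * rng.standard_normal(500)   # partially synchronized with x
    z = rng.standard_normal(500)                   # independent of x
    print("cross-SampEn(x, y):", cross_sampen(x, y))   # lower -> more synchrony
    print("cross-SampEn(x, z):", cross_sampen(x, z))   # higher -> more asynchrony
    ```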

  7. Cerebellar Roles in Self-Timing for Sub- and Supra-Second Intervals.

    Science.gov (United States)

    Ohmae, Shogo; Kunimatsu, Jun; Tanaka, Masaki

    2017-03-29

    Previous studies suggest that the cerebellum and basal ganglia are involved in sub-second and supra-second timing, respectively. To test this hypothesis at the cellular level, we examined the activity of single neurons in the cerebellar dentate nucleus in monkeys performing the oculomotor version of the self-timing task. Animals were trained to report the passage of time of 400, 600, 1200, or 2400 ms following a visual cue by making self-initiated memory-guided saccades. We found a sizeable preparatory neuronal activity before self-timed saccades across delay intervals, while the time course of activity correlated with the trial-by-trial variation of saccade latency in different ways depending on the length of the delay intervals. For the shorter delay intervals, the ramping up of neuronal firing rate started just after the visual cue and the rate of rise of neuronal activity correlated with saccade timing. In contrast, for the longest delay (2400 ms), the preparatory activity started late during the delay period, and its onset time correlated with self-timed saccade latency. Because electrical microstimulation applied to the recording sites during saccade preparation advanced self-timed but not reactive saccades, regardless of their directions, the signals in the cerebellum may have a causal role in self-timing. We suggest that the cerebellum may regulate timing in both sub-second and supra-second ranges, although its relative contribution might be greater for sub-second than for supra-second time intervals. SIGNIFICANCE STATEMENT How we decide the timing of self-initiated movement is a fundamental question. According to the prevailing hypothesis, the cerebellum plays a role in monitoring sub-second timing, whereas the basal ganglia are important for supra-second timing. To verify this, we explored neuronal signals in the monkey cerebellum while animals reported the passage of time in the range 400-2400 ms by making eye movements. Contrary to our expectations, we

  8. Number of core samples: Mean concentrations and confidence intervals

    International Nuclear Information System (INIS)

    Jensen, L.; Cromar, R.D.; Wilmarth, S.R.; Heasler, P.G.

    1995-01-01

    This document provides estimates of how well the mean concentration of analytes is known as a function of the number of core samples, composite samples, and replicate analyses. The estimates are based upon core composite data from nine recently sampled single-shell tanks. The results can be used when determining the number of core samples needed to "characterize" the waste from similar single-shell tanks. A standard way of expressing uncertainty in the estimate of a mean is with a 95% confidence interval (CI). The authors investigate how the width of a 95% CI on the mean concentration decreases as the number of observations increases. Specifically, the tables and figures show how the relative half-width (RHW) of a 95% CI decreases as the number of core samples increases. The RHW of a CI is a unit-less measure of uncertainty. The general conclusions are as follows: (1) the RHW decreases dramatically as the number of core samples is increased; the decrease is much smaller when the number of composite samples or the number of replicate analyses is increased; (2) if the mean concentration of an analyte needs to be estimated with a small RHW, then a large number of core samples is required. The estimated numbers of core samples given in the tables and figures were determined by specifying different sizes of the RHW. Four nominal sizes were examined: 10%, 25%, 50%, and 100% of the observed mean concentration. For a majority of analytes the number of core samples required to achieve an accuracy within 10% of the mean concentration is extremely large. In many cases, however, two or three core samples are sufficient to achieve a RHW of approximately 50 to 100%. Because many of the analytes in the data have small concentrations, this level of accuracy may be satisfactory for some applications
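
    For a sense of scale, the sketch below computes the relative half-width of a 95% confidence interval on a mean as a function of the number of core samples, assuming independent samples and a t-based interval; the 50% coefficient of variation is an illustrative assumption, not a value from the report.

    ```python
    import numpy as np
    from scipy import stats

    def relative_half_width(n, cv):
        """Relative half-width (RHW) of a 95% CI on the mean, as a fraction of the mean.

        n  : number of core samples
        cv : coefficient of variation of a single measurement (sigma / mu)
        RHW = t_{0.975, n-1} * cv / sqrt(n)
        """
        t = stats.t.ppf(0.975, df=n - 1)
        return t * cv / np.sqrt(n)

    # Example: an analyte whose single-sample measurements vary with CV = 50%
    for n in (2, 3, 5, 10, 30, 120):
        print(f"n = {n:3d}  RHW = {relative_half_width(n, cv=0.5):.0%}")
    ```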

  9. Individual Case Analysis of Postmortem Interval Time on Brain Tissue Preservation.

    Directory of Open Access Journals (Sweden)

    Jeffrey A Blair

    Full Text Available At autopsy, the time that has elapsed since the time of death is routinely documented and noted as the postmortem interval (PMI. The PMI of human tissue samples is a parameter often reported in research studies and comparable PMI is preferred when comparing different populations, i.e., disease versus control patients. In theory, a short PMI may alleviate non-experimental protein denaturation, enzyme activity, and other chemical changes such as the pH, which could affect protein and nucleic acid integrity. Previous studies have compared PMI en masse by looking at many different individual cases each with one unique PMI, which may be affected by individual variance. To overcome this obstacle, in this study human hippocampal segments from the same individuals were sampled at different time points after autopsy creating a series of PMIs for each case. Frozen and fixed tissue was then examined by Western blot, RT-PCR, and immunohistochemistry to evaluate the effect of extended PMI on proteins, nucleic acids, and tissue morphology. In our results, immunostaining profiles for most proteins remained unchanged even after PMI of over 50 h, yet by Western blot distinctive degradation patterns were observed in different protein species. Finally, RNA integrity was lower after extended PMI; however, RNA preservation was variable among cases suggesting antemortem factors may play a larger role than PMI in protein and nucleic acid integrity.

  10. Entrained rhythmic activities of neuronal ensembles as perceptual memory of time interval.

    Science.gov (United States)

    Sumbre, Germán; Muto, Akira; Baier, Herwig; Poo, Mu-ming

    2008-11-06

    The ability to process temporal information is fundamental to sensory perception, cognitive processing and motor behaviour of all living organisms, from amoebae to humans. Neural circuit mechanisms based on neuronal and synaptic properties have been shown to process temporal information over the range of tens of microseconds to hundreds of milliseconds. How neural circuits process temporal information in the range of seconds to minutes is much less understood. Studies of working memory in monkeys and rats have shown that neurons in the prefrontal cortex, the parietal cortex and the thalamus exhibit ramping activities that linearly correlate with the lapse of time until the end of a specific time interval of several seconds that the animal is trained to memorize. Many organisms can also memorize the time interval of rhythmic sensory stimuli in the timescale of seconds and can coordinate motor behaviour accordingly, for example, by keeping the rhythm after exposure to the beat of music. Here we report a form of rhythmic activity among specific neuronal ensembles in the zebrafish optic tectum, which retains the memory of the time interval (in the order of seconds) of repetitive sensory stimuli for a duration of up to approximately 20 s. After repetitive visual conditioning stimulation (CS) of zebrafish larvae, we observed rhythmic post-CS activities among specific tectal neuronal ensembles, with a regular interval that closely matched the CS. Visuomotor behaviour of the zebrafish larvae also showed regular post-CS repetitions at the entrained time interval that correlated with rhythmic neuronal ensemble activities in the tectum. Thus, rhythmic activities among specific neuronal ensembles may act as an adjustable 'metronome' for time intervals in the order of seconds, and serve as a mechanism for the short-term perceptual memory of rhythmic sensory experience.

  11. Haemostatic reference intervals in pregnancy

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Jørgensen, Maja; Klajnbard, Anna

    2010-01-01

    Haemostatic reference intervals are generally based on samples from non-pregnant women. Thus, they may not be relevant to pregnant women, a problem that may hinder accurate diagnosis and treatment of haemostatic disorders during pregnancy. In this study, we establish gestational age-specific reference intervals for coagulation tests during normal pregnancy. Eight hundred one women with expected normal pregnancies were included in the study. Of these women, 391 had no complications during pregnancy, vaginal delivery, or postpartum period. Plasma samples were obtained at gestational weeks 13-20, 21-28, 29-34, 35-42, at active labor, and on postpartum days 1 and 2. Reference intervals for each gestational period using only the uncomplicated pregnancies were calculated in all 391 women for activated partial thromboplastin time (aPTT), fibrinogen, fibrin D-dimer, antithrombin, free protein S...

  12. Low-sampling-rate ultra-wideband channel estimation using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, a low-sampling-rate scheme for ultra-wideband channel estimation is proposed. The scheme exploits multiple observations generated by transmitting multiple pulses. In the proposed scheme, P pulses are transmitted to produce channel impulse response estimates at a desired sampling rate, while the ADC samples at a rate that is P times slower. To avoid loss of fidelity, the number of sampling periods (based on the desired rate) in the inter-pulse interval is restricted to be co-prime with P. This condition is affected when clock drift is present and the transmitted pulse locations change. To handle this case, and to achieve an overall good channel estimation performance, without using prior information, we derive an improved estimator based on the bounded data uncertainty (BDU) model. It is shown that this estimator is related to the Bayesian linear minimum mean squared error (LMMSE) estimator. Channel estimation performance of the proposed sub-sampling scheme combined with the new estimator is assessed in simulation. The results show that high reduction in sampling rate can be achieved. The proposed estimator outperforms the least squares estimator in almost all cases, while in the high SNR regime it also outperforms the LMMSE estimator. In addition to channel estimation, a synchronization method is also proposed that utilizes the same pulse sequence used for channel estimation. © 2014 IEEE.
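
    The co-prime restriction can be seen with a toy calculation: the offsets (modulo P) at which the slow ADC lands relative to successive pulses cover all desired-rate phases only when the inter-pulse spacing, counted in desired-rate periods, is co-prime with P. The sketch below uses arbitrary values of P and L and only illustrates that condition, not the BDU-based estimator.

    ```python
    from math import gcd

    def covered_offsets(P, L):
        """Offsets (in desired-rate periods, modulo P) at which an ADC running P times
        slower than the desired rate samples relative to each of P transmitted pulses,
        when consecutive pulses are L desired-rate periods apart."""
        return sorted({(-p * L) % P for p in range(P)})

    for P, L in [(5, 7), (6, 9)]:
        print(f"P={P}, L={L}, gcd={gcd(P, L)} -> offsets {covered_offsets(P, L)}")
    # gcd = 1 covers every offset 0..P-1, so the desired-rate grid is fully sampled;
    # gcd > 1 leaves gaps, which is the loss of fidelity the co-prime condition avoids.
    ```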

  13. Working time intervals and total work time on nursing positions in Poland

    Directory of Open Access Journals (Sweden)

    Danuta Kunecka

    2015-06-01

    Full Text Available Background: For the last few years the topic of overwork in nursing positions has given rise to strong discussions. The author set herself the goal of answering the question of whether this is the result of real overwork in this particular profession or rather of the frustration commonly attributed to this professional group. The aim of this paper is to analyse working time in selected nursing positions in relation to the amounts of time used as intervals in the course of conducting standard professional activities during one working day. Material and Methods: The research material consisted of documentation of working time in selected nursing workplaces, compiled between 2007 and 2012 within the framework of a nursing course at the Pomeranian Medical University in Szczecin. A photograph of the working day was used as the measurement method. Measurements were performed in institutions located in 6 voivodeships in Poland. Results: The results suggest that only 6.5% of the surveyed representatives of the nursing profession spend the proper amount of time (i.e., the time set by the applicable standards) on work intervals during a working day. Conclusions: The scale of the phenomenon indicates an excessive workload in nursing positions, which, over a longer period of time and with longer working hours, may decrease work efficiency and cause a drop in the quality of the services provided. Med Pr 2015;66(2):165–172

  14. Time-variant random interval natural frequency analysis of structures

    Science.gov (United States)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and the Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces the computational cost. The presented method is thus capable of accurately and efficiently investigating the day-to-day time-variant natural frequency of structures under the intrinsic creep effect of concrete with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples with a progressive relationship in terms of both structure type and uncertainty variables are presented to demonstrate the computational applicability, accuracy and efficiency of the proposed method.

  15. Interval Timing Deficits Assessed by Time Reproduction Dual Tasks as Cognitive Endophenotypes for Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Hwang-Gu, Shoou-Lian; Gau, Susan Shur-Fen

    2015-01-01

    The literature has suggested timing processing as a potential endophenotype for attention deficit/hyperactivity disorder (ADHD); however, whether the subjective internal clock speed presented by verbal estimation and limited attention capacity presented by time reproduction could be endophenotypes for ADHD is still unknown. We assessed 223 youths with DSM-IV ADHD (age range: 10-17 years), 105 unaffected siblings, and 84 typically developing (TD) youths using psychiatric interviews, intelligence tests, verbal estimation and time reproduction tasks (single task and simple and difficult dual tasks) at 5-second, 12-second, and 17-second intervals. We found that youths with ADHD tended to overestimate time in verbal estimation more than their unaffected siblings and TD youths, implying that fast subjective internal clock speed might be a characteristic of ADHD, rather than an endophenotype for ADHD. Youths with ADHD and their unaffected siblings were less precise in time reproduction dual tasks than TD youths. The magnitude of estimated errors in time reproduction was greater in youths with ADHD and their unaffected siblings than in TD youths, with an increased time interval at the 17-second interval and with increased task demands on both simple and difficult dual tasks versus the single task. Increased impaired time reproduction in dual tasks with increased intervals and task demands were shown in youths with ADHD and their unaffected siblings, suggesting that time reproduction deficits explained by limited attention capacity might be a useful endophenotype of ADHD. PMID:25992899

  16. Exponential synchronization of chaotic Lur'e systems with time-varying delay via sampled-data control

    International Nuclear Information System (INIS)

    Rakkiyappan, R.; Sivasamy, R.; Lakshmanan, S.

    2014-01-01

    In this paper, we study the exponential synchronization of chaotic Lur'e systems with time-varying delays via sampled-data control by using sector nonlinearities. In order to make full use of information about the sampling intervals and the interval time-varying delays, new Lyapunov-Krasovskii functionals with triple integral terms are introduced. Based on the convex combination technique, two kinds of synchronization criteria are derived in terms of linear matrix inequalities, which can be efficiently solved via standard numerical software. Finally, three numerical examples are provided to demonstrate the reduced conservatism and the effectiveness of the proposed results

  17. Investigations of timing during the schedule and reinforcement intervals with wheel-running reinforcement.

    Science.gov (United States)

    Belke, Terry W; Christie-Fougere, Melissa M

    2006-11-01

    Across two experiments, a peak procedure was used to assess the timing of the onset and offset of an opportunity to run as a reinforcer. The first experiment investigated the effect of reinforcer duration on temporal discrimination of the onset of the reinforcement interval. Three male Wistar rats were exposed to fixed-interval (FI) 30-s schedules of wheel-running reinforcement and the duration of the opportunity to run was varied across values of 15, 30, and 60s. Each session consisted of 50 reinforcers and 10 probe trials. Results showed that as reinforcer duration increased, the percentage of postreinforcement pauses longer than the 30-s schedule interval increased. On probe trials, peak response rates occurred near the time of reinforcer delivery and peak times varied with reinforcer duration. In a second experiment, seven female Long-Evans rats were exposed to FI 30-s schedules leading to 30-s opportunities to run. Timing of the onset and offset of the reinforcement period was assessed by probe trials during the schedule interval and during the reinforcement interval in separate conditions. The results provided evidence of timing of the onset, but not the offset of the wheel-running reinforcement period. Further research is required to assess if timing occurs during a wheel-running reinforcement period.

  18. GENERALISED MODEL BASED CONFIDENCE INTERVALS IN TWO STAGE CLUSTER SAMPLING

    Directory of Open Access Journals (Sweden)

    Christopher Ouma Onyango

    2010-09-01

    Full Text Available Chambers and Dorfman (2002) constructed bootstrap confidence intervals in model-based estimation for finite population totals, assuming that auxiliary values are available throughout a target population and that the auxiliary values are independent. They also assumed that the cluster sizes are known throughout the target population. We now extend this to two-stage sampling in which the cluster sizes are known only for the sampled clusters, and we therefore predict the unobserved part of the population total. Jan and Elinor (2008) have done similar work, but unlike them, we use a general model in which the auxiliary values are not necessarily independent. We demonstrate that the asymptotic properties of our proposed estimator and its coverage rates are better than those constructed under the model-assisted local polynomial regression model.

  19. Networked control systems with communication constraints :tradeoffs between sampling intervals, delays and performance

    NARCIS (Netherlands)

    Heemels, W.P.M.H.; Teel, A.R.; Wouw, van de N.; Nesic, D.

    2010-01-01

    There are many communication imperfections in networked control systems (NCS) such as varying transmission delays, varying sampling/transmission intervals, packet loss, communication constraints and quantization effects. Most of the available literature on NCS focuses on only some of these aspects,

  20. Infinite time interval backward stochastic differential equations with continuous coefficients.

    Science.gov (United States)

    Zong, Zhaojun; Hu, Feng

    2016-01-01

    In this paper, we study the existence theorem for [Formula: see text] [Formula: see text] solutions to a class of 1-dimensional infinite time interval backward stochastic differential equations (BSDEs) under the conditions that the coefficients are continuous and have linear growth. We also obtain the existence of a minimal solution. Furthermore, we study the existence and uniqueness theorem for [Formula: see text] [Formula: see text] solutions of infinite time interval BSDEs with non-uniformly Lipschitz coefficients. It should be pointed out that the assumptions of this result are weaker than those of Theorem 3.1 in Zong (Turkish J Math 37:704-718, 2013).

  1. The use of a DNA stabilizer in human dental tissues stored under different temperature conditions and time intervals

    Science.gov (United States)

    TERADA, Andrea Sayuri Silveira Dias; da SILVA, Luiz Antonio Ferreira; GALO, Rodrigo; de AZEVEDO, Aline; GERLACH, Raquel Fernanda; da SILVA, Ricardo Henrique Alves

    2014-01-01

    Objective The present study evaluated the use of a reagent to stabilize the DNA extracted from human dental tissues stored under different temperature conditions and time intervals. Material and Methods A total of 161 teeth were divided into two distinct groups: intact teeth and isolated dental pulp tissue. The samples were stored with or without the product at different time intervals and temperatures. After storage, DNA extraction and genomic DNA quantification were performed using real-time PCR; the fragments of the 32 samples that represented each possible condition were analyzed to find the four pre-selected markers in STR analysis. Results The results of the quantification showed values ranging from 0.01 to 10,246.88 ng/μL of DNA. A statistical difference in the quantity of DNA was observed when the factors related to the time and temperature of storage were analyzed. In relation to the use of the specific reagent, its use was relevant in the group of intact teeth when they were at room temperature for 30 and 180 days. The analysis of the fragments in the 32 selected samples was possible irrespective of the amount of DNA, confirming that the STR analysis using an automated method yields good results. Conclusions The use of a specific reagent showed a significant difference in stabilizing DNA in samples of intact human teeth stored at room temperature for 30 and 180 days, while the results showed no justification for using the product under the other conditions tested. PMID:25141206

  2. Evaluating Protocol Lifecycle Time Intervals in HIV/AIDS Clinical Trials

    Science.gov (United States)

    Schouten, Jeffrey T.; Dixon, Dennis; Varghese, Suresh; Cope, Marie T.; Marci, Joe; Kagan, Jonathan M.

    2014-01-01

    Background Identifying efficacious interventions for the prevention and treatment of human diseases depends on the efficient development and implementation of controlled clinical trials. Essential to reducing the time and burden of completing the clinical trial lifecycle is determining which aspects take the longest, delay other stages, and may lead to better resource utilization without diminishing scientific quality, safety, or the protection of human subjects. Purpose In this study we modeled time-to-event data to explore relationships between clinical trial protocol development and implementation times, as well as identify potential correlates of prolonged development and implementation. Methods We obtained time interval and participant accrual data from 111 interventional clinical trials initiated between 2006 and 2011 by NIH’s HIV/AIDS Clinical Trials Networks. We determined the time (in days) required to complete defined phases of clinical trial protocol development and implementation. Kaplan-Meier estimates were used to assess the rates at which protocols reached specified terminal events, stratified by study purpose (therapeutic, prevention) and phase group (pilot/phase I, phase II, and phase III/IV). We also examined several potential correlates to prolonged development and implementation intervals. Results Even though phase grouping did not determine development or implementation times of either therapeutic or prevention studies, overall we observed wide variation in protocol development times. Moreover, we detected a trend toward phase III/IV therapeutic protocols exhibiting longer developmental (median 2 ½ years) and implementation times (>3 years). We also found that protocols exceeding the median number of days for completing the development interval had significantly longer implementation. Limitations The use of a relatively small set of protocols may have limited our ability to detect differences across phase groupings. Some timing effects

  3. Binomial Distribution Sample Confidence Intervals Estimation 7. Absolute Risk Reduction and ARR-like Expressions

    Directory of Open Access Journals (Sweden)

    Andrei ACHIMAŞ CADARIU

    2004-08-01

    Full Text Available Assessment of a controlled clinical trial requires interpreting key parameters such as the control event rate, the experimental event rate, the relative risk, the absolute risk reduction, the relative risk reduction, and the number needed to treat when the effect of the treatment is a dichotomous variable. Defined as the difference in the event rate between the treatment and control groups, the absolute risk reduction is the parameter that allows computing the number needed to treat. The absolute risk reduction is computed when the experimental treatment reduces the risk of an undesirable outcome/event. In the medical literature, when the absolute risk reduction is reported with its confidence interval, the method used is the asymptotic one, even though it is well known that it may be inadequate. The aim of this paper is to introduce and assess nine methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions. Computer implementations of the methods use the PHP language. The comparison of the methods uses the experimental errors, the standard deviations, and the deviation relative to the imposed significance level for specified sample sizes. Six methods of computing confidence intervals for the absolute risk reduction and absolute risk reduction-like functions were assessed using random binomial variables and random sample sizes. The experiments show that the ADAC and ADAC1 methods obtain the best overall performance in computing confidence intervals for the absolute risk reduction.
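
    For reference, the asymptotic (Wald) interval that the abstract describes as the commonly reported, and potentially inadequate, choice can be sketched in a few lines; the trial counts below are hypothetical, and the paper's nine alternative methods (including ADAC and ADAC1) are not reproduced.

    ```python
    from math import sqrt

    def arr_wald_ci(events_ctrl, n_ctrl, events_exp, n_exp, z=1.96):
        """Asymptotic (Wald) 95% CI for the absolute risk reduction ARR = CER - EER."""
        cer = events_ctrl / n_ctrl          # control event rate
        eer = events_exp / n_exp            # experimental event rate
        arr = cer - eer
        se = sqrt(cer * (1 - cer) / n_ctrl + eer * (1 - eer) / n_exp)
        return arr, (arr - z * se, arr + z * se)

    # Hypothetical trial: 30/100 events in the control arm, 18/100 under treatment
    arr, (lo, hi) = arr_wald_ci(30, 100, 18, 100)
    print(f"ARR = {arr:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
    print(f"NNT = {1 / arr:.1f}")            # number needed to treat = 1 / ARR
    ```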

  4. [Estimation of the atrioventricular time interval by pulse Doppler in the normal fetal heart].

    Science.gov (United States)

    Hamela-Olkowska, Anita; Dangel, Joanna

    2009-08-01

    To assess normative values of the fetal atrioventricular (AV) time interval by pulsed-wave Doppler methods on the 5-chamber view. Fetal echocardiography exams were performed using an Acuson Sequoia 512 in 140 singleton fetuses at 18 to 40 weeks of gestation with sinus rhythm and normal cardiac and extracardiac anatomy. Pulsed Doppler derived AV intervals were measured from the left ventricular inflow/outflow view using a transabdominal convex 3.5-6 MHz probe. The values of the AV time interval ranged from 100 to 150 ms (mean 123 +/- 11.2). The AV interval was negatively correlated with the heart rate and correlated with the age of gestation (p=0.007). However, within the same subgroup of fetal heart rate there was no relation between AV intervals and gestational age. Therefore, the AV intervals showed only the heart rate dependence. The 95th percentiles of AV intervals according to FHR ranged from 135 to 148 ms. 1. The AV interval duration was negatively correlated with the heart rate. 2. Measurement of the AV time interval is easy to perform and has good reproducibility. It may be used for fetal heart block screening in anti-Ro and anti-La positive pregnancies. 3. The normative values established in the study may help obstetricians in assessing fetal abnormalities of AV conduction.

  5. Optimizing Time Intervals of Meteorological Data Used with Atmospheric Dose Modeling at SRS

    International Nuclear Information System (INIS)

    Simpkins, A.A.

    1999-01-01

    Measured tritium oxide concentrations in air have been compared with values calculated using routine-release Gaussian plume models for different time intervals of meteorological data. These comparisons determined an optimum time interval of meteorological data to be used with atmospheric dose models at the Savannah River Site (SRS). Meteorological data of varying time intervals (1-yr to 10-yr) were used for the comparison. Insignificant differences are seen in using a one-year database as opposed to a five-year database. Use of a ten-year database gives slightly more conservative results. For meteorological databases of one to five years in length the mean ratio of predicted to measured tritium oxide concentrations is approximately 1.25, whereas for the ten-year meteorological database the ratio is closer to 1.35. Currently at the Savannah River Site a meteorological database of five years' duration is used for all dose models. This study suggests no substantially improved accuracy using meteorological files of shorter or longer time intervals

  6. Cardiac Time Intervals Measured by Tissue Doppler Imaging M-mode

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Møgelvang, Rasmus; Schnohr, Peter

    2016-01-01

    function was evaluated in 1915 participants by using both conventional echocardiography and tissue Doppler imaging (TDI). The cardiac time intervals, including the isovolumic relaxation time (IVRT), isovolumic contraction time (IVCT), and ejection time (ET), were obtained by TDI M-mode through the mitral......). Additionally, they displayed a significant dose-response relationship between increasing severity of elevated blood pressure and increasing left ventricular mass index (P

  7. The synaptic properties of cells define the hallmarks of interval timing in a recurrent neural network.

    Science.gov (United States)

    Pérez, Oswaldo; Merchant, Hugo

    2018-04-03

    Extensive research has described two key features of interval timing. The bias property is associated with accuracy and implies that time is overestimated for short intervals and underestimated for long intervals. The scalar property is linked to precision and states that the variability of interval estimates increases as a function of interval duration. The neural mechanisms behind these properties are not well understood. Here we implemented a recurrent neural network that mimics a cortical ensemble and includes cells that show paired-pulse facilitation and slow inhibitory synaptic currents. The network produces interval selective responses and reproduces both bias and scalar properties when a Bayesian decoder reads its activity. Notably, the interval-selectivity, timing accuracy, and precision of the network showed complex changes as a function of the decay time constants of the modeled synaptic properties and the level of background activity of the cells. These findings suggest that physiological values of the time constants for paired-pulse facilitation and GABAb, as well as the internal state of the network, determine the bias and scalar properties of interval timing. Significance Statement Timing is a fundamental element of complex behavior, including music and language. Temporal processing in a wide variety of contexts shows two primary features: time estimates exhibit a shift towards the mean (the bias property) and are more variable for longer intervals (the scalar property). We implemented a recurrent neural network that includes long-lasting synaptic currents, which can not only produce interval selective responses but also follow the bias and scalar properties. Interestingly, only physiological values of the time constants for paired-pulse facilitation and GABAb, as well as intermediate background activity within the network, can reproduce the two key features of interval timing. Copyright © 2018 the authors.

  8. Mean Square Exponential Stability of Stochastic Switched System with Interval Time-Varying Delays

    Directory of Open Access Journals (Sweden)

    Manlika Rajchakit

    2012-01-01

    Full Text Available This paper is concerned with the mean square exponential stability of switched stochastic systems with interval time-varying delays. The time delay is any continuous function belonging to a given interval, but is not necessarily differentiable. By constructing a suitable augmented Lyapunov-Krasovskii functional combined with the Leibniz-Newton formula, a switching rule for the mean square exponential stability of switched stochastic systems with interval time-varying delays and new delay-dependent sufficient conditions for the mean square exponential stability of the switched stochastic system are first established in terms of LMIs. A numerical example is given to show the effectiveness of the obtained result.

  9. Effects of brief time delays on matching-to-sample abilities in capuchin monkeys (Sapajus spp.).

    Science.gov (United States)

    Truppa, Valentina; De Simone, Diego Antonio; Piano Mortari, Eva; De Lillo, Carlo

    2014-09-01

    Traditionally, studies of delayed matching-to-sample (DMTS) tasks in nonhuman species have focused on the assessment of the limits of the retrieval of information stored in short- and long-term memory systems. However, it is still unclear if visual recognition in these tasks is affected by very brief delay intervals, which are typically used to study rapidly decaying types of visual memory. This study aimed at evaluating if tufted capuchin monkeys' ability to recognise visual stimuli in a DMTS task is affected by (i) the disappearance of the sample stimulus and (ii) the introduction of delay intervals (0.5, 1.0, 2.0 and 3.0s) between the disappearance of the sample and the presentation of the comparison stimuli. The results demonstrated that the simple disappearance of the sample and the introduction of a delay of 0.5s did not affect capuchins' performance either in terms of accuracy or response time. A delay interval of 1.0s produced a significant increase in response time but still did not affect recognition accuracy. By contrast, delays of 2.0 and 3.0s determined a significant increase in response time and a reduction in recognition accuracy. These findings indicate the existence in capuchin monkeys of processes enabling a very accurate retention of stimulus features within time frames comparable to those reported for humans' sensory memory (0.5-1.0s). The extent to which such processes can be considered analogous to the sensory memory processes observed in human visual cognition is discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Fault detection for discrete-time LPV systems using interval observers

    Science.gov (United States)

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-10-01

    This paper is concerned with the fault detection (FD) problem for discrete-time linear parameter-varying systems subject to bounded disturbances. A parameter-dependent FD interval observer is designed based on parameter-dependent Lyapunov and slack matrices. The design method is presented by translating the parameter-dependent linear matrix inequalities (LMIs) into finite ones. In contrast to the existing results based on parameter-independent and diagonal Lyapunov matrices, the derived disturbance attenuation, fault sensitivity and nonnegative conditions lead to less conservative LMI characterisations. Furthermore, without the need to design the residual evaluation functions and thresholds, the residual intervals generated by the interval observers are used directly for FD decision. Finally, simulation results are presented for showing the effectiveness and superiority of the proposed method.

  11. New precession expressions, valid for long time intervals

    Science.gov (United States)

    Vondrák, J.; Capitaine, N.; Wallace, P.

    2011-10-01

    Context. The present IAU model of precession, like its predecessors, is given as a set of polynomial approximations of various precession parameters intended for high-accuracy applications over a limited time span. Earlier comparisons with numerical integrations have shown that this model is valid only for a few centuries around the basic epoch, J2000.0, while for more distant epochs it rapidly diverges from the numerical solution. In our preceding studies we also obtained preliminary developments for the precessional contribution to the motion of the equator: coordinates X,Y of the precessing pole and precession parameters ψA,ωA, suitable for use over long time intervals. Aims: The goal of the present paper is to obtain upgraded developments for various sets of precession angles that would fit modern observations near J2000.0 and at the same time fit numerical integration of the motions of solar system bodies on scales of several thousand centuries. Methods: We used the IAU 2006 solutions to represent the precession of the ecliptic and of the equator close to J2000.0 and, for more distant epochs, a numerical integration using the Mercury 6 package and solutions by Laskar et al. (1993, A&A, 270, 522) with upgraded initial conditions and constants to represent the ecliptic, and general precession and obliquity, respectively. From them, different precession parameters were calculated in the interval ± 200 millennia from J2000.0, and analytical expressions are found that provide a good fit for the whole interval. Results: Series for the various precessional parameters, comprising a cubic polynomial plus from 8 to 14 periodic terms, are derived that allow precession to be computed with an accuracy comparable to IAU 2006 around the central epoch J2000.0, a few arcseconds throughout the historical period, and a few tenths of a degree at the ends of the ± 200 millennia time span. Computer algorithms are provided that compute the ecliptic and mean equator poles and the
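
    As an illustration of the form of these developments, the sketch below evaluates a cubic polynomial plus a handful of periodic terms as a function of time from J2000.0. The coefficients and periods are placeholders chosen only to show the structure; they are not the published values, which should be taken from the paper and its accompanying algorithms.

```python
import numpy as np

# Evaluate a development of the form used for long-term precession series:
# p(T) = c0 + c1*T + c2*T^2 + c3*T^3 + sum_j [a_j*cos(2*pi*T/P_j) + b_j*sin(2*pi*T/P_j)]
# with T in Julian centuries from J2000.0.  All coefficients are placeholders,
# not the published values.
poly = np.array([5000.0, 50.0, -0.01, -1e-5])            # c0..c3 (arcsec, arcsec/cty, ...)
periodic = [                                              # (P_j [cty], a_j, b_j), hypothetical
    (256.75, -819.9, 81834.0),
    (708.15, -8444.0, 624.0),
    (274.20, 2600.0, -2800.0),
]

def series(T):
    T = np.asarray(T, dtype=float)
    value = sum(c * T**i for i, c in enumerate(poly))     # cubic polynomial part
    for P_j, a_j, b_j in periodic:                        # long-period periodic terms
        w = 2.0 * np.pi * T / P_j
        value = value + a_j * np.cos(w) + b_j * np.sin(w)
    return value

# Example: evaluate every 10 millennia over +/- 200 millennia around J2000.0.
T_grid = np.arange(-2000.0, 2000.0 + 1, 100.0)            # centuries
print(np.round(series(T_grid[:5]), 1))
```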

  12. Cardiac Time Intervals by Tissue Doppler Imaging M-Mode

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Mogelvang, Rasmus; de Knegt, Martina Chantal

    2016-01-01

    PURPOSE: To define normal values of the cardiac time intervals obtained by tissue Doppler imaging (TDI) M-mode through the mitral valve (MV). Furthermore, to evaluate the association of the myocardial performance index (MPI) obtained by TDI M-mode (MPITDI) and the conventional method of obtaining...

  13. A Note on Confidence Interval for the Power of the One Sample Test

    OpenAIRE

    A. Wong

    2010-01-01

    In introductory statistics texts, the power of the test of a one-sample mean when the variance is known is widely discussed. However, when the variance is unknown, the power of the Student's t-test is seldom mentioned. In this note, a general methodology for obtaining inference concerning a scalar parameter of interest of any exponential family model is proposed. The method is then applied to the one-sample mean problem with unknown variance to obtain a (1 − α)100% confidence interval for...

  14. Continuous-time interval model identification of blood glucose dynamics for type 1 diabetes

    Science.gov (United States)

    Kirchsteiger, Harald; Johansson, Rolf; Renard, Eric; del Re, Luigi

    2014-07-01

    While good physiological models of the glucose metabolism in type 1 diabetic patients are well known, their parameterisation is difficult. The high intra-patient variability observed is a further major obstacle. This holds for data-based models too, so that no good patient-specific models are available. Against this background, this paper proposes the use of interval models to cover the different metabolic conditions. The control-oriented models contain a carbohydrate and insulin sensitivity factor to be used for insulin bolus calculators directly. Available clinical measurements were sampled on an irregular schedule which prompts the use of continuous-time identification, also for the direct estimation of the clinically interpretable factors mentioned above. An identification method is derived and applied to real data from 28 diabetic patients. Model estimation was done on a clinical data-set, whereas validation results shown were done on an out-of-clinic, everyday life data-set. The results show that the interval model approach allows a much more regular estimation of the parameters and avoids physiologically incompatible parameter estimates.

  15. Discriminator/time interval meter system evaluation report

    Energy Technology Data Exchange (ETDEWEB)

    Condreva, K. J.

    1976-04-12

    The purpose of this report is to discuss the evaluation of a modular prototype Discriminator/Time Interval Meter data acquisition unit as a useful tool in a digital diagnostics system. The characteristics, operation and calibration of each of the hardware components are discussed in some detail. A discussion of the system calibration, operation, and data ingestion and reduction is also given. System test results to date are given and discussed. Finally, recommendations and conclusions concerning the capabilities of the Discriminator/T.I.M. system based on test and calibration results to date are given.

  16. Discriminator/time interval meter system evaluation report

    International Nuclear Information System (INIS)

    Condreva, K.J.

    1976-01-01

    The purpose of this report is to discuss the evaluation of a modular prototype Discriminator/Time Interval Meter data acquisition unit as a useful tool in a digital diagnostics system. The characteristics, operation and calibration of each of the hardware components are discussed in some detail. A discussion of the system calibration, operation, and data ingestion and reduction is also given. System test results to date are given and discussed. Finally, recommendations and conclusions concerning the capabilities of the Discriminator/T.I.M. system based on test and calibration results to date are given

  17. The 22nd Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting

    International Nuclear Information System (INIS)

    Sydnor, R.L.

    1990-05-01

    Papers presented at the 22nd Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting are compiled. The following subject areas are covered: Rb, Cs, and H-based frequency standards and cryogenic and trapped-ion technology; satellite laser tracking networks, GLONASS timing, intercomparison of national time scales and international telecommunications; telecommunications, power distribution, platform positioning, and geophysical survey industries; military communications and navigation systems; and dissemination of precise time and frequency by means of GPS, GLONASS, MIL STAR, LORAN, and synchronous communication satellites

  18. Probing interval timing with scalp-recorded electroencephalography (EEG).

    Science.gov (United States)

    Ng, Kwun Kei; Penney, Trevor B

    2014-01-01

    Humans, and other animals, are able to easily learn the durations of events and the temporal relationships among them in spite of the absence of a dedicated sensory organ for time. This chapter summarizes the investigation of timing and time perception using scalp-recorded electroencephalography (EEG), a non-invasive technique that measures brain electrical potentials on a millisecond time scale. Over the past several decades, much has been learned about interval timing through the examination of the characteristic features of averaged EEG signals (i.e., event-related potentials, ERPs) elicited in timing paradigms. For example, the mismatch negativity (MMN) and omission potential (OP) have been used to study implicit and explicit timing, respectively, the P300 has been used to investigate temporal memory updating, and the contingent negative variation (CNV) has been used as an index of temporal decision making. In sum, EEG measures provide biomarkers of temporal processing that allow researchers to probe the cognitive and neural substrates underlying time perception.

  19. Frequency interval balanced truncation of discrete-time bilinear systems

    DEFF Research Database (Denmark)

    Jazlan, Ahmad; Sreeram, Victor; Shaker, Hamid Reza

    2016-01-01

    This paper presents the development of a new model reduction method for discrete-time bilinear systems based on the balanced truncation framework. In many model reduction applications, it is advantageous to analyze the characteristics of the system with emphasis on particular frequency intervals...... are the solution to a pair of new generalized Lyapunov equations. The conditions for solvability of these new generalized Lyapunov equations are derived and a numerical solution method for solving these generalized Lyapunov equations is presented. Numerical examples which illustrate the usage of the new...... generalized frequency interval controllability and observability gramians as part of the balanced truncation framework are provided to demonstrate the performance of the proposed method....
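
    For readers unfamiliar with the gramian-based framework, the sketch below performs standard balanced truncation of an ordinary discrete-time linear system using the usual full-frequency Lyapunov equations; the paper's contribution, generalized gramians for bilinear systems restricted to frequency intervals, is not reproduced, and the system matrices are made up for illustration.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov, cholesky, svd

# Standard (full-frequency, linear) balanced truncation of x(k+1) = A x + B u, y = C x.
# The paper generalizes the gramians below to bilinear systems and frequency intervals.
A = np.array([[0.8, 0.1, 0.0],
              [0.0, 0.5, 0.2],
              [0.0, 0.0, 0.3]])
B = np.array([[1.0], [0.5], [0.2]])
C = np.array([[1.0, 0.0, 1.0]])

# Controllability and observability gramians: A P A' - P + B B' = 0, A' Q A - Q + C' C = 0.
P = solve_discrete_lyapunov(A, B @ B.T)
Q = solve_discrete_lyapunov(A.T, C.T @ C)

# Balance via the square-root method and truncate to order r.
Lc = cholesky(P, lower=True)
Lo = cholesky(Q, lower=True)
U, s, Vt = svd(Lo.T @ Lc)                       # s holds the Hankel singular values
r = 2
T = Lc @ Vt[:r].T @ np.diag(s[:r] ** -0.5)      # right projection
S = np.diag(s[:r] ** -0.5) @ U[:, :r].T @ Lo.T  # left projection (S @ T = I)
Ar, Br, Cr = S @ A @ T, S @ B, C @ T
print("Hankel singular values:", np.round(s, 4))
print("Reduced A:\n", np.round(Ar, 4))
```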

  20. Two sample Bayesian prediction intervals for order statistics based on the inverse exponential-type distributions using right censored sample

    Directory of Open Access Journals (Sweden)

    M.M. Mohie El-Din

    2011-10-01

    Full Text Available In this paper, two sample Bayesian prediction intervals for order statistics (OS) are obtained. This prediction is based on a certain class of the inverse exponential-type distributions using a right censored sample. A general class of prior density functions is used and the predictive cumulative function is obtained in the two samples case. The class of the inverse exponential-type distributions includes several important distributions such as the inverse Weibull distribution, the inverse Burr distribution, the loglogistic distribution, the inverse Pareto distribution and the inverse paralogistic distribution. Special cases of the inverse Weibull model such as the inverse exponential model and the inverse Rayleigh model are considered.

  1. Wuchereria bancrofti in Tanzania: microfilarial periodicity and effect of blood sampling time on microfilarial intensities

    DEFF Research Database (Denmark)

    Simonsen, Poul Erik; Niemann, L.; Meyrowitsch, Dan Wolf

    1997-01-01

    The circadian periodicity of Wuchereria bancrofti microfilarial (mf) intensities in peripheral blood was analysed in a group of infected individuals from an endemic community in north-eastern Tanzania. The mf density was quantified at two-hourly intervals for 24 hours. A clear nocturnal periodic...... of blood sampling before peak time is discussed, and the importance of taking sampling time into consideration when analysing data from epidemiological studies is emphasized. A simple method is devised which can be used to adjust for the influence of time on mf intensities, in studies where accurate...... information on mf intensities is necessary, and where it is impossible to obtain all samples at peak time....

  2. Count-to-count time interval distribution analysis in a fast reactor

    International Nuclear Information System (INIS)

    Perez-Navarro Gomez, A.

    1973-01-01

    The most important kinetic parameters have been measured at the zero power fast reactor CORAL-I by means of the reactor noise analysis in the time domain, using measurements of the count-to-count time intervals. (Author) 69 refs

  3. More consistent, yet less sensitive: Interval timing in autism spectrum disorders

    NARCIS (Netherlands)

    Falter, Christine M.; Noreika, Valdas; Wearden, John H.; Bailey, Anthony J.

    2012-01-01

    Even though phenomenological observations and anecdotal reports suggest atypical time processing in individuals with an autism spectrum disorder (ASD), very few psychophysical studies have investigated interval timing, and the obtained results are contradictory. The present study aimed to clarify

  4. Semiparametric regression analysis of failure time data with dependent interval censoring.

    Science.gov (United States)

    Chen, Chyong-Mei; Shen, Pao-Sheng

    2017-09-20

    Interval-censored failure-time data arise when subjects are examined or observed periodically such that the failure time of interest is not examined exactly but only known to be bracketed between two adjacent observation times. The commonly used approaches assume that the examination times and the failure time are independent or conditionally independent given covariates. In many practical applications, patients who are already in poor health or have a weak immune system before treatment usually tend to visit physicians more often after treatment than those with better health or immune systems. In this situation, the visiting rate is positively correlated with the risk of failure due to the health status, which results in dependent interval-censored data. While some measurable factors affecting health status, such as age, gender, and physical symptoms, can be included in the covariates, some health-related latent variables cannot be observed or measured. To deal with dependent interval censoring involving an unobserved latent variable, we characterize the visiting/examination process as a recurrent event process and propose a joint frailty model to account for the association between the failure time and the visiting process. A shared gamma frailty is incorporated into the Cox model and the proportional intensity model for the failure time and visiting process, respectively, in a multiplicative way. We propose a semiparametric maximum likelihood approach for estimating model parameters and show the asymptotic properties, including consistency and weak convergence. Extensive simulation studies are conducted and a data set of bladder cancer is analyzed for illustrative purposes. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Robust stability analysis of uncertain stochastic neural networks with interval time-varying delay

    International Nuclear Information System (INIS)

    Feng Wei; Yang, Simon X.; Fu Wei; Wu Haixia

    2009-01-01

    This paper addresses the stability analysis problem for uncertain stochastic neural networks with interval time-varying delays. The parameter uncertainties are assumed to be norm bounded, and the delay factor is assumed to be time-varying and belong to a given interval, which means that the lower and upper bounds of interval time-varying delays are available. A sufficient condition is derived such that for all admissible uncertainties, the considered neural network is robustly, globally, asymptotically stable in the mean square. Some stability criteria are formulated by means of the feasibility of a linear matrix inequality (LMI), which can be effectively solved by some standard numerical packages. Finally, numerical examples are provided to demonstrate the usefulness of the proposed criteria.

  6. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of the system and lack of expertise, epistemic uncertainties may be present in the experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the idea of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and has good compatibility. It avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that follows fusion. Original expert judgments are retained objectively throughout the procedure. Constructing the cumulative probability function and performing the random sampling require no human intervention or judgment and can easily be implemented in computer programs, which gives the method an apparent advantage in evaluating fairly large index systems.
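
    The final sampling step can be illustrated with a much-reduced sketch: suppose the fused importance of each index is only known as an interval; weights can then be obtained by repeatedly sampling a score from each interval, normalising every draw, and averaging. The intervals below are hypothetical and the evidence-fusion stage of the paper is not reproduced.

```python
import numpy as np

# Toy version of the final step: each index's fused importance is only known as an
# interval [lo, hi]; sample scores uniformly from those intervals, normalise each
# draw, and average to obtain point weights plus a spread (intervals are hypothetical).
rng = np.random.default_rng(42)
intervals = np.array([[0.6, 0.9],    # index 1
                      [0.3, 0.7],    # index 2
                      [0.1, 0.4],    # index 3
                      [0.5, 0.6]])   # index 4

n_draws = 100_000
scores = rng.uniform(intervals[:, 0], intervals[:, 1], size=(n_draws, len(intervals)))
weights = scores / scores.sum(axis=1, keepdims=True)       # normalise every draw

print("mean weights:", np.round(weights.mean(axis=0), 3))
print("95% bands  :", np.round(np.percentile(weights, [2.5, 97.5], axis=0), 3))
```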

  7. Recall intervals and time used for examination and prevention by dentists in child dental care in Denmark, Iceland, Norway and Sweden in 1996 and 2014

    DEFF Research Database (Denmark)

    Wang, N J; Petersen, P E; Sveinsdóttir, E G

    2018-01-01

    OBJECTIVE: The purpose of the present study was to explore intervals between regular dental examination and the time dentists spent for examination and preventive dental care of children in 1996 and 2014. PARTICIPANTS AND METHODS: In Denmark, Norway and Sweden, random samples of dentists working...... examinations in three of the four countries in 2014 than in 1996. CONCLUSIONS: This study of trends in dental care delivered by dentists during recent decades showed moves towards extended recall intervals and preventive care individualized according to caries risk. In addition, extending intervals could...... dentists used ample time delivering preventive care to children. Dentists reported spending significantly more time providing preventive care for caries risk children than for other children both in 1996 and 2014. Concurrent with extended intervals, dentists reported spending longer performing routine...

  8. Internal representations of temporal statistics and feedback calibrate motor-sensory interval timing.

    Directory of Open Access Journals (Sweden)

    Luigi Acerbi

    Full Text Available Humans have been shown to adapt to the temporal statistics of timing tasks so as to optimize the accuracy of their responses, in agreement with the predictions of Bayesian integration. This suggests that they build an internal representation of both the experimentally imposed distribution of time intervals (the prior) and of the error (the loss function). The responses of a Bayesian ideal observer depend crucially on these internal representations, which have only been previously studied for simple distributions. To study the nature of these representations we asked subjects to reproduce time intervals drawn from underlying temporal distributions of varying complexity, from uniform to highly skewed or bimodal, while also varying the error mapping that determined the performance feedback. Interval reproduction times were affected by both the distribution and feedback, in good agreement with a performance-optimizing Bayesian observer and actor model. Bayesian model comparison highlighted that subjects were integrating the provided feedback and represented the experimental distribution with a smoothed approximation. A nonparametric reconstruction of the subjective priors from the data shows that they are generally in agreement with the true distributions up to third-order moments, but with systematically heavier tails. In particular, higher-order statistical features (kurtosis, multimodality) seem much harder to acquire. Our findings suggest that humans have only minor constraints on learning lower-order statistical properties of unimodal (including peaked and skewed) distributions of time intervals under the guidance of corrective feedback, and that their behavior is well explained by Bayesian decision theory.
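
    A toy version of the Bayesian observer can make the central-tendency prediction concrete. The sketch below assumes a Gaussian prior over intervals and Gaussian scalar-noise measurements so that the posterior mean is available in closed form; the parameter values are made up, and the study itself fits far richer (skewed, bimodal) priors and explicit loss functions.

```python
import numpy as np

# Toy Bayesian observer for interval reproduction: the true interval t is drawn from a
# prior, the internal measurement m is t corrupted by scalar (Weber-like) noise, and the
# response is the posterior mean.  Gaussian prior/likelihood keep everything closed-form.
rng = np.random.default_rng(1)
mu_p, sd_p = 0.75, 0.15          # prior over intervals (seconds), hypothetical
w = 0.10                         # Weber fraction of the measurement noise

t = rng.normal(mu_p, sd_p, size=5000)          # true intervals
sd_m = w * t                                   # scalar noise: sd grows with the interval
m = rng.normal(t, sd_m)                        # noisy internal measurements

# Conjugate-Gaussian posterior mean (treating sd_m as known on each trial):
post_mean = (m / sd_m**2 + mu_p / sd_p**2) / (1.0 / sd_m**2 + 1.0 / sd_p**2)

# Classic signature: responses are biased towards the prior mean ("central tendency").
short, long_ = t < mu_p, t > mu_p
print("mean bias for short intervals: %+.3f s" % (post_mean[short] - t[short]).mean())
print("mean bias for long  intervals: %+.3f s" % (post_mean[long_] - t[long_]).mean())
```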

  9. Concurrent variable-interval variable-ratio schedules in a dynamic choice environment.

    Science.gov (United States)

    Bell, Matthew C; Baum, William M

    2017-11-01

    Most studies of operant choice have focused on presenting subjects with a fixed pair of schedules across many experimental sessions. Using these methods, studies of concurrent variable-interval variable-ratio schedules helped to evaluate theories of choice. More recently, a growing literature has focused on dynamic choice behavior. Those dynamic choice studies have analyzed behavior on a number of different time scales using concurrent variable-interval schedules. Following the dynamic choice approach, the present experiment examined performance on concurrent variable-interval variable-ratio schedules in a rapidly changing environment. Our objectives were to compare performance on concurrent variable-interval variable-ratio schedules with extant data on concurrent variable-interval variable-interval schedules using a dynamic choice procedure and to extend earlier work on concurrent variable-interval variable-ratio schedules. We analyzed performances at different time scales, finding strong similarities between concurrent variable-interval variable-interval and concurrent variable-interval variable-ratio performance within dynamic choice procedures. Time-based measures revealed almost identical performance in the two procedures compared with response-based measures, supporting the view that choice is best understood as time allocation. Performance at the smaller time scale of visits accorded with the tendency seen in earlier research toward developing a pattern of strong preference for and long visits to the richer alternative paired with brief "samples" at the leaner alternative ("fix and sample"). © 2017 Society for the Experimental Analysis of Behavior.

  10. Interval-value Based Particle Swarm Optimization algorithm for cancer-type specific gene selection and sample classification

    Directory of Open Access Journals (Sweden)

    D. Ramyachitra

    2015-09-01

    Full Text Available Microarray technology allows simultaneous measurement of the expression levels of thousands of genes within a biological tissue sample. The fundamental power of microarrays lies within the ability to conduct parallel surveys of gene expression using microarray data. The classification of tissue samples based on gene expression data is an important problem in medical diagnosis of diseases such as cancer. In gene expression data, the number of genes is usually very high compared to the number of data samples. Thus the difficulty is that the data are of high dimensionality while the sample size is small. This research work addresses the problem by classifying the resultant dataset using existing algorithms such as Support Vector Machine (SVM), K-nearest neighbor (KNN), Interval Valued Classification (IVC) and the improvised Interval Value based Particle Swarm Optimization (IVPSO) algorithm. The results show that the IVPSO algorithm outperformed the other algorithms under several performance evaluation functions.

  11. Interval-value Based Particle Swarm Optimization algorithm for cancer-type specific gene selection and sample classification.

    Science.gov (United States)

    Ramyachitra, D; Sofia, M; Manikandan, P

    2015-09-01

    Microarray technology allows simultaneous measurement of the expression levels of thousands of genes within a biological tissue sample. The fundamental power of microarrays lies within the ability to conduct parallel surveys of gene expression using microarray data. The classification of tissue samples based on gene expression data is an important problem in medical diagnosis of diseases such as cancer. In gene expression data, the number of genes is usually very high compared to the number of data samples. Thus the difficulty is that the data are of high dimensionality while the sample size is small. This research work addresses the problem by classifying the resultant dataset using existing algorithms such as Support Vector Machine (SVM), K-nearest neighbor (KNN), Interval Valued Classification (IVC) and the improvised Interval Value based Particle Swarm Optimization (IVPSO) algorithm. The results show that the IVPSO algorithm outperformed the other algorithms under several performance evaluation functions.
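
    For readers unfamiliar with the optimiser underlying IVPSO, the sketch below is a generic global-best particle swarm minimising a toy objective; the interval-valued fitness handling and the gene-selection encoding of the paper are not reproduced, and the parameter values are conventional defaults rather than those used by the authors.

```python
import numpy as np

# Generic global-best particle swarm optimisation on a toy objective (sphere function).
rng = np.random.default_rng(0)

def objective(x):                       # stand-in fitness; lower is better
    return np.sum(x**2, axis=-1)

n_particles, n_dims, n_iter = 30, 5, 200
w, c1, c2 = 0.72, 1.49, 1.49            # inertia and acceleration coefficients

pos = rng.uniform(-5, 5, size=(n_particles, n_dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best value found:", objective(gbest))
```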

  12. Confidence intervals for population allele frequencies: the general case of sampling from a finite diploid population of any size.

    Science.gov (United States)

    Fung, Tak; Keenan, Kevin

    2014-01-01

    The estimation of population allele frequencies using sample data forms a central component of studies in population genetics. These estimates can be used to test hypotheses on the evolutionary processes governing changes in genetic variation among populations. However, existing studies frequently do not account for sampling uncertainty in these estimates, thus compromising their utility. Incorporation of this uncertainty has been hindered by the lack of a method for constructing confidence intervals containing the population allele frequencies, for the general case of sampling from a finite diploid population of any size. In this study, we address this important knowledge gap by presenting a rigorous mathematical method to construct such confidence intervals. For a range of scenarios, the method is used to demonstrate that for a particular allele, in order to obtain accurate estimates within 0.05 of the population allele frequency with high probability (≥ 95%), a sample size of > 30 is often required. This analysis is augmented by an application of the method to empirical sample allele frequency data for two populations of the checkerspot butterfly (Melitaea cinxia L.), occupying meadows in Finland. For each population, the method is used to derive ≥ 98.3% confidence intervals for the population frequencies of three alleles. These intervals are then used to construct two joint ≥ 95% confidence regions, one for the set of three frequencies for each population. These regions are then used to derive a ≥ 95% confidence interval for Jost's D, a measure of genetic differentiation between the two populations. Overall, the results demonstrate the practical utility of the method with respect to informing sampling design and accounting for sampling uncertainty in studies of population genetics, important for scientific hypothesis-testing and also for risk-based natural resource management.
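
    One generic way to see how such an interval can be built for a finite population is to invert the sampling distribution directly: if the 2n allele copies carried by the n sampled diploid individuals are treated as a simple random sample of the 2N copies in the population (a simplification), the sample count of an allele is hypergeometric, and the confidence set contains every population count that a two-sided test does not reject. The sketch below implements that textbook inversion with made-up numbers; it is not the construction derived in the paper.

```python
import numpy as np
from scipy.stats import hypergeom

def allele_count_confidence_set(k, n, N, alpha=0.05):
    """All population allele counts K (out of 2N copies) not rejected at level alpha,
    given k copies of the allele observed among the 2n copies sampled without replacement."""
    draws, total = 2 * n, 2 * N
    keep = []
    for K in range(total + 1):
        dist = hypergeom(total, K, draws)
        # two-sided acceptance region: k must not sit in either alpha/2 tail
        if dist.cdf(k) > alpha / 2 and dist.sf(k - 1) > alpha / 2:
            keep.append(K)
    return keep

# Hypothetical sample: 18 copies of the allele among 60 copies drawn from 200 diploids.
K_set = allele_count_confidence_set(k=18, n=30, N=200)
freqs = np.array(K_set) / 400.0
print("population frequency between %.3f and %.3f" % (freqs.min(), freqs.max()))
```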

  13. Confidence intervals for population allele frequencies: the general case of sampling from a finite diploid population of any size.

    Directory of Open Access Journals (Sweden)

    Tak Fung

    Full Text Available The estimation of population allele frequencies using sample data forms a central component of studies in population genetics. These estimates can be used to test hypotheses on the evolutionary processes governing changes in genetic variation among populations. However, existing studies frequently do not account for sampling uncertainty in these estimates, thus compromising their utility. Incorporation of this uncertainty has been hindered by the lack of a method for constructing confidence intervals containing the population allele frequencies, for the general case of sampling from a finite diploid population of any size. In this study, we address this important knowledge gap by presenting a rigorous mathematical method to construct such confidence intervals. For a range of scenarios, the method is used to demonstrate that for a particular allele, in order to obtain accurate estimates within 0.05 of the population allele frequency with high probability (≥ 95%), a sample size of > 30 is often required. This analysis is augmented by an application of the method to empirical sample allele frequency data for two populations of the checkerspot butterfly (Melitaea cinxia L.), occupying meadows in Finland. For each population, the method is used to derive ≥ 98.3% confidence intervals for the population frequencies of three alleles. These intervals are then used to construct two joint ≥ 95% confidence regions, one for the set of three frequencies for each population. These regions are then used to derive a ≥ 95% confidence interval for Jost's D, a measure of genetic differentiation between the two populations. Overall, the results demonstrate the practical utility of the method with respect to informing sampling design and accounting for sampling uncertainty in studies of population genetics, important for scientific hypothesis-testing and also for risk-based natural resource management.

  14. The importance of time interval to development of second tumor in metachronous bilateral wilms' tumor

    International Nuclear Information System (INIS)

    Paulino, Arnold C.; Thakkar, Bharat; Henderson, William G.

    1997-01-01

    Purpose: To determine whether the time interval to development of the second tumor is a prognostic factor for overall survival in children with metachronous bilateral Wilms' tumor and to give a recommendation regarding screening of the contralateral kidney in patients with Wilms' tumor. Materials and Methods: A literature search of English-language manuscripts from 1950-1996 was performed using MEDLINE and identified 108 children with metachronous bilateral Wilms' tumor. Children were classified according to time interval to development of a contralateral Wilms' tumor ( 78 mos (2), 78 - < 84 mos (1), 84 - < 90 mos (0), 90 - < 96 mos (1), ≥ 96 mos (0). Analysis of overall survival in patients with a time interval of < 18 months and ≥ 18 months showed a 10-year survival of 39.6% and 55.2%, respectively (p = 0.024, log-rank test). Conclusions: Children with metachronous bilateral Wilms' tumor who developed a contralateral tumor at a time interval of ≥ 18 months from the initial Wilms' tumor had a better overall survival than children with a time interval of < 18 months. Screening by abdominal ultrasound of the contralateral kidney for more than 5 years after initial diagnosis of Wilms' tumor may not be necessary since 102/106 (96.2%) of children had a time interval to second tumor of < 60 months

  15. Procedure prediction from symbolic Electronic Health Records via time intervals analytics.

    Science.gov (United States)

    Moskovitch, Robert; Polubriaginof, Fernanda; Weiss, Aviram; Ryan, Patrick; Tatonetti, Nicholas

    2017-11-01

    Prediction of medical events, such as clinical procedures, is essential for preventing disease, understanding disease mechanisms, and increasing patient quality of care. Although longitudinal clinical data from Electronic Health Records provide opportunities to develop predictive models, the use of these data faces significant challenges. Primarily, while the data are longitudinal and represent thousands of conceptual events having duration, they are also sparse, complicating the application of traditional analysis approaches. Furthermore, the framework presented here takes advantage of the events' durations and the gaps between them. International standards for electronic healthcare data represent data elements, such as procedures, conditions, and drug exposures, using eras, or time intervals. Such eras contain both an event and a duration and enable the application of time intervals mining - a relatively new subfield of data mining. In this study, we present Maitreya, a framework for time intervals analytics in longitudinal clinical data. Maitreya discovers frequent time intervals related patterns (TIRPs), which we use as prognostic markers for modelling clinical events. We introduce three novel TIRP metrics that are normalized versions of the horizontal support, which represents the number of TIRP instances per patient. We evaluate Maitreya on 28 frequent and clinically important procedures, using the three novel TIRP representation metrics in comparison to no temporal representation and previous TIRP metrics. We also evaluate the epsilon value that makes Allen's relations more flexible with several settings of 30, 60, 90 and 180 days in comparison to the default zero. For twenty-two of these procedures, the use of temporal patterns as predictors was superior to non-temporal features, and the use of the vertically normalized horizontal support metric to represent TIRPs as features was most effective. The use of the epsilon value with thirty days was slightly better than the zero
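
    The epsilon-flexible Allen relations at the heart of TIRP discovery can be illustrated with a small helper that classifies the relation between two eras, treating endpoints that differ by less than epsilon as coincident. The era values and epsilon settings below are hypothetical, and the full Maitreya pipeline (pattern enumeration, support counting, the normalised horizontal-support metrics) is not shown.

```python
from collections import namedtuple

# Classify the temporal relation between two eras (time intervals) using a subset of
# Allen's relations, with an epsilon tolerance that treats near-equal endpoints as equal.
Era = namedtuple("Era", "start end")           # e.g. days since the first record

def allen_relation(a, b, eps=0.0):
    """Relation of era a to era b (assumes a.start <= b.start)."""
    if a.end < b.start - eps:
        return "before"
    if abs(a.end - b.start) <= eps:
        return "meets"
    if abs(a.start - b.start) <= eps and abs(a.end - b.end) <= eps:
        return "equals"
    if abs(a.start - b.start) <= eps:
        return "starts"
    if a.end < b.end - eps:
        return "overlaps"
    if abs(a.end - b.end) <= eps:
        return "finished-by"
    return "contains"

drug_era = Era(start=0, end=95)                # hypothetical drug-exposure era
condition_era = Era(start=90, end=120)         # hypothetical condition era
for eps in (0, 30):
    print("epsilon=%3d ->" % eps, allen_relation(drug_era, condition_era, eps))
```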

  16. Opposite Distortions in Interval Timing Perception for Visual and Auditory Stimuli with Temporal Modulations.

    Science.gov (United States)

    Yuasa, Kenichi; Yotsumoto, Yuko

    2015-01-01

    When an object is presented visually and moves or flickers, the perception of its duration tends to be overestimated. Such an overestimation is called time dilation. Perceived time can also be distorted when a stimulus is presented aurally as an auditory flutter, but the mechanisms and their relationship to visual processing remains unclear. In the present study, we measured interval timing perception while modulating the temporal characteristics of visual and auditory stimuli, and investigated whether the interval times of visually and aurally presented objects shared a common mechanism. In these experiments, participants compared the durations of flickering or fluttering stimuli to standard stimuli, which were presented continuously. Perceived durations for auditory flutters were underestimated, while perceived durations of visual flickers were overestimated. When auditory flutters and visual flickers were presented simultaneously, these distortion effects were cancelled out. When auditory flutters were presented with a constantly presented visual stimulus, the interval timing perception of the visual stimulus was affected by the auditory flutters. These results indicate that interval timing perception is governed by independent mechanisms for visual and auditory processing, and that there are some interactions between the two processing systems.

  17. Estimation of time-delayed mutual information and bias for irregularly and sparsely sampled time-series

    International Nuclear Information System (INIS)

    Albers, D.J.; Hripcsak, George

    2012-01-01

    Highlights: ► Time-delayed mutual information for irregularly sampled time-series. ► Estimation bias for the time-delayed mutual information calculation. ► Fast, simple, PDF estimator independent, time-delayed mutual information bias estimate. ► Quantification of data-set-size limits of the time-delayed mutual calculation. - Abstract: A method to estimate the time-dependent correlation via an empirical bias estimate of the time-delayed mutual information for a time-series is proposed. In particular, the bias of the time-delayed mutual information is shown to often be equivalent to the mutual information between two distributions of points from the same system separated by infinite time. Thus intuitively, estimation of the bias is reduced to estimation of the mutual information between distributions of data points separated by large time intervals. The proposed bias estimation techniques are shown to work for Lorenz equations data and glucose time series data of three patients from the Columbia University Medical Center database.
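
    The key idea, approximating the bias by the mutual information between points separated by a very large time interval, can be sketched with a crude equal-width histogram estimator on a surrogate AR(1) series; the paper's treatment of irregular sampling and its estimator-independent bias analysis are not reproduced, and the bin count and series below are arbitrary choices.

```python
import numpy as np

# Crude histogram estimate of the time-delayed mutual information I(x_t ; x_{t+tau}),
# with the bias approximated by the "mutual information" at a very large separation,
# where the true time-delayed MI should be essentially zero.
def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

def tdmi(series, tau, bins=16):
    return mutual_information(series[:-tau], series[tau:], bins)

rng = np.random.default_rng(3)
n = 5000
x = np.zeros(n)
for t in range(1, n):                       # AR(1) surrogate for a physiological signal
    x[t] = 0.9 * x[t - 1] + rng.normal()

tau_far = n // 2                            # "infinite" separation for the bias estimate
bias = tdmi(x, tau_far)
for tau in (1, 5, 20, 100):
    print("tau=%4d  raw=%.3f  bias-corrected=%.3f" % (tau, tdmi(x, tau), tdmi(x, tau) - bias))
```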

  18. Investigation of Bicycle Travel Time Estimation Using Bluetooth Sensors for Low Sampling Rates

    Directory of Open Access Journals (Sweden)

    Zhenyu Mei

    2014-10-01

    Full Text Available Filtering the data for bicycle travel time using Bluetooth sensors is crucial to the estimation of link travel times on a corridor. The current paper describes an adaptive filtering algorithm for estimating bicycle travel times using Bluetooth data, with consideration of low sampling rates. The data for bicycle travel time using Bluetooth sensors has two characteristics. First, the bicycle flow contains stable and unstable conditions. Second, the collected data have low sampling rates (less than 1%. To avoid erroneous inference, filters are introduced to “purify” multiple time series. The valid data are identified within a dynamically varying validity window with the use of a robust data-filtering procedure. The size of the validity window varies based on the number of preceding sampling intervals without a Bluetooth record. Applications of the proposed algorithm to the dataset from Genshan East Road and Moganshan Road in Hangzhou demonstrate its ability to track typical variations in bicycle travel time efficiently, while suppressing high frequency noise signals.
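
    A stripped-down version of the validity-window idea is sketched below: a new travel-time record is accepted only if it lies within a band around the median of recent valid records, and the band widens with the number of preceding sampling intervals that produced no Bluetooth match. The thresholds, window length and example records are all hypothetical, not the values calibrated in the paper.

```python
import numpy as np

# Simplified validity-window filter: accept a record if it falls inside a band around
# the median of recent valid records; the band grows with the number of empty intervals.
def filter_travel_times(records, base_band=0.3, widen=0.1, window=5):
    """records: list of (interval_index, travel_time_seconds); returns accepted records."""
    accepted = []
    last_interval = None
    for idx, tt in records:
        if len(accepted) < 3:                       # warm-up: accept the first few records
            accepted.append((idx, tt))
            last_interval = idx
            continue
        gaps = 0 if last_interval is None else max(idx - last_interval - 1, 0)
        ref = np.median([t for _, t in accepted[-window:]])
        band = ref * (base_band + widen * gaps)     # validity window widens with the gaps
        if abs(tt - ref) <= band:
            accepted.append((idx, tt))
        last_interval = idx
    return accepted

records = [(0, 300), (1, 310), (2, 295), (3, 900),   # 900 s: likely a detour or outlier
           (6, 360), (7, 340)]                       # intervals 4-5 had no Bluetooth match
print(filter_travel_times(records))
```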

  19. Embodiment and the origin of interval timing: kinematic and electromyographic data.

    Science.gov (United States)

    Addyman, Caspar; Rocha, Sinead; Fautrelle, Lilian; French, Robert M; Thomas, Elizabeth; Mareschal, Denis

    2017-03-01

    Recent evidence suggests that interval timing (the judgment of durations lasting from approximately 500 ms to a few minutes) is closely coupled to the action control system. We used surface electromyography (EMG) and motion capture technology to explore the emergence of this coupling in 4-, 6-, and 8-month-olds. We engaged infants in an active and socially relevant arm-raising task with seven cycles and a response period. In one condition, cycles were slow (every 4 s); in another, they were fast (every 2 s). In the slow condition, we found evidence of time-locked sub-threshold EMG activity even in the absence of any observed overt motor responses at all three ages. This study shows that EMGs can be a more sensitive measure of interval timing in early development than overt behavior.

  20. Beat-to-beat systolic time-interval measurement from heart sounds and ECG

    International Nuclear Information System (INIS)

    Paiva, R P; Carvalho, P; Couceiro, R; Henriques, J; Antunes, M; Quintal, I; Muehlsteff, J

    2012-01-01

    Systolic time intervals are highly correlated to fundamental cardiac functions. Several studies have shown that these measurements have significant diagnostic and prognostic value in heart failure condition and are adequate for long-term patient follow-up and disease management. In this paper, we investigate the feasibility of using heart sound (HS) to accurately measure the opening and closing moments of the aortic heart valve. These moments are crucial to define the main systolic timings of the heart cycle, i.e. pre-ejection period (PEP) and left ventricular ejection time (LVET). We introduce an algorithm for automatic extraction of PEP and LVET using HS and electrocardiogram. PEP is estimated with a Bayesian approach using the signal's instantaneous amplitude and patient-specific time intervals between atrio-ventricular valve closure and aortic valve opening. As for LVET, since the aortic valve closure corresponds to the start of the S2 HS component, we base LVET estimation on the detection of the S2 onset. A comparative assessment of the main systolic time intervals is performed using synchronous signal acquisitions of the current gold standard in cardiac time-interval measurement, i.e. echocardiography, and HS. The algorithms were evaluated on a healthy population, as well as on a group of subjects with different cardiovascular diseases (CVD). In the healthy group, from a set of 942 heartbeats, the proposed algorithm achieved 7.66 ± 5.92 ms absolute PEP estimation error. For LVET, the absolute estimation error was 11.39 ± 8.98 ms. For the CVD population, 404 beats were used, leading to 11.86 ± 8.30 and 17.51 ± 17.21 ms absolute PEP and LVET errors, respectively. The results achieved in this study suggest that HS can be used to accurately estimate LVET and PEP. (paper)

  1. Time interval measurement between two emissions: Kr + Au

    International Nuclear Information System (INIS)

    Aboufirassi, M; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lefebvres, F.; Lopez, O.; Louvel, M.; Mahi, M.; Steckmeyer, J.C.; Tamain, B.

    1998-01-01

    To illustrate the method allowing the determination of the emission time intervals, the results obtained with the Kr + Au system at 43 and 60 A.MeV are presented. The experiments were performed with the NAUTILUS exclusive detectors. Central collisions were selected by means of a relative velocity criterion to reject events containing a forward-emitted fragment. For the two bombarding energies the data analysis shows the formation of a compound of mass around A = 200. By comparing the fragment dynamical variables with simulations one can draw conclusions about the simultaneity of the compound deexcitation processes. It was found that a value of 5 MeV/A is able to reproduce the characteristics of the detected fragments. Also, it was found that to reproduce the dynamical characteristics of the fragments issued from central collisions it was not necessary to superimpose a radial collective energy upon the Coulomb and thermal motion. The distribution of the relative angles between detected fragments is used here as a chronometer: for simultaneous ruptures the small relative angles are forbidden by the Coulomb repulsion, while for sequential processes this restriction is lifted more and more as the interval between the two emissions lengthens. For the system discussed here the comparison between simulation and data has been carried out for the extreme cases, i.e. for a vanishing and an infinite time interval between the two emissions, respectively. More sophisticated simulations describing the angular distributions between the emitted fragments were also developed

  2. Systolic time intervals vs invasive predictors of fluid responsiveness after coronary artery bypass surgery

    NARCIS (Netherlands)

    Smorenberg, A.; Lust, E.J.; Beishuizen, A.; Meijer, J.H.; Verdaasdonk, R.M.; Groeneveld, A.B.J.

    2013-01-01

    OBJECTIVES: Haemodynamic parameters for predicting fluid responsiveness in intensive care patients are invasive, technically challenging or not universally applicable. We compared the initial systolic time interval (ISTI), a non-invasive measure of the time interval between the electrical and

  3. Characterization of Cardiac Time Intervals in Healthy Bonnet Macaques (Macaca radiata) by Using an Electronic Stethoscope

    Science.gov (United States)

    Kamran, Haroon; Salciccioli, Louis; Pushilin, Sergei; Kumar, Paraag; Carter, John; Kuo, John; Novotney, Carol; Lazar, Jason M

    2011-01-01

    Nonhuman primates are used frequently in cardiovascular research. Cardiac time intervals derived by phonocardiography have long been used to assess left ventricular function. Electronic stethoscopes are simple low-cost systems that display heart sound signals. We assessed the use of an electronic stethoscope to measure cardiac time intervals in 48 healthy bonnet macaques (age, 8 ± 5 y) based on recorded heart sounds. Technically adequate recordings were obtained from all animals and required 1.5 ± 1.3 min. The following cardiac time intervals were determined by simultaneously recording acoustic and single-lead electrocardiographic data: electromechanical activation time (QS1), electromechanical systole (QS2), the time interval between the first and second heart sounds (S1S2), and the time interval between the second and first sounds (S2S1). QS2 was correlated with heart rate, mean arterial pressure, diastolic blood pressure, and left ventricular ejection time determined by using echocardiography. S1S2 correlated with heart rate, mean arterial pressure, diastolic blood pressure, left ventricular ejection time, and age. S2S1 correlated with heart rate, mean arterial pressure, diastolic blood pressure, systolic blood pressure, and left ventricular ejection time. QS1 did not correlate with any anthropometric or echocardiographic parameter. The relation S1S2/S2S1 correlated with systolic blood pressure. On multivariate analyses, heart rate was the only independent predictor of QS2, S1S2, and S2S1. In conclusion, determination of cardiac time intervals is feasible and reproducible by using an electrical stethoscope in nonhuman primates. Heart rate is a major determinant of QS2, S1S2, and S2S1 but not QS1; regression equations for reference values for cardiac time intervals in bonnet macaques are provided. PMID:21439218

  4. The time interval distribution of sand–dust storms in theory: testing with observational data for Yanchi, China

    International Nuclear Information System (INIS)

    Liu, Guoliang; Zhang, Feng; Hao, Lizhen

    2012-01-01

    We previously introduced a time record model for use in studying the duration of sand–dust storms. In the model, X is the normalized wind speed and Xr is the normalized wind speed threshold for the sand–dust storm. X is represented by a random signal with a normal Gaussian distribution. The storms occur when X ≥ Xr. From this model, the time interval distribution of N = Aexp(−bt) can be deduced, wherein N is the number of time intervals with length greater than t, A and b are constants, and b is related to Xr. In this study, sand–dust storm data recorded in spring at the Yanchi meteorological station in China were analysed to verify whether the time interval distribution of the sand–dust storms agrees with the above time interval distribution. We found that the distribution of the time interval between successive sand–dust storms in April agrees well with the above exponential equation. However, the interval distribution for the sand–dust storm data for the entire spring period displayed a better fit to the Weibull equation and depended on the variation of the sand–dust storm threshold wind speed. (paper)
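
    The claimed exponential form of the interval distribution is easy to reproduce in a toy simulation: generate a normalised wind-speed signal X, mark a storm whenever X ≥ Xr, and count the intervals longer than t. The sketch below uses white Gaussian noise for X (a simplification of the time record model) and fits N(t) = A exp(−bt) by linear regression on log N; the thresholds and sample length are arbitrary.

```python
import numpy as np

# Toy check of N(t) = A*exp(-b*t): storms occur whenever the normalised wind speed X
# exceeds the threshold Xr.  White Gaussian noise stands in for X; the fitted decay
# rate b increases with Xr, as the model suggests.
rng = np.random.default_rng(7)
X = rng.standard_normal(200_000)                 # normalised wind speed (toy signal)

for Xr in (1.5, 2.0):
    storm_times = np.flatnonzero(X >= Xr)
    gaps = np.diff(storm_times)                  # intervals between successive storms
    t_grid = np.arange(1, gaps.max())
    N = np.array([(gaps > t).sum() for t in t_grid])   # number of intervals longer than t
    valid = N > 0
    slope, intercept = np.polyfit(t_grid[valid], np.log(N[valid]), 1)
    print("Xr=%.1f  fitted N(t) ~ %.0f * exp(-%.4f t)" % (Xr, np.exp(intercept), -slope))
```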

  5. Design of time interval generator based on hybrid counting method

    International Nuclear Information System (INIS)

    Yao, Yuan; Wang, Zhaoqi; Lu, Houbing; Chen, Lian; Jin, Ge

    2016-01-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some “off-the-shelf” TIGs can be employed, the need for a custom test or control system makes a TIG implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on the architecture of the Tapped Delay Line (TDL), whose delay cells are down to a few tens of picoseconds. In this case, FPGA-based TIGs with a high delay step are preferable, allowing the implementation of customized particle physics instrumentation and other utilities on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods realizing an integratable TIG is described in detail. A specially designed multiplexer for tap selection is introduced in particular. The special structure of the multiplexer is devised to minimize the differing additional delays caused by the unpredictable routings from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution up to 11 ps and an interval range up to 8 s.

  6. Design of time interval generator based on hybrid counting method

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yuan [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Zhaoqi [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Lu, Houbing [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Hefei Electronic Engineering Institute, Hefei 230037 (China); Chen, Lian [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Jin, Ge, E-mail: goldjin@ustc.edu.cn [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China)

    2016-10-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some “off-the-shelf” TIGs can be employed, the need for a custom test or control system makes a TIG implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on the architecture of the Tapped Delay Line (TDL), whose delay cells are down to a few tens of picoseconds. In this case, FPGA-based TIGs with a high delay step are preferable, allowing the implementation of customized particle physics instrumentation and other utilities on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods realizing an integratable TIG is described in detail. A specially designed multiplexer for tap selection is introduced in particular. The special structure of the multiplexer is devised to minimize the differing additional delays caused by the unpredictable routings from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution up to 11 ps and an interval range up to 8 s.

  7. Magnetic Resonance Fingerprinting with short relaxation intervals.

    Science.gov (United States)

    Amthor, Thomas; Doneva, Mariya; Koken, Peter; Sommer, Karsten; Meineke, Jakob; Börnert, Peter

    2017-09-01

    The aim of this study was to investigate a technique for improving the performance of Magnetic Resonance Fingerprinting (MRF) in repetitive sampling schemes, in particular for 3D MRF acquisition, by shortening relaxation intervals between MRF pulse train repetitions. A calculation method for MRF dictionaries adapted to short relaxation intervals and non-relaxed initial spin states is presented, based on the concept of stationary fingerprints. The method is applicable to many different k-space sampling schemes in 2D and 3D. For accuracy analysis, T1 and T2 values of a phantom are determined by single-slice Cartesian MRF for different relaxation intervals and are compared with quantitative reference measurements. The relevance of slice profile effects is also investigated in this case. To further illustrate the capabilities of the method, an application to in-vivo spiral 3D MRF measurements is demonstrated. The proposed computation method enables accurate parameter estimation even for the shortest relaxation intervals, as investigated for different sampling patterns in 2D and 3D. In 2D Cartesian measurements, we achieved a scan acceleration of more than a factor of two, while maintaining acceptable accuracy: The largest T1 values of a sample set deviated from their reference values by 0.3% (longest relaxation interval) and 2.4% (shortest relaxation interval). The largest T2 values showed systematic deviations of up to 10% for all relaxation intervals, which is discussed. The influence of slice profile effects for multislice acquisition is shown to become increasingly relevant for short relaxation intervals. In 3D spiral measurements, a scan time reduction of 36% was achieved, maintaining the quality of in-vivo T1 and T2 maps. Reducing the relaxation interval between MRF sequence repetitions using stationary fingerprint dictionaries is a feasible method to improve the scan efficiency of MRF sequences. The method enables fast implementations of 3D spatially

  8. Dimensional Stability of Two Polyvinyl Siloxane Impression Materials in Different Time Intervals

    Directory of Open Access Journals (Sweden)

    Aalaei Sh

    2015-12-01

    Full Text Available Statement of the Problem: Dental prostheses are usually made indirectly; therefore dimensional stability of the impression material is very important. Every few years, new impression materials with different manufacturers' claims regarding their better properties are introduced to the dental market, which requires more research to evaluate their true dimensional changes. Objectives: The aim of this study was to evaluate the dimensional stability of two addition silicone impression materials (Panasil® and Affinis®) at different time intervals. Materials and Methods: In this experimental study, using the two addition silicones (Panasil® and Affinis®), we made sixty impressions of a standard die under similar conditions of 23 °C and 59% relative humidity with a special tray. The die included three horizontal and two vertical parallel lines. The vertical lines crossed the horizontal ones at points that served as references for measurement. All impressions were poured with high-strength dental stone. The dimensions were measured with a stereo-microscope by two examiners at three storage times (1, 24 and 168 hours). The data were statistically analyzed using the t-test and ANOVA. Results: All of the stone casts were larger than the standard die. Dimensional changes of Panasil and Affinis were 0.07%, 0.24%, 0.27% and 0.02%, 0.07%, 0.16% after 1, 24 and 168 hours, respectively. The dimensional change of the two impression materials was not significant across the time intervals, except for Panasil after one week (p = 0.004). Conclusions: Within the limitations of this study, Affinis impressions were dimensionally more stable than Panasil ones, but the difference was not significant. The dimensional change of the Panasil impressions showed a statistically significant difference after one week. Dimensional changes of both impression materials were within the ADA standard limit (< 0.5%) at all time intervals; therefore, dimensional stability of this impression was accepted at least

  9. Automatic, time-interval traffic counts for recreation area management planning

    Science.gov (United States)

    D. L. Erickson; C. J. Liu; H. K. Cordell

    1980-01-01

    Automatic, time-interval recorders were used to count directional vehicular traffic on a multiple entry/exit road network in the Red River Gorge Geological Area, Daniel Boone National Forest. Hourly counts of entering and exiting traffic differed according to recorder location, but an aggregated distribution showed a delayed peak in exiting traffic thought to be...

  10. A Note on Confidence Interval for the Power of the One Sample Test

    Directory of Open Access Journals (Sweden)

    A. Wong

    2010-01-01

    Full Text Available In introductory statistics texts, the power of the test of a one-sample mean when the variance is known is widely discussed. However, when the variance is unknown, the power of the Student's t-test is seldom mentioned. In this note, a general methodology for obtaining inference concerning a scalar parameter of interest of any exponential family model is proposed. The method is then applied to the one-sample mean problem with unknown variance to obtain a (1 − α)100% confidence interval for the power of the Student's t-test that detects the difference (μ − μ0). The calculations require only the density and the cumulative distribution functions of the standard normal distribution. In addition, the methodology presented can also be applied to determine the required sample size when the effect size and the power of a size α test of the mean are given.
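
    For comparison, the textbook power calculation for the two-sided one-sample t-test uses the noncentral t distribution; the sketch below computes it and, as a crude stand-in for an interval on the power, propagates a hypothetical confidence interval for the standardised effect size through the (monotone) power function. This is a generic illustration, not the exponential-family methodology proposed in the note.

```python
import numpy as np
from scipy.stats import t as t_dist, nct

def t_test_power(effect_size, n, alpha=0.05):
    """Power to detect (mu - mu0)/sigma = effect_size with a two-sided level-alpha t-test."""
    df = n - 1
    ncp = effect_size * np.sqrt(n)                    # noncentrality parameter
    t_crit = t_dist.ppf(1 - alpha / 2, df)
    return nct.sf(t_crit, df, ncp) + nct.cdf(-t_crit, df, ncp)

n = 25
print("power at effect size 0.5:", round(t_test_power(0.5, n), 3))

# Plug-in interval: if a 95% CI for the effect size were (0.3, 0.7), evaluating the
# monotone power function at the endpoints gives an induced interval for the power.
lo, hi = 0.3, 0.7                                     # hypothetical CI for the effect size
print("induced power interval: (%.3f, %.3f)" % (t_test_power(lo, n), t_test_power(hi, n)))
```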

  11. Haemostatic reference intervals in pregnancy

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Jørgensen, Maja; Klajnbard, Anna

    2010-01-01

    largely unchanged during pregnancy, delivery, and postpartum and were within non-pregnant reference intervals. However, levels of fibrinogen, D-dimer, and coagulation factors VII, VIII, and IX increased markedly. Protein S activity decreased substantially, while free protein S decreased slightly and total......Haemostatic reference intervals are generally based on samples from non-pregnant women. Thus, they may not be relevant to pregnant women, a problem that may hinder accurate diagnosis and treatment of haemostatic disorders during pregnancy. In this study, we establish gestational age......-20, 21-28, 29-34, 35-42, at active labor, and on postpartum days 1 and 2. Reference intervals for each gestational period using only the uncomplicated pregnancies were calculated in all 391 women for activated partial thromboplastin time (aPTT), fibrinogen, fibrin D-dimer, antithrombin, free protein S...
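
    The basic calculation behind a gestation-specific reference interval is the nonparametric 2.5th-97.5th percentile range within each gestational-age group. The sketch below runs it on synthetic, fibrinogen-like values; the group boundaries mirror those quoted above, but the numbers are simulated and the study's statistical handling (outlier exclusion, partitioning criteria) is not reproduced.

```python
import numpy as np

# Nonparametric 95% reference interval (2.5th-97.5th percentiles) per gestational-age
# group, computed on simulated analyte values (illustration only, not the study's data).
rng = np.random.default_rng(11)
groups = {                       # gestational weeks -> simulated fibrinogen-like values (g/L)
    "13-20": rng.normal(4.0, 0.6, 250),
    "21-28": rng.normal(4.5, 0.7, 250),
    "29-34": rng.normal(5.0, 0.8, 250),
    "35-42": rng.normal(5.3, 0.8, 250),
}
for label, values in groups.items():
    lo, hi = np.percentile(values, [2.5, 97.5])
    print("weeks %s: reference interval %.2f-%.2f g/L (n=%d)" % (label, lo, hi, len(values)))
```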

  12. Estimation of sojourn time in chronic disease screening without data on interval cases.

    Science.gov (United States)

    Chen, T H; Kuo, H S; Yen, M F; Lai, M S; Tabar, L; Duffy, S W

    2000-03-01

    Estimation of the sojourn time in the preclinical detectable period in disease screening, or of the transition rates for the natural history of chronic disease, usually relies on interval cases (diagnosed between screens). However, ascertaining such cases might be difficult in developing countries due to incomplete registration systems and difficulties in follow-up. To overcome this problem, we propose three Markov models to estimate parameters without using interval cases. A three-state Markov model, a five-state Markov model related to regional lymph node spread, and a five-state Markov model pertaining to tumor size are applied to data on breast cancer screening in female relatives of breast cancer cases in Taiwan. Results based on the three-state Markov model give a mean sojourn time (MST) of 1.90 years (95% CI: 1.18-4.86) for this high-risk group. Validation of these models on the basis of data on breast cancer screening in the age groups 50-59 and 60-69 years from the Swedish Two-County Trial shows that the estimates from a three-state Markov model that does not use interval cases are very close to those from previous Markov models taking interval cancers into account. For the five-state Markov model, a reparameterized procedure using auxiliary information on clinically detected cancers is performed to estimate relevant parameters. A good fit in internal and external validation demonstrates the feasibility of using these models to estimate parameters that have previously required interval cancers. This method can be applied to other screening data in which there are no data on interval cases.
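
    For intuition about what such models estimate (the rate below is invented, not the study's estimate), a minimal simulation of a progressive three-state model shows why the mean sojourn time equals the reciprocal of the preclinical-to-clinical transition intensity when sojourn times are exponential.

```python
# Sketch: in a progressive three-state Markov model (disease-free -> preclinical -> clinical)
# with a constant preclinical-to-clinical intensity lam2, the preclinical sojourn time is
# exponential with mean MST = 1/lam2; a quick simulation confirms the identity.
import numpy as np

rng = np.random.default_rng(0)
lam2 = 0.5                                   # hypothetical intensity (per year)
sojourn = rng.exponential(1.0 / lam2, 100_000)   # simulated preclinical durations
print("simulated MST:", sojourn.mean(), "theoretical 1/lam2:", 1.0 / lam2)
```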

  13. Statistical intervals a guide for practitioners

    CERN Document Server

    Hahn, Gerald J

    2011-01-01

    Presents a detailed exposition of statistical intervals and emphasizes applications in industry. The discussion differentiates at an elementary level among different kinds of statistical intervals and gives instruction with numerous examples and simple math on how to construct such intervals from sample data. This includes confidence intervals to contain a population percentile, confidence intervals on probability of meeting specified threshold value, and prediction intervals to include observation in a future sample. Also has an appendix containing computer subroutines for nonparametric stati
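
    As a small illustration of one interval type mentioned above (a textbook formula, not code from the book), the sketch below computes a two-sided 95% prediction interval for a single future observation from a normal sample.

```python
# Hedged sketch: normal-theory prediction interval for one future observation,
# x_bar +/- t_{1-alpha/2, n-1} * s * sqrt(1 + 1/n); the sample data are synthetic.
import numpy as np
from scipy import stats

def prediction_interval(x, alpha=0.05):
    x = np.asarray(x, dtype=float)
    n = x.size
    m, s = x.mean(), x.std(ddof=1)
    half = stats.t.ppf(1 - alpha / 2, n - 1) * s * np.sqrt(1 + 1 / n)
    return m - half, m + half

sample = np.random.default_rng(1).normal(10.0, 2.0, size=25)
print(prediction_interval(sample))
```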

  14. Dependency of magnetocardiographically determined fetal cardiac time intervals on gestational age, gender and postnatal biometrics in healthy pregnancies

    Directory of Open Access Journals (Sweden)

    Geue Daniel

    2004-04-01

    Full Text Available Abstract Background Magnetocardiography enables the precise determination of fetal cardiac time intervals (CTI) as early as the second trimester of pregnancy. It has been shown that fetal CTI change in the course of gestation. The aim of this work was to investigate the dependency of fetal CTI on gestational age, gender and postnatal biometric data in a substantial sample of subjects during normal pregnancy. Methods A total of 230 fetal magnetocardiograms were obtained in 47 healthy fetuses between the 15th and 42nd week of gestation. In each recording, after subtraction of the maternal cardiac artifact and the identification of fetal beats, fetal PQRST courses were signal averaged. On the basis of the wave onsets and ends detected therein, the following CTI were determined: P wave, PR interval, PQ interval, QRS complex, ST segment, T wave, QT and QTc interval. Using regression analysis, the dependency of the CTI was examined with respect to gestational age, gender and postnatal biometric data. Results Atrioventricular conduction and ventricular depolarization times could be determined dependably, whereas the T wave was often difficult to detect. Linear and nonlinear regression analysis established a strong dependency on age for the P wave and QRS complex (r2 = 0.67 and r2 = 0.66) and a weaker dependency for other intervals (r2 = 0.21 and r2 = 0.13), with gender differences in the QRS complex emerging in later gestation. Conclusion We conclude that (1) from approximately the 18th week to term, fetal CTI which quantify depolarization times can be reliably determined using magnetocardiography, (2) the P wave and QRS complex duration show a high dependency on age which to a large part reflects fetal growth and (3) fetal gender plays a role in QRS complex duration in the third trimester. Fetal development is thus in part reflected in the CTI and may be useful in the identification of intrauterine growth retardation.

  15. Rescaled Range Analysis and Detrended Fluctuation Analysis: Finite Sample Properties and Confidence Intervals

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    4/2010, No. 3 (2010), pp. 236-250, ISSN 1802-4696. R&D Projects: GA ČR GD402/09/H045; GA ČR GA402/09/0965. Grant - others: GA UK(CZ) 118310. Institutional research plan: CEZ:AV0Z10750506. Keywords: rescaled range analysis * detrended fluctuation analysis * Hurst exponent * long-range dependence. Subject RIV: AH - Economics. http://library.utia.cas.cz/separaty/2010/E/kristoufek-rescaled range analysis and detrended fluctuation analysis finite sample properties and confidence intervals.pdf

  16. Infant rats can learn time intervals before the maturation of the striatum: evidence from odor fear conditioning

    Directory of Open Access Journals (Sweden)

    Julie eBoulanger Bertolus

    2014-05-01

    Full Text Available Interval timing refers to the ability to perceive, estimate and discriminate durations in the range of seconds to minutes. Very little is currently known about the ontogeny of interval timing throughout development. On the other hand, even though the neural circuit sustaining interval timing is a matter of debate, the striatum has been suggested to be an important component of the system and its maturation occurs around the third post-natal week in rats. The global aim of the present study was to investigate interval timing abilities at an age for which striatum is not yet mature. We used odor fear conditioning, as it can be applied to very young animals. In odor fear conditioning, an odor is presented to the animal and a mild footshock is delivered after a fixed interval. Adult rats have been shown to learn the temporal relationships between the odor and the shock after a few associations. The first aim of the present study was to assess the activity of the striatum during odor fear conditioning using 2-Deoxyglucose autoradiography during development in rats. The data showed that although fear learning was displayed at all tested ages, activation of the striatum was observed in adults but not in juvenile animals. Next, we assessed the presence of evidence of interval timing in ages before and after the inclusion of the striatum into the fear conditioning circuit. We used an experimental setup allowing the simultaneous recording of freezing and respiration that have been demonstrated to be sensitive to interval timing in adult rats. This enabled the detection of duration-related temporal patterns for freezing and/or respiration curves in infants as young as 12 days post-natal during odor-fear conditioning. This suggests that infants are able to encode time durations as well as and as quickly as adults while their striatum is not yet functional. Alternative networks possibly sustaining interval timing in infant rats are discussed.

  17. Dead-time corrections on long-interval measurements of short-lived activities

    International Nuclear Information System (INIS)

    Irfan, M.

    1977-01-01

    A method has been proposed to make correction for counting losses due to dead time where the counting interval is comparable to or larger than the half-life of the activity under investigation. Counts due to background and any long-lived activity present in the source have been taken into consideration. The method is, under certain circumstances, capable of providing a valuable check on the accuracy of the dead time of the counting system. (Auth.)
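
    The sketch below is a hedged illustration of the standard textbook corrections involved, not the method of the paper (which additionally handles background and a long-lived component): a non-paralyzable dead-time correction applied to the mean count rate, followed by a decay correction for a counting interval T comparable to the half-life.

```python
# Hedged sketch: estimate the true count rate at the start of a counting interval of length T
# for a single short-lived activity, using a non-paralyzable dead-time model and the usual
# decay-averaging factor; all numerical inputs below are hypothetical.
import numpy as np

def initial_rate(measured_counts, T, tau, half_life):
    """measured_counts over interval T, dead time tau per count, all in the same time units."""
    m = measured_counts / T                     # observed mean rate over the interval
    n_true = m / (1.0 - m * tau)                # dead-time correction (first-order, mean-rate approx.)
    lam = np.log(2.0) / half_life
    # mean of A0*exp(-lam*t) over [0, T] is A0*(1 - exp(-lam*T))/(lam*T); invert it
    return n_true * lam * T / (1.0 - np.exp(-lam * T))

# hypothetical numbers: 1e5 counts in 300 s, 5 us dead time, 145 s half-life
print(initial_rate(1.0e5, 300.0, 5.0e-6, 145.0))
```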

  18. Robust stability analysis for Markovian jumping interval neural networks with discrete and distributed time-varying delays

    International Nuclear Information System (INIS)

    Balasubramaniam, P.; Lakshmanan, S.; Manivannan, A.

    2012-01-01

    Highlights: ► Robust stability analysis for Markovian jumping interval neural networks is considered. ► Both linear fractional and interval uncertainties are considered. ► A new LKF is constructed with triple integral terms. ► MATLAB LMI control toolbox is used to validate theoretical results. ► Numerical examples are given to illustrate the effectiveness of the proposed method. - Abstract: This paper investigates robust stability analysis for Markovian jumping interval neural networks with discrete and distributed time-varying delays. The parameter uncertainties are assumed to be bounded in given compact sets. The delay is assumed to be time-varying and to belong to a given interval, which means that the lower and upper bounds of the interval time-varying delays are available. Based on a new Lyapunov–Krasovskii functional (LKF), some inequality techniques and stochastic stability theory, new delay-dependent stability criteria have been obtained in terms of linear matrix inequalities (LMIs). Finally, two numerical examples are given to illustrate the reduced conservatism and effectiveness of our theoretical results.

  19. Histopathologic evaluation of postmortem autolytic changes in bluegill (Lepomis macrochirus) and crappie (Pomoxis annularis) at varied time intervals and storage temperatures

    Directory of Open Access Journals (Sweden)

    Jami George

    2016-04-01

    Full Text Available Information is lacking on preserving fish carcasses to minimize postmortem autolysis artifacts when a necropsy cannot be performed immediately. The purpose of this study was to qualitatively identify and score histologic postmortem changes in two species of freshwater fish (bluegill—Lepomis macrochirus; crappie—Pomoxis annularis), at varied time intervals and storage temperatures, to assess the histologic quality of collected samples. A pooled sample of 36 mixed-sex individuals of healthy bluegill and crappie was euthanized, stored either at room temperature, refrigerated at 4 °C, or frozen at −20 °C, and then necropsied at 0, 4, 24, and 48 h intervals. Histologic specimens were evaluated by light microscopy. The data showed that immediate harvesting of fresh samples provides the best quality and that refrigeration would be the preferred method of storage if sample collection had to be delayed for up to 24 h. When sample collection must be delayed more than 24 h, the preferred method of storage to minimize autolysis artifacts is freezing if evaluation of the gastrointestinal tract is most important, or refrigeration if gill histology is most important. The gill arch and intestinal tract, followed by the liver and kidney, were the organs most sensitive to autolysis.

  20. Assessing the precision of a time-sampling-based study among GPs: balancing sample size and measurement frequency.

    Science.gov (United States)

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald

    2017-12-04

    Our research is based on a technique for time sampling, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In this study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. The required sample size for this study is important for health workforce planners to know if they want to apply this method to target groups who are hard to reach or if fewer resources are available. In this time-sampling method, however, standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant upon the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for various numbers of GPs included in the dataset and for various frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 to 3 h as the number of GPs increased from one to 50. Beyond that, precision continued to increase, but the gain was smaller for the same additional number of GPs. Likewise, the analyses showed how the number of participants required decreased if more measurements per participant were taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. In this paper, we showed how the precision of the...
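
    A simplified two-level variance sketch (not the paper's exact equations; the spread parameters are hypothetical) shows how a CI half-width can depend on both the number of GPs n and the number of time-sampling measurements per GP k.

```python
# Hedged sketch: CI half-width for the mean weekly working hours when both between-GP
# variation (sigma_between) and within-GP measurement variation (sigma_within) contribute.
import math

def ci_half_width(n_gps, k_meas, sigma_between, sigma_within, z=1.96):
    """Half-width of an approximate 95% CI under a simple two-level sampling model."""
    var_mean = sigma_between**2 / n_gps + sigma_within**2 / (n_gps * k_meas)
    return z * math.sqrt(var_mean)

# precision improves with more GPs and, more slowly, with more measurements per GP
for n, k in [(50, 40), (100, 40), (100, 120), (300, 40)]:
    print(n, k, round(ci_half_width(n, k, sigma_between=8.0, sigma_within=20.0), 2))
```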

  1. A comparison of systolic time intervals measured by impedance cardiography and carotid pulse tracing

    DEFF Research Database (Denmark)

    Mehlsen, J; Bonde, J; Rehling, Michael

    1990-01-01

    The purpose of this study was to compare the systolic time intervals (STI) obtained by impedance cardiography and by the conventional carotid technique. This comparison was done with respect to: 1) correlations between variables obtained by the two methods, 2) ability to reflect drug-induced chan......The purpose of this study was to compare the systolic time intervals (STI) obtained by impedance cardiography and by the conventional carotid technique. This comparison was done with respect to: 1) correlations between variables obtained by the two methods, 2) ability to reflect drug...

  2. Regression analysis of case K interval-censored failure time data in the presence of informative censoring.

    Science.gov (United States)

    Wang, Peijie; Zhao, Hui; Sun, Jianguo

    2016-12-01

    Interval-censored failure time data occur in many fields such as demography, economics, medical research, and reliability, and many inference procedures on them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For the problem, an estimated sieve maximum-likelihood approach is proposed for the data arising from the proportional hazards frailty model and, for estimation, a two-step procedure is presented. In addition, the asymptotic properties of the proposed estimators of regression parameters are established, and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study. © 2016, The International Biometric Society.

  3. Sensitivity and specificity of normality tests and consequences on reference interval accuracy at small sample size: a computer-simulation study.

    Science.gov (United States)

    Le Boedec, Kevin

    2016-12-01

    According to international guidelines, parametric methods must be chosen for reference interval (RI) construction when the sample size is small and the distribution is Gaussian. However, normality tests may not be accurate at small sample size. The purpose of the study was to evaluate normality test performance in properly identifying samples extracted from a Gaussian population at small sample sizes, and to assess the consequences on RI accuracy of applying parametric methods to samples that falsely identified the parent population as Gaussian. Samples of n = 60 and n = 30 values were randomly selected 100 times from simulated Gaussian, lognormal, and asymmetric populations of 10,000 values. The sensitivity and specificity of 4 normality tests were compared. Reference intervals were calculated using 6 different statistical methods from samples that falsely identified the parent population as Gaussian, and their accuracy was compared. Shapiro-Wilk and D'Agostino-Pearson tests were the best performing normality tests. However, their specificity was poor at sample size n = 30. Using nonparametric methods (or a robust method after Box-Cox transformation) on all samples regardless of their distribution, or adjusting the significance level of normality tests depending on sample size, would limit the risk of constructing inaccurate RI. © 2016 American Society for Veterinary Clinical Pathology.
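
    In the spirit of this simulation study (the sample sizes follow the abstract, but the lognormal shape, repetition count and significance level are assumptions), the sketch below estimates how often the Shapiro-Wilk test detects a lognormal parent population at n = 30 versus n = 60, and its false-positive rate on truly Gaussian samples.

```python
# Hedged sketch: Monte Carlo check of Shapiro-Wilk performance at small sample sizes;
# distribution parameters and repetition count are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
reps, alpha = 1000, 0.05
for n in (30, 60):
    power = np.mean([stats.shapiro(rng.lognormal(0.0, 0.5, n))[1] < alpha
                     for _ in range(reps)])          # rejection rate for a lognormal parent
    false_pos = np.mean([stats.shapiro(rng.normal(0.0, 1.0, n))[1] < alpha
                         for _ in range(reps)])      # rejection rate for a Gaussian parent
    print(f"n={n}: power vs lognormal={power:.2f}, false-positive rate={false_pos:.2f}")
```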

  4. Correction of Sample-Time Error for Time-Interleaved Sampling System Using Cubic Spline Interpolation

    Directory of Open Access Journals (Sweden)

    Qin Guo-jie

    2014-08-01

    Full Text Available Sample-time errors can greatly degrade the dynamic range of a time-interleaved sampling system. In this paper, a novel correction technique employing cubic spline interpolation is proposed for inter-channel sample-time error compensation. The cubic spline interpolation compensation filter is developed in the form of a finite-impulse response (FIR) filter structure, and the method for deriving the interpolation compensation filter coefficients is deduced. A 4 GS/s two-channel, time-interleaved ADC prototype system has been implemented to evaluate the performance of the technique. The experimental results showed that the correction technique effectively attenuates the spurious tones and improves the dynamic performance of the system.
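
    The paper implements the cubic-spline correction as an FIR compensation filter; the sketch below only illustrates the underlying idea with scipy's CubicSpline on a synthetic two-channel example (the test tone and timing skew are made up): the mistimed channel is re-interpolated onto its ideal sampling instants.

```python
# Hedged sketch of sample-time error correction for a two-channel interleaved system:
# the odd channel samples late by a fixed skew; a cubic spline built on the true sample
# instants is re-evaluated on the ideal uniform grid.
import numpy as np
from scipy.interpolate import CubicSpline

n = np.arange(4096)
skew = 0.05                                  # odd-channel timing error, in sampling periods (made up)
ideal_t = n.astype(float)                    # ideal sampling instants (Ts = 1)
actual_t = ideal_t + np.where(n % 2 == 1, skew, 0.0)

f0 = 0.0937                                  # normalized test-tone frequency (made up)
x = np.sin(2 * np.pi * f0 * actual_t)        # what the skewed interleaved ADC actually records

spline = CubicSpline(actual_t, x)            # spline through the samples at their true instants
x_corr = x.copy()
x_corr[1::2] = spline(ideal_t[1::2])         # re-evaluate the odd channel on the ideal grid

err_before = np.max(np.abs(x - np.sin(2 * np.pi * f0 * ideal_t)))
err_after = np.max(np.abs(x_corr - np.sin(2 * np.pi * f0 * ideal_t)))
print(err_before, err_after)                 # the corrected error should be much smaller
```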

  5. Effect of Sample Storage Temperature and Time Delay on Blood Gases, Bicarbonate and pH in Human Arterial Blood Samples.

    Science.gov (United States)

    Mohammadhoseini, Elham; Safavi, Enayat; Seifi, Sepideh; Seifirad, Soroush; Firoozbakhsh, Shahram; Peiman, Soheil

    2015-03-01

    Results of arterial blood gas analysis can be biased by pre-analytical factors, such as the time interval before analysis, the temperature during storage and the syringe type. This study investigated the effects of sample storage temperature and time delay on blood gas, bicarbonate and pH results in human arterial blood samples. 2.5 mL arterial blood samples were drawn from 45 patients via an indwelling intraarterial catheter. Each sample was divided into five equal samples and stored in multipurpose tuberculin plastic syringes. Blood gas analysis was performed on one of the five samples as soon as possible. The four other samples were divided into two groups stored at 22°C and 0°C. Blood gas analyses were repeated 30 and 60 minutes after sampling. PaO2 of the samples stored at 0°C was increased significantly after 60 minutes (P = 0.007). The PaCO2 of the samples kept for 30 and 60 minutes at 22°C was significantly higher than the initial result (P = 0.04, P ...). In samples stored at 22°C, pH decreased significantly after 30 and 60 minutes (P = 0.017, P = 0.001). There were no significant differences in the other results of samples stored at 0°C or 22°C after 30 or 60 minutes. In samples stored in plastic syringes, overestimation of PaO2 levels should be noted if samples are cooled before analysis. In samples stored in plastic syringes, it is not necessary to store samples in iced water when analysis is delayed up to one hour.

  6. Pre-hospital care time intervals among victims of road traffic injuries in Iran. A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Bigdeli Maryam

    2010-07-01

    Full Text Available Abstract Background Road traffic injuries (RTIs) are a major public health problem, requiring concerted efforts both for their prevention and a reduction of their consequences. Timely arrival of the Emergency Medical Service (EMS) at the crash scene, followed by speedy victim transportation by trained personnel, may reduce the RTIs' consequences. The first 60 minutes after injury occurrence - referred to as the "golden hour" - are vital for the saving of lives. The present study was designed to estimate the average of various time intervals occurring during the pre-hospital care process and to examine the differences between these time intervals as regards RTIs on urban and interurban roads. Method A retrospective cross-sectional study was designed and various time intervals in relation to pre-hospital care of RTIs were identified in the ambulance dispatch centre in Urmia, Iran from 20 March 2005 to 20 March 2007. All cases which resulted in ambulance dispatches were reviewed and those that had complete data on time intervals were analyzed. Results In total, the cases of 2027 RTI victims were analysed. Of these, 61.5% of the subjects were injured in city areas. The mean response time for city locations was 5.0 minutes, compared with 10.6 minutes for interurban road locations. The mean on-scene time on the interurban roads was longer than on city roads (9.2 vs. 6.1 minutes), and mean transport times from the scene to the hospital were also significantly longer for interurban incidents (17.1 vs. 6.3 minutes). Conclusion The response, transport and total time intervals among EMS responding to RTI incidents were longer for interurban roads, compared to the city areas. More research should take place on the needs for, and access to, EMS on city and interurban roads. The notification interval seems to be a hidden part of the post-crash events and indirectly affects the "golden hour" for victim management, and it needs to be measured through the establishment of surveillance systems.

  7. Adaptive Changes After 2 Weeks of 10-s Sprint Interval Training With Various Recovery Times

    Directory of Open Access Journals (Sweden)

    Robert A. Olek

    2018-04-01

    Full Text Available Purpose: The aim of this study was to compare the effect of applying two different rest recovery times in a 10-s sprint interval training session on aerobic and anaerobic capacities as well as skeletal muscle enzyme activities. Methods: Fourteen physically active but not highly trained male subjects (mean maximal oxygen uptake 50.5 ± 1.0 mlO2·kg−1·min−1) participated in the study. The training protocol involved a series of 10-s sprints separated by either 1 min (SIT10:1) or 4 min (SIT10:4) of recovery. The number of sprints progressed from four to six over six sessions separated by 1–2 days of rest. Pre- and post-intervention anthropometric measurements, assessment of aerobic and anaerobic capacity, and muscle biopsy were performed. In the muscle samples, maximal activities of citrate synthase (CS), 3-hydroxyacyl-CoA dehydrogenase (HADH), carnitine palmitoyl-transferase (CPT), malate dehydrogenase (MDH) and its mitochondrial form (mMDH), as well as lactate dehydrogenase (LDH), were determined. Analysis of variance was performed to determine changes between conditions. Results: Maximal oxygen uptake improved significantly in both training groups, by 13.6% in SIT10:1 and 11.9% in SIT10:4, with no difference between groups. Wingate anaerobic test results indicated a main effect of time for total work, peak power output and mean power output, which increased significantly and similarly in both groups. Significant differences between training groups were observed for end power output, which increased by 10.8% in SIT10:1 but remained unchanged in SIT10:4. Both training protocols induced a similar increase in CS activity (main effect of time, p < 0.05), but not in the other enzymes. Conclusion: Sprint interval training protocols induce metabolic adaptation over a short period of time, and the reduced recovery between bouts may attenuate fatigue during maximal exercise.

  8. [Processing acoustically presented time intervals of seconds duration: an expression of the phonological loop of the working memory?].

    Science.gov (United States)

    Grube, D

    1996-01-01

    Working memory has been proposed to contribute to the processing of time, rhythm and music; the question which component of working memory is involved is under discussion. The present study tests the hypothesis that the phonological loop component (Baddeley, 1986) is involved in the processing of auditorily presented time intervals of a few seconds' duration. Typical effects well known with short-term retention of verbal material could be replicated with short-term retention of temporal intervals: The immediate reproduction of time intervals was impaired under conditions of background music and articulatory suppression. Neither the accuracy nor the speed of responses in a (non-phonological) mental rotation task were diminished under these conditions. Processing of auditorily presented time intervals seems to be constrained by the capacity of the phonological loop: The immediate serial recall of sequences of time intervals was shown to be related to the immediate serial recall of words (memory span). The results confirm the notion that working memory resources, and especially the phonological loop component, underlie the processing of auditorily presented temporal information with a duration of a few seconds.

  9. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance- and runs-based descriptors of RR interval time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both the global and local versions of this method. In this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features, with the rising-trend exponent α+ differing from the falling-trend exponent; no such asymmetry is found in the physiological data after shuffling or in a group of symmetric synthetic time series.
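
    A compact sketch of the asymmetric-DFA idea is given below (windows are classified by the sign of the local linear trend of the original series; the scale range, window overlap and other details differ between published variants), applied here to a synthetic surrogate rather than real RR intervals.

```python
# Hedged sketch of Asymmetric Detrended Fluctuation Analysis: separate fluctuation functions
# (and scaling exponents) for windows with rising and falling local trends.
import numpy as np

def adfa(x, scales):
    """Return (alpha_plus, alpha_minus): exponents for rising-trend and falling-trend windows."""
    y = np.cumsum(x - np.mean(x))                       # integrated profile
    logF_plus, logF_minus = [], []
    for s in scales:
        f_plus, f_minus = [], []
        for w in range(len(y) // s):
            t = np.arange(s)
            seg_x = x[w * s:(w + 1) * s]                # original series: decides trend direction
            seg_y = y[w * s:(w + 1) * s]                # profile: gives the detrended fluctuation
            trend_sign = np.polyfit(t, seg_x, 1)[0]
            resid = seg_y - np.polyval(np.polyfit(t, seg_y, 1), t)
            (f_plus if trend_sign > 0 else f_minus).append(np.mean(resid ** 2))
        logF_plus.append(0.5 * np.log(np.mean(f_plus)))
        logF_minus.append(0.5 * np.log(np.mean(f_minus)))
    logs = np.log(scales)
    return np.polyfit(logs, logF_plus, 1)[0], np.polyfit(logs, logF_minus, 1)[0]

rr = np.random.default_rng(0).normal(0.8, 0.05, 3000)   # surrogate "RR intervals" (white noise)
print(adfa(rr, scales=[16, 32, 64, 128]))                # both exponents should be close to 0.5
```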

  10. Method to measure autonomic control of cardiac function using time interval parameters from impedance cardiography

    International Nuclear Information System (INIS)

    Meijer, Jan H; Boesveldt, Sanne; Elbertse, Eskeline; Berendse, H W

    2008-01-01

    The time difference between the electrocardiogram and impedance cardiogram can be considered as a measure for the time delay between the electrical and mechanical activities of the heart. This time interval, characterized by the pre-ejection period (PEP), is related to the sympathetic autonomous nervous control of cardiac activity. PEP, however, is difficult to measure in practice. Therefore, a novel parameter, the initial systolic time interval (ISTI), is introduced to provide a more practical measure. The use of ISTI instead of PEP was evaluated in three groups: young healthy subjects, patients with Parkinson's disease, and a group of elderly, healthy subjects of comparable age. PEP and ISTI were studied under two conditions: at rest and after an exercise stimulus. Under both conditions, PEP and ISTI behaved largely similarly in the three groups and were significantly correlated. It is concluded that ISTI can be used as a substitute for PEP and, therefore, to evaluate autonomic neuropathy both in clinical and extramural settings. Measurement of ISTI can also be used to non-invasively monitor the electromechanical cardiac time interval, and the associated autonomic activity, under physiological circumstances

  11. Sampling Development

    Science.gov (United States)

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  12. Corticostriatal field potentials are modulated at delta and theta frequencies during interval-timing task in rodents

    Directory of Open Access Journals (Sweden)

    Eric B Emmons

    2016-04-01

    Full Text Available Organizing movements in time is a critical and highly conserved feature of mammalian behavior. Temporal control of action requires corticostriatal networks. We investigate these networks in rodents using a two-interval timing task while recording local field potentials in medial frontal cortex or dorsomedial striatum. Consistent with prior work, we found cue-triggered delta (1-4 Hz) and theta (4-8 Hz) activity primarily in rodent medial frontal cortex. We observed delta activity across temporal intervals in medial frontal cortex and dorsomedial striatum. Rewarded responses were associated with increased delta activity in medial frontal cortex. Activity in theta bands in medial frontal cortex and delta bands in the striatum was linked with the timing of responses. These data suggest both delta and theta activity in frontostriatal networks are modulated during interval timing and that activity in these bands may be involved in the temporal control of action.

  13. Pre-hospital care time intervals among victims of road traffic injuries in Iran. A cross-sectional study.

    Science.gov (United States)

    Bigdeli, Maryam; Khorasani-Zavareh, Davoud; Mohammadi, Reza

    2010-07-09

    Road traffic injuries (RTIs) are a major public health problem, requiring concerted efforts both for their prevention and a reduction of their consequences. Timely arrival of the Emergency Medical Service (EMS) at the crash scene followed by speedy victim transportation by trained personnel may reduce the RTIs' consequences. The first 60 minutes after injury occurrence--referred to as the "golden hour"--are vital for the saving of lives. The present study was designed to estimate the average of various time intervals occurring during the pre-hospital care process and to examine the differences between these time intervals as regards RTIs on urban and interurban roads. A retrospective cross-sectional study was designed and various time intervals in relation to pre-hospital care of RTIs were identified in the ambulance dispatch centre in Urmia, Iran from 20 March 2005 to 20 March 2007. All cases which resulted in ambulance dispatches were reviewed and those that had complete data on time intervals were analyzed. In total, the cases of 2027 RTI victims were analysed. Of these, 61.5% of the subjects were injured in city areas. The mean response time for city locations was 5.0 minutes, compared with 10.6 minutes for interurban road locations. The mean on-scene time on the interurban roads was longer than on city roads (9.2 vs. 6.1 minutes), and mean transport times from the scene to the hospital were also significantly longer for interurban incidents (17.1 vs. 6.3 minutes). The response, transport and total time intervals among EMS responding to RTI incidents were longer for interurban roads, compared to the city areas. More research should take place on the needs for, and access to, EMS on city and interurban roads. The notification interval seems to be a hidden part of the post-crash events and indirectly affects the "golden hour" for victim management and it needs to be measured through the establishment of surveillance systems.

  14. Robust stability of interval bidirectional associative memory neural network with time delays.

    Science.gov (United States)

    Liao, Xiaofeng; Wong, Kwok-wo

    2004-04-01

    In this paper, the conventional bidirectional associative memory (BAM) neural network with signal transmission delay is intervalized in order to study the bounded effect of deviations in network parameters and external perturbations. The resultant model is referred to as a novel interval dynamic BAM (IDBAM) model. By combining a number of different Lyapunov functionals with the Razumikhin technique, some sufficient conditions for the existence of unique equilibrium and robust stability are derived. These results are fairly general and can be verified easily. To go further, we extend our investigation to the time-varying delay case. Some robust stability criteria for BAM with perturbations of time-varying delays are derived. Besides, our approach for the analysis allows us to consider several different types of activation functions, including piecewise linear sigmoids with bounded activations as well as the usual C1-smooth sigmoids. We believe that the results obtained have leading significance in the design and application of BAM neural networks.

  15. Yield and quality of milk and udder health in Martina Franca ass: effects of daily interval and time of machine milking

    Directory of Open Access Journals (Sweden)

    Giovanni Martemucci

    2010-01-01

    Full Text Available Twenty asses of the Martina Franca breed, machine milked twice a day, were used to assess the influence of milking interval (3-h, 5-h, and 8-h; N=5) and time (700, 1200 and 1900) on milk yield and udder health. Individual milk samples were taken to determine fat, protein and lactose content. A sensory analysis profile was also assessed. The milk's total bacterial count (TBC), somatic cell content (SCC) and the udder's skin temperature were considered to assess udder health. Milk yield increases by 28.4% (P<0.01) as the milking interval increases from 3-h to 8-h and is higher (P<0.01) at morning milking. The maximum milk yield per milking corresponds to the 700 milking (1416.9 mL), thus indicating a circadian rhythm in milk secretion processes. Milking intervals of 5 and 8 hours cause a decrease (P<0.01) in milk fat and lactose content. The 8-h interval leads to an increase (P<0.01) in SCC, but without any significance for udder health. No alterations in TBC, clinical evaluation or udder temperature were observed. Milk organoleptic characteristics were better with the 3-h interval milking.

  16. Self-produced Time Intervals Are Perceived as More Variable and/or Shorter Depending on Temporal Context in Subsecond and Suprasecond Ranges

    Directory of Open Access Journals (Sweden)

    Keita eMitani

    2016-06-01

    Full Text Available The processing of time intervals is fundamental for sensorimotor and cognitive functions. Perceptual and motor timing are often performed concurrently (e.g., playing a musical instrument). Although previous studies have shown the influence of body movements on time perception, how we perceive self-produced time intervals has remained unclear. Furthermore, it has been suggested that the timing mechanisms are distinct for the sub- and suprasecond ranges. Here, we compared perceptual performances for self-produced and passively presented time intervals in random contexts (i.e., multiple target intervals presented in a session) across the sub- and suprasecond ranges (Experiment 1) and within the sub- (Experiment 2) and suprasecond (Experiment 3) ranges, and in a constant context (i.e., a single target interval presented in a session) in the sub- and suprasecond ranges (Experiment 4). We show that self-produced time intervals were perceived as shorter and more variable across the sub- and suprasecond ranges and within the suprasecond range, but not within the subsecond range, in a random context. In a constant context, the self-produced time intervals were perceived as more variable in the suprasecond range but not in the subsecond range. The impairing effects indicate that motor timing interferes with perceptual timing. The dependence of impairment on temporal context suggests multiple timing mechanisms for the subsecond and suprasecond ranges. In addition, violation of the scalar property (i.e., a constant ratio of variability to target interval) was observed between the sub- and suprasecond ranges. The violation was clearer for motor timing than for perceptual timing. This suggests that the multiple timing mechanisms for the sub- and suprasecond ranges overlap more for perception than for motor timing. Moreover, the central tendency effect (i.e., shorter base intervals are overestimated and longer base intervals are underestimated) disappeared with subsecond...

  17. Centralized and decentralized global outer-synchronization of asymmetric recurrent time-varying neural network by data-sampling.

    Science.gov (United States)

    Lu, Wenlian; Zheng, Ren; Chen, Tianping

    2016-03-01

    In this paper, we discuss outer-synchronization of the asymmetrically connected recurrent time-varying neural networks. By using both centralized and decentralized discretization data sampling principles, we derive several sufficient conditions based on three vector norms to guarantee that the difference of any two trajectories starting from different initial values of the neural network converges to zero. The lower bounds of the common time intervals between data samples in centralized and decentralized principles are proved to be positive, which guarantees exclusion of Zeno behavior. A numerical example is provided to illustrate the efficiency of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Tonic and Phasic Dopamine Fluctuations as Reflected in Beta-power Predict Interval Timing Behavior

    NARCIS (Netherlands)

    Kononowicz, Tadeusz; van Rijn, Hedderik

    It has been repeatedly shown that dopamine impacts interval timing in humans and animals (for a review, see Coull, Cheng, & Meck, 2012). In particular, administration of dopamine agonists or antagonists speeds up or slows down the internal passage of time, respectively (Meck, 1996). These co-variations in...

  19. Determination and identification of naturally occurring decay series using milli-second order pulse time interval analysis (TIA)

    International Nuclear Information System (INIS)

    Hashimoto, T.; Sanada, Y.; Uezu, Y.

    2003-01-01

    A delayed coincidence method, called the time interval analysis (TIA) method, has been successfully applied to the selective determination of correlated α-α decay events with millisecond-order lifetimes. A main decay process applicable to TIA treatment is 220Rn → 216Po (T1/2: 145 ms) → {Th-series}. TIA is fundamentally based on the difference in the time-interval distribution between correlated decay events and other events, such as background or random events, when the time-interval data are compiled within a fixed time (for example, a tenth of the half-lives concerned). The sensitivity of the TIA analysis of correlated α-α decay events could subsequently be improved with respect to background elimination by using the pulse shape discrimination technique (PSD, with a PERALS counter) to reject β/γ pulses, purging nitrogen gas into the extra scintillator, and applying solvent extraction of Ra. (author)
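
    A toy simulation (all rates invented) illustrates the principle: correlated α-α pairs with a 145 ms half-life produce an excess of short inter-event intervals over what a purely random (Poisson) event stream would give.

```python
# Hedged sketch: simulate parent/daughter pairs plus random background and compare the number
# of short inter-event intervals with the expectation for a purely random event stream.
import numpy as np

rng = np.random.default_rng(1)
T = 3600.0                                   # observation time, s
pair_rate, bkg_rate = 0.05, 0.2              # hypothetical rates, s^-1
parents = np.sort(rng.uniform(0, T, rng.poisson(pair_rate * T)))
daughters = parents + rng.exponential(0.145 / np.log(2), parents.size)   # mean life = T1/2 / ln 2
events = np.sort(np.concatenate([parents, daughters,
                                 rng.uniform(0, T, rng.poisson(bkg_rate * T))]))

window = 1.0                                 # compile intervals shorter than 1 s
intervals = np.diff(events)
observed_short = np.sum(intervals < window)
rate = events.size / T
expected_if_random = events.size * (1 - np.exp(-rate * window))
print(f"short intervals observed: {observed_short}, expected if all events were random: {expected_if_random:.0f}")
```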

  20. Usability of a new multiple high-speed pulse time data registration, processing and real-time display system for pulse time interval analysis

    International Nuclear Information System (INIS)

    Yawata, Takashi; Sakaue, Hisanobu; Hashimoto, Tetsuo; Itou, Shigeki

    2006-01-01

    A new high-speed multiple pulse time data registration, processing and real-time display system for time interval analysis (TIA) was developed for counting either β-α or α-α correlated decay events. The TIA method has so far been limited to the selective extraction of successive α-α decay events on the millisecond time scale owing to the use of the original electronic hardware. In the present pulse-processing system, three different high-speed α/β(γ) pulses can be fed quickly to an original 32-bit PCI board (ZN-HTS2) within 1 μs. This PCI board consists of a timing-control IC (HTS-A) and a 28-bit counting IC (HTS-B). All channel and pulse time data were stored in FIFO RAM and then transferred into temporary CPU RAM (32 MB) by DMA. Both data registration (into main RAM (200 MB)) and the calculation of pulse time intervals, together with the real-time display of the TIA distribution, are processed simultaneously using two sophisticated software packages. The present system has proven able to display the TIA distribution spectrum in real time even when 1.6×10^5 cps pulses from a pulse generator were fed to the system. By using this new system combined with a liquid scintillation counting (LSC) apparatus, both natural microsecond-order β-α correlated decay events and millisecond-order α-α correlated decay events could be selectively extracted from a mixture of natural radionuclides. (author)

  1. Optimization of Allowed Outage Time and Surveillance Test Intervals

    Energy Technology Data Exchange (ETDEWEB)

    Al-Dheeb, Mujahed; Kang, Sunkoo; Kim, Jonghyun [KEPCO international nuclear graduate school, Ulsan (Korea, Republic of)

    2015-10-15

    The primary purpose of surveillance testing is to assure that the components of standby safety systems will be operable when they are needed in an accident. By testing these components, failures can be detected that may have occurred since the last test or since the time when the equipment was last known to be operational. The probability that a system or system component performs a specified function or mission under given conditions at a prescribed time is called availability (A). Unavailability (U), as a risk measure, is just the complementary probability to A(t); an increase in U means the risk is increased as well. The allowed outage time (D) and the surveillance test interval (T) have an important impact on component, or system, unavailability. The extension of D lengthens the maintenance duration distributions for at-power operations, which in turn increases the unavailability due to maintenance in the systems analysis. As for T, overly frequent surveillances can result in high system unavailability, because the system may be taken out of service often due to the surveillance itself and due to the repair of test-caused failures of the component; the test-caused failures include those incurred by wear and tear of the component due to the surveillances. On the other hand, as the surveillance interval increases, the component's unavailability will grow because of increased occurrences of time-dependent random failures. In that situation, the component cannot be relied upon, and accordingly the system unavailability will increase. Thus, there should be an optimal component surveillance interval in terms of the corresponding system availability. This paper aims at finding the optimal T and D which result in minimum unavailability, which in turn reduces the risk. The methodology in section 2 is applied to find the values of optimal T and D for two components, i.e., the safety injection pump (SIP) and the turbine-driven auxiliary feedwater pump (TDAFP). Section 4 addresses the interaction between D and T. In general...

  2. Optimization of Allowed Outage Time and Surveillance Test Intervals

    International Nuclear Information System (INIS)

    Al-Dheeb, Mujahed; Kang, Sunkoo; Kim, Jonghyun

    2015-01-01

    The primary purpose of surveillance testing is to assure that the components of standby safety systems will be operable when they are needed in an accident. By testing these components, failures can be detected that may have occurred since the last test or since the time when the equipment was last known to be operational. The probability that a system or system component performs a specified function or mission under given conditions at a prescribed time is called availability (A). Unavailability (U), as a risk measure, is just the complementary probability to A(t); an increase in U means the risk is increased as well. The allowed outage time (D) and the surveillance test interval (T) have an important impact on component, or system, unavailability. The extension of D lengthens the maintenance duration distributions for at-power operations, which in turn increases the unavailability due to maintenance in the systems analysis. As for T, overly frequent surveillances can result in high system unavailability, because the system may be taken out of service often due to the surveillance itself and due to the repair of test-caused failures of the component; the test-caused failures include those incurred by wear and tear of the component due to the surveillances. On the other hand, as the surveillance interval increases, the component's unavailability will grow because of increased occurrences of time-dependent random failures. In that situation, the component cannot be relied upon, and accordingly the system unavailability will increase. Thus, there should be an optimal component surveillance interval in terms of the corresponding system availability. This paper aims at finding the optimal T and D which result in minimum unavailability, which in turn reduces the risk. The methodology in section 2 is applied to find the values of optimal T and D for two components, i.e., the safety injection pump (SIP) and the turbine-driven auxiliary feedwater pump (TDAFP). Section 4 addresses the interaction between D and T. In general...
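
    A textbook single-component sketch of the trade-off described above (not the paper's full model; the failure rate and per-test downtime are hypothetical): average unavailability U(T) ≈ λT/2 + τ/T, which is minimized at T* = sqrt(2τ/λ).

```python
# Hedged sketch: optimal surveillance test interval for a periodically tested standby component
# under the simple model U(T) = lam*T/2 + tau/T (standby failures plus test downtime).
import math

lam = 1e-5      # hypothetical standby failure rate (per hour)
tau = 2.0       # hypothetical downtime per test (hours)

T_opt = math.sqrt(2 * tau / lam)               # dU/dT = lam/2 - tau/T^2 = 0
U_opt = lam * T_opt / 2 + tau / T_opt
print(f"optimal test interval ~ {T_opt:.0f} h, minimum average unavailability ~ {U_opt:.1e}")
```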

  3. Electric power demand forecasting using interval time series. A comparison between VAR and iMLP

    International Nuclear Information System (INIS)

    Garcia-Ascanio, Carolina; Mate, Carlos

    2010-01-01

    Electric power demand forecasts play an essential role in the electric industry, as they provide the basis for making decisions in power system planning and operation. A great variety of mathematical methods have been used for demand forecasting. The development and improvement of appropriate mathematical tools will lead to more accurate demand forecasting techniques. In order to forecast the monthly electric power demand per hour in Spain for 2 years, this paper presents a comparison between a new forecasting approach considering vector autoregressive (VAR) forecasting models applied to interval time series (ITS) and the iMLP, the multi-layer perceptron model adapted to interval data. In the proposed comparison, for the VAR approach two models are fitted for every hour, one composed of the centre (mid-point) and radius (half-range), and another composed of the lower and upper bounds, according to the interval representation assumed by the ITS in the learning set. In the case of the iMLP, only the model composed of the centre and radius is fitted. The other interval representation, composed of the lower and upper bounds, is obtained from a linear combination of the two. This novel approach, obtaining two bivariate models for each hour, makes it possible to establish, for different periods in the day, which interval representation is more accurate. Furthermore, the comparison between two different techniques adapted to interval time series allows us to determine the efficiency of these models in forecasting electric power demand. It is important to note that the iMLP technique has been selected for the comparison, as it has shown its accuracy in forecasting daily electricity price intervals. This work shows the ITS forecasting methods as a potential tool that will lead to a reduction in risk when making power system planning and operational decisions. (author)
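
    A sketch of the centre/radius idea on synthetic interval-valued demand data follows (statsmodels' VAR class; the series, horizon and lag order are illustrative, not the paper's Spanish hourly demand).

```python
# Hedged sketch: fit one bivariate VAR to the centre (mid-point) and radius (half-range) of an
# interval time series and rebuild the forecast interval bounds from the two components.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
n = 200
centre = 25 + 3 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 0.5, n)        # mid-point
radius = 4 + 0.5 * np.sin(2 * np.pi * np.arange(n) / 12 + 1) + rng.normal(0, 0.2, n)   # half-range
data = pd.DataFrame({"centre": centre, "radius": radius})

model = VAR(data).fit(maxlags=13, ic="aic")                # lag order chosen by AIC
fc = model.forecast(data.values[-model.k_ar:], steps=24)   # 24 steps ahead
lower, upper = fc[:, 0] - fc[:, 1], fc[:, 0] + fc[:, 1]    # recover the interval bounds
print(np.column_stack([lower, upper])[:3])
```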

  4. Assessing cardiac preload by the Initial Systolic Time Interval obtained from impedance cardiography

    Directory of Open Access Journals (Sweden)

    Jan H Meijer

    2010-01-01

    Full Text Available The Initial Systolic Time Interval (ISTI), obtained from the electrocardiogram (ECG) and impedance cardiogram (ICG), is considered to be a measure of the time delay between the electrical and mechanical activity of the heart and reflects an early active period of the cardiac cycle. The clinical relevance of this time interval is the subject of study. This paper presents preliminary results of a pilot study investigating the use of ISTI in evaluating and predicting the circulatory response to fluid administration in patients after coronary artery bypass graft surgery, by comparing ISTI with cardiac output (CO) responsiveness. The use of the pulse transit time (PTT), earlier recommended for this purpose, is also investigated. The results show an inverse relationship between ISTI and CO at all moments of fluid administration and also an inverse relationship between the changes ΔISTI and ΔCO before and after full fluid administration. No relationships between PTT and CO or ΔPTT and ΔCO were found. It is concluded that ISTI is dependent upon preload, and that ISTI has the potential to be used as a clinical parameter assessing preload.

  5. Evaluation of downmotion time interval molten materials to core catcher during core disruptive accidents postulated in LMFR

    International Nuclear Information System (INIS)

    Voronov, S.A.; Kiryushin, A.I.; Kuzavkov, N.G.; Vlasichev, G.N.

    1994-01-01

    Hypothetical core disruptive accidents are postulated to clarify the potential of a reactor plant to withstand extreme conditions and to generate measures for the management and mitigation of accident consequences. In Russian advanced reactors there is a core catcher below the diagrid to prevent vessel bottom melting and to localize fuel debris. In this paper, the calculation technique and an estimation of the relocation time of molten fuel and materials are presented for the core disruptive accidents postulated for an LMFR reactor. To evaluate the minimum fuel relocation time, calculations for different initial data are provided. The large mass of material between the core and the catcher in an LMFR reactor hinders the relocation of molten materials toward the vessel bottom and thus increases the time needed for molten fuel to reach the core catcher. The computations performed allowed the minimum relocation time of molten materials from the core to the core catcher to be evaluated. This time interval is in the range of 3.5-5.5 hours. (author)

  6. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process. So, there is a continuous need to improve the monitoring process through the use of new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and generating the necessary alerts during the workflow monitoring of such systems. The interval-based or period-based theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals. There have also been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the relationships between such intervals increase exponentially. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states as well as to infer them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between similarly large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory...
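
    Since the record above builds on Allen's 13 interval relations, a minimal classifier for those relations is sketched below (relation names follow Allen's 1983 paper; this is not the RISMA code itself, and intervals are assumed to be (start, end) pairs with start < end).

```python
# Hedged sketch: classify the Allen relation between two closed intervals a and b.
def allen_relation(a, b):
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:  return "before"
    if b2 < a1:  return "after"
    if a2 == b1: return "meets"
    if b2 == a1: return "met-by"
    if a1 == b1 and a2 == b2: return "equal"
    if a1 == b1: return "starts" if a2 < b2 else "started-by"
    if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2: return "during"
    if a1 < b1 and b2 < a2: return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"   # the only remaining cases

print(allen_relation((1, 4), (3, 8)))   # -> "overlaps"
```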

  7. Robust L2-L∞ Filtering of Time-Delay Jump Systems with Respect to the Finite-Time Interval

    Directory of Open Access Journals (Sweden)

    Shuping He

    2011-01-01

    Full Text Available This paper studied the problem of stochastic finite-time boundedness and disturbance attenuation for a class of linear time-delayed systems with Markov jumping parameters. Sufficient conditions are provided to solve this problem. The L2-L∞ filters are, respectively, designed for time-delayed Markov jump linear systems with/without uncertain parameters such that the resulting filtering error dynamic system is stochastically finite-time bounded and has the finite-time interval disturbance attenuation γ for all admissible uncertainties, time delays, and unknown disturbances. By using stochastic Lyapunov-Krasovskii functional approach, it is shown that the filter designing problem is in terms of the solutions of a set of coupled linear matrix inequalities. Simulation examples are included to demonstrate the potential of the proposed results.

  8. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement and, in addition, to determine if it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge agreement greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be...

  9. Improved Criteria on Delay-Dependent Stability for Discrete-Time Neural Networks with Interval Time-Varying Delays

    Directory of Open Access Journals (Sweden)

    O. M. Kwon

    2012-01-01

    Full Text Available The purpose of this paper is to investigate the delay-dependent stability analysis for discrete-time neural networks with interval time-varying delays. Based on Lyapunov method, improved delay-dependent criteria for the stability of the networks are derived in terms of linear matrix inequalities (LMIs by constructing a suitable Lyapunov-Krasovskii functional and utilizing reciprocally convex approach. Also, a new activation condition which has not been considered in the literature is proposed and utilized for derivation of stability criteria. Two numerical examples are given to illustrate the effectiveness of the proposed method.

  10. Detection of abnormal item based on time intervals for recommender systems.

    Science.gov (United States)

    Gao, Min; Yuan, Quan; Ling, Bin; Xiong, Qingyu

    2014-01-01

    With the rapid development of e-business, personalized recommendation has become a core competence for enterprises seeking to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from "shilling" attacks. In recent years, research on shilling attacks has advanced greatly. However, existing approaches suffer from serious problems of attack-model dependency and high computational cost. To address these problems, an approach for the detection of abnormal items is proposed in this paper. First, two features common to all attack models are analyzed. A revised bottom-up discretization approach based on time intervals and these features is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies based on the chi-square statistic (χ²). We evaluated our approach on four types of items, defined according to the life cycles of these items. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling-attack detection by narrowing down the suspicious users.
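
    The core of the detector is a comparison of an item's rating distribution within each time interval against its overall distribution. The sketch below illustrates that idea with a plain chi-square goodness-of-fit test; the function name, interval count and minimum-ratings threshold are illustrative choices, not the authors' implementation.

```python
import numpy as np
from scipy.stats import chisquare

def interval_anomaly_pvalues(timestamps, ratings, n_intervals=10):
    """Return one p-value per time interval; small values flag intervals whose
    rating distribution deviates from the item's overall distribution.
    Illustrative sketch only -- not the paper's exact detector."""
    timestamps = np.asarray(timestamps, dtype=float)
    ratings = np.asarray(ratings)
    levels = np.unique(ratings)
    overall_p = np.array([(ratings == lv).mean() for lv in levels])

    edges = np.linspace(timestamps.min(), timestamps.max(), n_intervals + 1)
    idx = np.minimum(np.digitize(timestamps, edges) - 1, n_intervals - 1)

    pvalues = []
    for k in range(n_intervals):
        in_k = ratings[idx == k]
        if in_k.size < 5:                       # too few ratings to test
            pvalues.append(np.nan)
            continue
        observed = np.array([(in_k == lv).sum() for lv in levels], dtype=float)
        expected = overall_p * observed.sum()   # same totals by construction
        _, p = chisquare(observed, f_exp=expected)
        pvalues.append(p)
    return edges, pvalues

# Example: ordinary 1-5 star ratings, then a burst of 5-star ratings late on.
rng = np.random.default_rng(0)
ts = np.concatenate([rng.uniform(0, 90, 300), rng.uniform(90, 100, 60)])
rs = np.concatenate([rng.integers(1, 6, 300), np.full(60, 5)])
_, pvals = interval_anomaly_pvalues(ts, rs)
print(np.round(pvals, 4))                       # the last interval stands out
```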

  11. Detection of Abnormal Item Based on Time Intervals for Recommender Systems

    Directory of Open Access Journals (Sweden)

    Min Gao

    2014-01-01

    Full Text Available With the rapid development of e-business, personalized recommendation has become a core competence for enterprises seeking to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from “shilling” attacks. In recent years, research on shilling attacks has advanced greatly. However, existing approaches suffer from serious problems of attack-model dependency and high computational cost. To address these problems, an approach for the detection of abnormal items is proposed in this paper. First, two features common to all attack models are analyzed. A revised bottom-up discretization approach based on time intervals and these features is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies based on the chi-square statistic (χ²). We evaluated our approach on four types of items, defined according to the life cycles of these items. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling-attack detection by narrowing down the suspicious users.

  12. A new criterion for global robust stability of interval neural networks with discrete time delays

    International Nuclear Information System (INIS)

    Li Chuandong; Chen Jinyu; Huang Tingwen

    2007-01-01

    This paper further studies global robust stability of a class of interval neural networks with discrete time delays. By introducing an equivalent transformation of interval matrices, a new criterion on global robust stability is established. In comparison with the results reported in the literature, the proposed approach leads to results with less restrictive conditions. Numerical examples are also worked through to illustrate our results

  13. The importance of the sampling frequency in determining short-time-averaged irradiance and illuminance for rapidly changing cloud cover

    International Nuclear Information System (INIS)

    Delaunay, J.J.; Rommel, M.; Geisler, J.

    1994-01-01

    The sampling interval is an important parameter which must be chosen carefully if measurements of the direct, global, and diffuse irradiance or illuminance are carried out to determine their averages over a given period. Using measurements from a day with rapidly moving clouds, we investigated the influence of the sampling interval on the uncertainty of the calculated 15-min averages. We conclude, for this averaging period, that the sampling interval should not exceed 60 s and 10 s for measurement of the diffuse and global components respectively, to reduce the influence of the sampling interval below 2%. For the direct component, even a 5 s sampling interval is too long to reach this influence level for days with extremely quickly changing insolation conditions. (author)
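
    The effect the authors quantify can be illustrated numerically: subsampling a rapidly fluctuating signal at coarser intervals changes the 15-min mean. The sketch below uses a purely synthetic 1 s "irradiance" trace (the baseline, cloud-dip amplitude and noise level are assumptions, not the study's data) and reports how far the subsampled means drift from the full-resolution mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 15 minutes of 1 s "global irradiance" with fast cloud-driven swings
# (purely illustrative numbers, not the instruments or data of the study).
t = np.arange(0, 15 * 60)                      # seconds
clear_sky = 800.0                              # W/m^2, assumed baseline
cloud = 400.0 * (rng.random(t.size) < 0.3)     # random sharp dips
signal = clear_sky - cloud + rng.normal(0, 20, t.size)

true_mean = signal.mean()                      # reference: full 1 s record
for interval in (5, 10, 60):                   # candidate sampling intervals, s
    sub_mean = signal[::interval].mean()
    err = 100.0 * abs(sub_mean - true_mean) / true_mean
    print(f"{interval:3d} s sampling: 15-min mean off by {err:.2f} %")
```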

  14. Time and temperature affect glycolysis in blood samples regardless of fluoride-based preservatives: a potential underestimation of diabetes.

    Science.gov (United States)

    Stapleton, Mary; Daly, Niamh; O'Kelly, Ruth; Turner, Michael J

    2017-11-01

    Background The inhibition of glycolysis prior to glucose measurement is an important consideration when interpreting glucose tolerance tests. This is particularly important in gestational diabetes mellitus, where prompt diagnosis and treatment are essential. A study was planned to investigate the effect of preservatives and temperature on glycolysis. Methods Blood samples for glucose were obtained from consenting females. Lithium heparin and fluoride-EDTA samples transported rapidly in ice slurry to the laboratory were analysed for glucose concentration and then held either in ice slurry or at room temperature for varying time intervals. Paired fluoride-citrate samples were received at room temperature and held at room temperature, with analysis at similar time intervals. Results No significant difference was noted between mean glucose concentrations when comparing different sample types received in ice slurry. The mean glucose concentrations decreased significantly for both sets of samples when held at room temperature (0.4 mmol/L) and in ice slurry (0.2 mmol/L). A review of patient glucose tolerance tests reported in our hospital indicated that 17.8% exceeded the recommended diagnostic criteria for gestational diabetes mellitus. It was predicted that if the results of fasting samples were revised to reflect the effect of glycolysis at room temperature, the adjusted diagnostic rate could increase to 35.3%. Conclusion Preanalytical handling of blood samples for glucose analysis is vital. Fluoride-EDTA is an imperfect antiglycolytic, even when samples are transported and analysed rapidly under such optimal conditions. The use of fluoride-citrate tubes may offer a viable alternative in the diagnosis of diabetes mellitus.

  15. The Perforation-Operation time Interval; An Important Mortality Indicator in Peptic Ulcer Perforation.

    Science.gov (United States)

    Surapaneni, Sushama; S, Rajkumar; Reddy A, Vijaya Bhaskar

    2013-05-01

    To find out the significance of the Perforation-Operation Interval (POI) with respect to early prognosis in patients with peritonitis caused by peptic ulcer perforation. Case series. Place and Duration of the Study: Department of General Surgery, Konaseema Institute of Medical Sciences and RF Amalapuram, Andhra Pradesh, India, from 2008-2011. This study included 150 patients with generalized peritonitis who were diagnosed to have Perforated Peptic Ulcers (PPUs). The diagnosis of the PPUs was established on the basis of the history, the clinical examination and the radiological findings. The perforation-operation interval was calculated from the time of onset of symptoms such as severe abdominal pain or vomiting until the time the patient was operated. Out of the 150 patients, 134 were males and 16 were females, with a male:female ratio of 9:1. Their ages ranged between 25-70 years. Out of the 150 patients, 65 patients (43.3%) presented within 24 hours of the onset of severe abdominal pain (Group A), 27 patients (18%) presented between 24-48 hours of the onset of severe abdominal pain (Group B) and 58 patients (38.6%) presented after 48 hours (Group C). There was no mortality in Group A, and morbidity was higher in Group B and Group C. There were 15 deaths in Group C. The problem of peptic ulcer perforation with its complications can be decreased by decreasing the perforation-operation time interval, which, as per our study, appeared to be the single most important mortality and morbidity indicator in peptic ulcer perforation.

  16. Detection of surface electromyography recording time interval without muscle fatigue effect for biceps brachii muscle during maximum voluntary contraction.

    Science.gov (United States)

    Soylu, Abdullah Ruhi; Arpinar-Avsar, Pinar

    2010-08-01

    The effects of fatigue on maximum voluntary contraction (MVC) parameters were examined by using force and surface electromyography (sEMG) signals of the biceps brachii muscles (BBM) of 12 subjects. The purpose of the study was to find the sEMG time interval of the MVC recordings which is not affected by muscle fatigue. At least 10 s of force and sEMG signals of the BBM were recorded simultaneously during MVC. The subjects reached the maximum force level within 2 s by slightly increasing the force, and then contracted the BBM maximally. The time index of each sEMG and force signal was labeled with respect to the time index of the maximum force (i.e. after the time normalization, the 0 s time index of each sEMG or force signal corresponds to the maximum force point). Then, the first 8 s of the sEMG and force signals were divided into 0.5 s intervals. Mean force, median frequency (MF) and integrated EMG (iEMG) values were calculated for each interval. Amplitude normalization was performed by dividing the force signals by their mean values over the 0 s time interval (i.e. -0.25 to 0.25 s). A similar amplitude normalization procedure was repeated for the iEMG and MF signals. Statistical analysis (Friedman test with Dunn's post hoc test) was performed on the time- and amplitude-normalized signals (MF, iEMG). Although the statistical tests did not give significant information about the onset of muscle fatigue, linear regression (mean force vs. time) showed a decreasing slope (Pearson-r = 0.9462), indicating that muscle fatigue starts after the 0 s time interval, as the muscles cannot maintain their peak force levels. This implies that the most reliable interval for MVC calculation which is not affected by muscle fatigue is from the onset of the EMG activity to the peak force time. Mean, SD, and range of this interval (excluding the 2 s gradual increase time) for the 12 subjects were 2353 ms, 1258 ms and 536-4186 ms, respectively. Exceeding this interval introduces estimation errors in the maximum amplitude calculations.

  17. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    Science.gov (United States)

    Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.

    2018-02-01

    River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of
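
    The mechanics of the spectral-slope estimate the authors evaluate can be sketched in a few lines: compute a Lomb-Scargle periodogram on the irregular samples and fit a straight line in log-log space, since power scales roughly as f^(-β). The synthetic random-walk series, frequency grid and seed below are assumptions for illustration, and, as the abstract itself reports, this plain estimator tends to be biased.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# Irregularly sampled synthetic series (a stand-in for a water-quality record).
n = 500
t = np.sort(rng.uniform(0.0, 1000.0, n))        # irregular sample times (days)
y = np.cumsum(rng.normal(size=n))               # roughly Brown-noise-like signal
y -= y.mean()

# Lomb-Scargle periodogram on a log-spaced frequency grid (assumed range).
freqs = np.logspace(-3, -0.5, 200)              # cycles per day
power = lombscargle(t, y, 2 * np.pi * freqs)    # scipy expects angular frequencies

# Spectral slope beta from a straight-line fit in log-log space:
# power ~ freq^(-beta)  =>  log(power) = -beta * log(freq) + c
beta = -np.polyfit(np.log10(freqs), np.log10(power), 1)[0]
print(f"estimated spectral slope beta = {beta:.2f}")
```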

  18. Defining an optimum pumping-time requirement for sampling ground-water wells on the Hanford site

    International Nuclear Information System (INIS)

    Scharnhorst, N.L.

    1982-04-01

    The objective was to determine the optimum time period necessary to pump water from a well before a representative sample of the ground water can be obtained. It was assumed that a representative sample has been collected if the concentration of chemical parameters is the same in a number of samples taken consecutively, so that the concentration of parameters does not vary with time of collection. Ground-water samples used in this project were obtained by pumping selected wells on the Hanford Site. At each well, samples were taken at two minute intervals, and on each sample various chemical analyses were performed. Samples were checked for pH, sulfate, iron, specific conductivity, chloride, nitrate and alkalinity. The data showed that pH, alkalinity, sulfate and specific conductivity levels stabilized almost immediately after pumping of the well began. In many wells, the chloride and nitrate levels were unstable throughout the 38-minute sampling period. Iron levels, however, did not behave in either fashion. The concentration of iron in the samples was high when pumping began but dropped rapidly as pumping continued. The best explanation for this is that iron is flushed from the sides of the casing into the well when pumping begins. After several minutes of pumping, most of the dissolved iron is washed from the well casing and the iron concentration reaches a stable plateau representative of the iron concentration in the ground water. Since iron concentration takes longest to stabilize, the optimum pumping time for a well is based on the iron stabilization time for that well.
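
    A simple way to turn such purge data into a pumping-time recommendation is to scan each parameter's series for the first run of consecutive samples whose relative change stays within a tolerance. The sketch below applies that idea to made-up iron concentrations taken at 2 min intervals; the tolerance, run length and numbers are assumptions, since the abstract does not state the study's stability criterion.

```python
import numpy as np

def stabilization_time(times, values, rel_tol=0.05, run=3):
    """Return the first sampling time after which `run` consecutive
    measurements change by less than `rel_tol` (relative), i.e. the
    parameter has plateaued. Illustrative logic only."""
    values = np.asarray(values, dtype=float)
    rel_change = np.abs(np.diff(values)) / np.abs(values[:-1])
    stable = rel_change < rel_tol
    for i in range(len(stable) - run + 1):
        if stable[i:i + run].all():
            return times[i + 1]          # first sample of the stable run
    return None                           # never stabilized within the record

# Iron concentrations (mg/L) at 2 min intervals -- made-up numbers for illustration.
t = list(range(0, 40, 2))
fe = [2.1, 1.4, 0.9, 0.55, 0.40, 0.33, 0.30, 0.29, 0.29, 0.28,
      0.28, 0.28, 0.27, 0.28, 0.28, 0.28, 0.27, 0.28, 0.28, 0.28]
print("recommended minimum pumping time:", stabilization_time(t, fe), "min")
```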

  19. Nonparametric estimation in an "illness-death" model when all transition times are interval censored

    DEFF Research Database (Denmark)

    Frydman, Halina; Gerds, Thomas; Grøn, Randi

    2013-01-01

    We develop nonparametric maximum likelihood estimation for the parameters of an irreversible Markov chain on states {0,1,2} from the observations with interval censored times of 0 → 1, 0 → 2 and 1 → 2 transitions. The distinguishing aspect of the data is that, in addition to all transition times ...

  20. CMOS direct time interval measurement of long-lived luminescence lifetimes.

    Science.gov (United States)

    Yao, Lei; Yung, Ka Yi; Cheung, Maurice C; Chodavarapu, Vamsy P; Bright, Frank V

    2011-01-01

    We describe a Complementary Metal-Oxide Semiconductor (CMOS) Direct Time Interval Measurement (DTIM) Integrated Circuit (IC) to detect the decay (fall) time of the luminescence emission when analyte-sensitive luminophores are excited with an optical pulse. The CMOS DTIM IC includes a 14 × 14 phototransistor array, a transimpedance amplifier, a regulated gain amplifier, a fall-time detector, and a time-to-digital converter. We used the DTIM system to measure the emission lifetime of the oxygen-sensitive luminophore tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) ([Ru(dpp)(3)](2+)) encapsulated in sol-gel-derived xerogel thin films. The DTIM system, fabricated using a TSMC 0.35 μm process, detects lifetimes from 4 μs to 14.4 μs but can be tuned to detect longer lifetimes. The system provides an 8-bit digital output proportional to lifetime and consumes 4.5 mW of power from a 3.3 V DC supply. The CMOS system provides a useful platform for the development of reliable, robust, and miniaturized optical chemical sensors.

  1. Comparison of equilibrium radionuclide and contrast angiographic measurements of left ventricular peak ejection and filling rates and their time intervals

    Energy Technology Data Exchange (ETDEWEB)

    Sugrue, D.D.; Dickie, S.; Newman, H.; Myers, M.J.; Lavender, J.P.; McKenna, W.J. (Royal Postgraduate Medical School, London (UK))

    1984-10-01

    A comparison has been made of the equilibrium radionuclide and contrast angiographic estimates of normalized peak rates of ejection (PER) and filling (PFR) and their time intervals in twenty-one patients with cardiac disorders. Contrast angiographic and radionuclide measurements of left ventricular ejection fraction (LVEF), PER and PFR correlated well but time intervals correlated poorly. Mean values for radionuclide LVEF, PER and PFR were significantly lower and radionuclide time intervals were significantly longer compared to contrast angiography measurements.

  2. X Control Chart with Variable Sample Size and Sampling Interval (Peta Kendali X dengan Ukuran Sampel dan Interval Pengambilan Sampel yang Bervariasi)

    Directory of Open Access Journals (Sweden)

    Tanti Octavia

    2000-01-01

    Full Text Available The Shewhart X chart is widely used in statistical process control for monitoring variable data and has shown good performance in detecting large mean shifts, but it is less sensitive in detecting moderate to small process shifts. An X chart with variable sample size and sampling interval (VSSI X chart) is proposed to enhance the ability to detect moderate to small process shifts. The performance of the VSSI X chart is compared with those of the Shewhart X chart, the VSS X chart (variable sample size X chart) and the VSI X chart (variable sampling interval X chart). Performance of these control charts is presented in the form of the ATS (Average Time to Signal), which is obtained from computer simulation and a Markov chain approach. The VSSI X chart shows better performance in detecting moderate mean shifts. The simulation is then continued for the VSSI X chart and the VSS X chart with minimum sample sizes n1 = 1 and n1 = 2.

  3. An integrated theory of prospective time interval estimation : The role of cognition, attention, and learning

    NARCIS (Netherlands)

    Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John

    A theory of prospective time perception is introduced and incorporated as a module in an integrated theory of cognition, thereby extending existing theories and allowing predictions about attention and learning. First, a time perception module is established by fitting existing datasets (interval

  4. Heuristic algorithms for the minmax regret flow-shop problem with interval processing times.

    Science.gov (United States)

    Ćwik, Michał; Józefczyk, Jerzy

    2018-01-01

    An uncertain version of the permutation flow-shop problem with unlimited buffers and the makespan as the criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty. Consequently, the minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First, a greedy procedure is used for calculating the criterion's value, as this calculation is an NP-hard problem itself. Moreover, the lower bound is used instead of solving the internal deterministic flow-shop problem. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with other previously developed heuristic algorithms based on the evolutionary and middle-interval approaches. The computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion value and the computation time. The Wilcoxon paired-rank statistical test confirmed this conclusion.

  5. Online evolution reconstruction from a single measurement record with random time intervals for quantum communication

    Science.gov (United States)

    Zhou, Hua; Su, Yang; Wang, Rong; Zhu, Yong; Shen, Huiping; Pu, Tao; Wu, Chuanxin; Zhao, Jiyong; Zhang, Baofu; Xu, Zhiyong

    2017-10-01

    Online reconstruction of a time-variant quantum state from the encoding/decoding results of quantum communication is addressed by developing a method of evolution reconstruction from a single measurement record with random time intervals. A time-variant two-dimensional state is reconstructed on the basis of recovering its expectation value functions of three nonorthogonal projectors from a random single measurement record, which is composed from the discarded qubits of the six-state protocol. The simulated results prove that our method is robust to typical metro quantum channels. Our work extends the Fourier-based method of evolution reconstruction from the version for a regular single measurement record with equal time intervals to a unified one, which can be applied to arbitrary single measurement records. The proposed protocol of evolution reconstruction runs concurrently with the one of quantum communication, which can facilitate the online quantum tomography.

  6. An approach to the selection of recommended cooling intervals for the activation analysis of unknown samples with Ge(Li) gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Hirose, Akio; Ishii, Daido

    1975-01-01

    Estimation of the optimum cooling interval by mathematical or graphical methods for Ge(Li) γ-ray spectrometry performed in the presence of Compton interferences, together with recommended cooling intervals applicable to the activation analysis of unknown samples, has been proposed and applied to the non-destructive activation analysis of gold in pure copper. In the presence of Compton interferences, two kinds of optimum cooling interval were discussed. One maximizes the S/N ratio of a desired photo-peak; this interval was originated by Isenhour et al. using a computer technique and is abbreviated here as t(S/N). The other, which minimizes the relative standard deviation (δs/S) of the net photo-peak counting rate of interest (S), was originated by Tomov et al. and Quittner et al. and is abbreviated here as t(opt) or t'(opt). All equations derived by the above authors, however, have the practical disadvantage of including a term relating to the intensity of the desired photo-peak, which makes it difficult to predict the optimum cooling interval before irradiation, since in chemical analysis the concentration of the desired element, or the intensity of the photo-peak of interest, should be considered "unknown". In the present work, an approach to the selection of a recommended cooling interval applicable to unknown samples has been discussed, and the interval, t(opt), which minimizes the lower limit of detection of a desired element under given irradiation and counting conditions, has been proposed. (Evans, J.)

  7. The in vitro antibacterial effect of iodine-potassium iodide and calcium hydroxide in infected dentinal tubules at different time intervals.

    Science.gov (United States)

    Lin, Shaul; Kfir, Anda; Laviv, Amir; Sela, Galit; Fuss, Zvi

    2009-03-01

    The aim of this study was to evaluate the antibacterial effect of iodine-potassium iodide (IKI) and calcium hydroxide (CH) on dentinal tubules infected with Enterococcus faecalis (E. faecalis) at different time intervals. Hollow cylinders of bovine root dentin (n=45) were infected and divided into three equal groups: two filled with either IKI or CH, and a positive control. After placing each medicament in the infected cylinders for time periods of 10 minutes, 48 hours and 7 days, microbiological samples were analyzed. At the end of each period, four 100 μm thick inner dentin layers (400 μm in total from each specimen) were removed using dental burs of increasing diameters. Dentin powder was cultured on agar plates to quantitatively assess infection, expressed in colony-forming units (cfu). In all layers of the positive control group, heavy bacterial infection was observed. After 10 minutes, IKI reduced the amount of viable bacteria more efficiently than CH, whereas at later time intervals CH showed the best results. For short periods of exposure, IKI has a more efficient antibacterial effect in the dentinal tubules than CH, but CH performs better after longer durations of exposure. This research indicates that IKI is a better choice than CH for disinfecting the root canal if only a short duration of exposure is used, because of its more efficient antibacterial effect; however, if a longer exposure time is used, then CH is a better choice because of its better disinfecting effect over time.

  8. Neutron generation time of the reactor 'crocus' by an interval distribution method for counts collected by two detectors

    International Nuclear Information System (INIS)

    Haldy, P.-A.; Chikouche, M.

    1975-01-01

    The distribution of time intervals between a count in one neutron detector and the subsequent event registered in a second one is considered. A 'four interval' probability generating function was derived, by means of which the expression for the distribution of the time intervals, lasting from the triggering detection in the first detector to the subsequent count in the second one, could be obtained. The experimental work was conducted in the zero thermal power reactor Crocus, using a neutron source provided by spontaneous fission, a BF3 counter as the first detector and a 3He detector as the second instrument. (U.K.)

  9. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis

    International Nuclear Information System (INIS)

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro

    2004-01-01

    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive α-α decay events on the millisecond time-scale. Such decay events are part of the 220Rn → 216Po (T1/2 = 145 ms) (Th-series) and 219Rn → 215Po (T1/2 = 1.78 ms) (Ac-series). By using TIA in addition to measurement of 226Ra (U-series) from α-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject β-pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N2 gas. The U- and Th-series together with the Ac-series were determined, respectively, from alpha spectra and TIA carried out immediately after Ra-extraction. Using the 221Fr → 217At (T1/2 = 32.3 ms) decay process as a tracer, overall yields were estimated from application of TIA to the 225Ra (Np-decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples. (orig.)
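
    The essence of TIA is that genuine parent-daughter pairs (e.g. 220Rn followed by 216Po) produce successive pulses separated by intervals on the scale of the daughter's half-life, whereas background pulses arrive with much longer, random spacings. The sketch below counts successive-event intervals falling inside such a coincidence window; the event rates, window width and random seed are assumptions for illustration, and the real TIA procedure is considerably more elaborate.

```python
import numpy as np

def time_interval_counts(event_times_ms, window_ms=(0.0, 725.0)):
    """Count successive-event intervals inside a coincidence window
    (here about 5 half-lives of 216Po, T1/2 = 145 ms)."""
    dt = np.diff(np.sort(np.asarray(event_times_ms, dtype=float)))
    lo, hi = window_ms
    hits = dt[(dt > lo) & (dt <= hi)]
    return hits.size, dt.size

# Toy pulse record: mostly random background plus a few correlated pairs.
rng = np.random.default_rng(2)
bg = np.cumsum(rng.exponential(5000.0, 200))                   # background, ~5 s apart
parents = rng.choice(bg, 20, replace=False)                    # pretend 220Rn decays
daughters = parents + rng.exponential(145.0 / np.log(2), 20)   # 216Po decays
events = np.concatenate([bg, daughters])

n_hits, n_total = time_interval_counts(events)
print(f"{n_hits} of {n_total} successive intervals fall in the 220Rn->216Po window")
```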

  10. Multivariate interval-censored survival data

    DEFF Research Database (Denmark)

    Hougaard, Philip

    2014-01-01

    Interval censoring means that an event time is only known to lie in an interval (L,R], with L the last examination time before the event, and R the first after. In the univariate case, parametric models are easily fitted, whereas for non-parametric models, the mass is placed on some intervals, de...

  11. Theoretical implications of quantitative properties of interval timing and probability estimation in mouse and rat.

    Science.gov (United States)

    Kheifets, Aaron; Freestone, David; Gallistel, C R

    2017-07-01

    In three experiments with mice (Mus musculus) and rats (Rattus norvegicus), we used a switch paradigm to measure quantitative properties of the interval-timing mechanism. We found that: 1) Rodents adjusted the precision of their timed switches in response to changes in the interval between the short and long feed latencies (the temporal goalposts). 2) The variability in the timing of the switch response was reduced or unchanged in the face of large trial-to-trial random variability in the short and long feed latencies. 3) The adjustment in the distribution of switch latencies in response to changes in the relative frequency of short and long trials was sensitive to the asymmetry in the Kullback-Leibler divergence. The three results suggest that durations are represented with adjustable precision, that they are timed by multiple timers, and that there is a trial-by-trial (episodic) record of feed latencies in memory. © 2017 Society for the Experimental Analysis of Behavior.

  12. Comparing confidence intervals for Goodman and Kruskal's gamma coefficient

    NARCIS (Netherlands)

    van der Ark, L.A.; van Aert, R.C.M.

    2015-01-01

    This study was motivated by the question of which type of confidence interval (CI) one should use to summarize the sampling variance of Goodman and Kruskal's coefficient gamma. In a Monte Carlo study, we investigated the coverage and computation time of the Goodman-Kruskal CI, the Cliff-consistent CI, the

  13. An analyzer for pulse-interval times to study high-order effects in the processing of nuclear detector signals

    International Nuclear Information System (INIS)

    Denecke, B.; Jonge, S. de

    1998-01-01

    An electronic device to measure the interval-time density distributions of subsequent pulses from nuclear detectors and their electronics is described. The device has a pair-pulse resolution of 10 ns, and 25 ns for 3 subsequent input signals. The conversion range is 4096 channels and the lowest channel width is 10 ns. Counter dead times, single and in series, were studied and compared with the statistical model. True count rates were obtained from an exponential fit through the interval-time distribution.
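
    For a Poisson pulse train of true rate r, the interval-time distribution is exponential with decay constant r; a non-extending dead time only removes the shortest intervals, so fitting the exponential tail recovers the true rate. The sketch below simulates that situation; the rate, dead time, histogram binning and fit window are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
true_rate = 2000.0                                   # true count rate, s^-1 (assumed)
dead_time = 5e-6                                     # 5 us non-extending dead time (assumed)

# Simulate true Poisson events, then apply the counter's dead time.
times = np.cumsum(rng.exponential(1.0 / true_rate, 100_000))
recorded, last = [], -1.0
for t in times:
    if t - last > dead_time:
        recorded.append(t)
        last = t
intervals = np.diff(recorded)

# Interval-time spectrum (what the analyzer accumulates).
counts, edges = np.histogram(intervals, bins=200, range=(0.0, 5e-3))
centers = 0.5 * (edges[:-1] + edges[1:])

# Exponential fit through the undistorted part of the spectrum; the decay
# constant is the true count rate, independent of the dead-time losses.
def expo(t, a, rate):
    return a * np.exp(-rate * t)

keep = edges[:-1] > dead_time                        # skip the bin overlapping the dead time
popt, _ = curve_fit(expo, centers[keep], counts[keep], p0=(counts[keep][0], 1000.0))
print(f"fitted true rate = {popt[1]:.0f} s^-1 (simulated {true_rate:.0f} s^-1)")
```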

  14. An Integrated Theory of Prospective Time Interval Estimation: The Role of Cognition, Attention, and Learning

    Science.gov (United States)

    Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John

    2007-01-01

    A theory of prospective time perception is introduced and incorporated as a module in an integrated theory of cognition, thereby extending existing theories and allowing predictions about attention and learning. First, a time perception module is established by fitting existing datasets (interval estimation and bisection and impact of secondary…

  15. Interval stability for complex systems

    Science.gov (United States)

    Klinshov, Vladimir V.; Kirillov, Sergey; Kurths, Jürgen; Nekorkin, Vladimir I.

    2018-04-01

    Stability of dynamical systems against strong perturbations is an important problem of nonlinear dynamics relevant to many applications in various areas. Here, we develop a novel concept of interval stability, referring to the behavior of the perturbed system during a finite time interval. Based on this concept, we suggest new measures of stability, namely interval basin stability (IBS) and the interval stability threshold (IST). IBS characterizes the likelihood that the perturbed system returns to the stable regime (attractor) in a given time. IST provides the minimal magnitude of a perturbation capable of disrupting the stable regime for a given interval of time. The suggested measures provide important information about the system's susceptibility to external perturbations, which may be useful for practical applications. Moreover, from a theoretical viewpoint the interval stability measures are shown to bridge the gap between linear and asymptotic stability. We also suggest numerical algorithms for quantification of the interval stability characteristics and demonstrate their potential for several dynamical systems of various nature, such as power grids and neural networks.
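
    Interval basin stability lends itself to a simple Monte Carlo estimate: draw random perturbed initial conditions, integrate for the chosen time T, and record the fraction of runs that are back near the attractor. The sketch below does this for a single second-order power-grid-like node (a damped driven pendulum); the model, its parameters, the perturbation ranges and the return tolerance are all assumptions for illustration, not the paper's algorithm.

```python
import numpy as np
from scipy.integrate import solve_ivp

alpha, K, P = 0.1, 8.0, 1.0                  # damping, coupling, drive (assumed)
phi_star = np.arcsin(P / K)                  # stable fixed point (zero velocity)

def rhs(t, y):
    phi, w = y
    return [w, -alpha * w + P - K * np.sin(phi)]

def returned(y0, T=100.0, tol=0.2):
    """True if the trajectory from y0 is back near the fixed point at time T."""
    sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-6, atol=1e-8)
    phi, w = sol.y[:, -1]
    dphi = np.angle(np.exp(1j * (phi - phi_star)))   # wrap the phase difference
    return abs(dphi) < tol and abs(w) < tol

rng = np.random.default_rng(6)
trials = 100
hits = sum(
    returned([phi_star + rng.uniform(-np.pi, np.pi), rng.uniform(-10.0, 10.0)])
    for _ in range(trials)
)
print(f"interval basin stability estimate (T = 100): {hits / trials:.2f}")
```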

  16. Global Robust Stability of Switched Interval Neural Networks with Discrete and Distributed Time-Varying Delays of Neural Type

    Directory of Open Access Journals (Sweden)

    Huaiqin Wu

    2012-01-01

    Full Text Available By combining the theories of switched systems and interval neural networks, a mathematical model of switched interval neural networks with discrete and distributed time-varying delays of neural type is presented. A set of interval-parameter uncertain neural networks with discrete and distributed time-varying delays of neural type is used as the individual subsystems, and an arbitrary switching rule is assumed to coordinate the switching between these networks. By applying the augmented Lyapunov-Krasovskii functional approach and linear matrix inequality (LMI) techniques, a delay-dependent criterion is obtained which ensures that such switched interval neural networks are globally asymptotically robustly stable, in terms of LMIs. The unknown gain matrix is determined by solving these delay-dependent LMIs. Finally, an illustrative example is given to demonstrate the validity of the theoretical results.

  17. Two-sorted Point-Interval Temporal Logics

    DEFF Research Database (Denmark)

    Balbiani, Philippe; Goranko, Valentin; Sciavicco, Guido

    2011-01-01

    There are two natural and well-studied approaches to temporal ontology and reasoning: point-based and interval-based. Usually, interval-based temporal reasoning deals with points as particular, duration-less intervals. Here we develop an explicitly two-sorted point-interval temporal logical framework...... whereby time instants (points) and time periods (intervals) are considered on a par, and the perspective can shift between them within the formal discourse. We focus on fragments involving only modal operators that correspond to the inter-sort relations between points and intervals. We analyze...

  18. Evaluation of statistical methods for quantifying fractal scaling in water-quality time series with irregular sampling

    Directory of Open Access Journals (Sweden)

    Q. Zhang

    2018-02-01

    Full Text Available River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling – in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) – are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb–Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among

  19. A network of spiking neurons that can represent interval timing: mean field analysis.

    Science.gov (United States)

    Gavornik, Jeffrey P; Shouval, Harel Z

    2011-04-01

    Despite the vital importance of our ability to accurately process and encode temporal information, the underlying neural mechanisms are largely unknown. We have previously described a theoretical framework that explains how temporal representations, similar to those reported in the visual cortex, can form in locally recurrent cortical networks as a function of reward modulated synaptic plasticity. This framework allows networks of both linear and spiking neurons to learn the temporal interval between a stimulus and paired reward signal presented during training. Here we use a mean field approach to analyze the dynamics of non-linear stochastic spiking neurons in a network trained to encode specific time intervals. This analysis explains how recurrent excitatory feedback allows a network structure to encode temporal representations.

  20. Contrasting Perspectives of Anesthesiologists and Gastroenterologists on the Optimal Time Interval between Bowel Preparation and Endoscopic Sedation

    Directory of Open Access Journals (Sweden)

    Deepak Agrawal

    2015-01-01

    Full Text Available Background. The optimal time interval between the last ingestion of bowel prep and sedation for colonoscopy remains controversial, despite guidelines that sedation can be administered 2 hours after consumption of clear liquids. Objective. To determine current practice patterns among anesthesiologists and gastroenterologists regarding the optimal time interval for sedation after last ingestion of bowel prep and to understand the rationale underlying their beliefs. Design. Questionnaire survey of anesthesiologists and gastroenterologists in the USA. The questions were focused on the preferred time interval of endoscopy after a polyethylene glycol based preparation in routine cases and select conditions. Results. Responses were received from 109 anesthesiologists and 112 gastroenterologists. 96% of anesthesiologists recommended waiting longer than 2 hours until sedation, in contrast to only 26% of gastroenterologists. The main reason for waiting >2 hours was that PEG was not considered a clear liquid. Most anesthesiologists, but not gastroenterologists, waited longer in patients with history of diabetes or reflux. Conclusions. Anesthesiologists and gastroenterologists do not agree on the optimal interval for sedation after last drink of bowel prep. Most anesthesiologists prefer to wait longer than the recommended 2 hours for clear liquids. The data suggest a need for clearer guidelines on this issue.

  1. Prognostic value of cardiac time intervals measured by tissue Doppler imaging M-mode in the general population

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Mogelvang, Rasmus; Jensen, Jan Skov

    2015-01-01

    In a large prospective community-based study, cardiac function was evaluated in 1915 participants by both conventional echocardiography and TDI. The cardiac time intervals, including the isovolumic relaxation time (IVRT), isovolumic contraction time (IVCT) and ejection time (ET), were obtained by TDI M-mode.

  2. The delayed reproduction of long time intervals defined by innocuous thermal sensation.

    Science.gov (United States)

    Khoshnejad, Mina; Martinu, Kristina; Grondin, Simon; Rainville, Pierre

    2016-04-01

    The presence of discrete events during an interval to be estimated generally causes a dilation of perceived duration (event-filling effect). Here, we investigated this phenomenon in the thermal modality using multi-seconds (19 s) innocuous cool stimuli that were either constant (continuous interval) or fluctuating to create three discrete sensory events (segmented interval). Moreover, we introduced a delay following stimulus offset, before the reproduction phase, to allow for a direct comparison with our recent study showing an underestimation of duration in a delayed reproduction task of heat pain sensations (Khoshnejad et al. in Pain 155:581-590, 2014. doi: 10.1016/j.pain.2013.12.015 ). The event-filling effect was tested by comparing the delayed reproduction of the segmented and the continuous stimuli in experimental conditions asking participants to (1) reproduce the dynamics of the sensation (i.e., changes in sensory intensity over time) or (2) reproduce only the interval duration (i.e., sensation onset-to-offset). A perceptual (control) condition required participants to report changes in sensation concurrently with the stimulus. Results of the dynamic task confirmed the underestimation of duration in the delayed reproduction task, but this effect was only found with the continuous and not with the segmented stimulus. This implies that the dilation of duration produced by segmentation might compensate for the underestimation of duration in this delayed reproduction task. However, this temporal dilation effect was only observed when participants were required to attend and reproduce the dynamics of sensation. These results suggest that the event-filling effect can be observed in the thermal sensory modality and that attention directed toward changes in sensory intensity might contribute to this effect.

  3. On the Influence of the Data Sampling Interval on Computer-Derived K-Indices

    Directory of Open Access Journals (Sweden)

    A Bernard

    2011-06-01

    Full Text Available The K index was devised by Bartels et al. (1939) to provide an objective monitoring of irregular geomagnetic activity. The K index was then routinely used to monitor the magnetic activity at permanent magnetic observatories as well as at temporary stations. The increasing number of digital and sometimes unmanned observatories and the creation of INTERMAGNET put the question of computer production of K at the centre of the debate. Four algorithms were selected during the Vienna meeting (1991) and endorsed by IAGA for the computer production of K indices. We used one of them (the FMI algorithm) to investigate the impact of the geomagnetic data sampling interval on computer produced K values through the comparison of the computer derived K values for the period 2009, January 1st to 2010, May 31st at the Port-aux-Francais magnetic observatory using magnetic data series with different sampling rates (the smaller: 1 second; the larger: 1 minute). The impact is investigated on both 3-hour range values and K indices data series, as a function of the activity level for low and moderate geomagnetic activity.
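
    Because the K index is derived from the range of the irregular variation within each 3-hour interval, the sampling-rate question boils down to how much of the range survives averaging to coarser samples. The sketch below compares 3-hour ranges computed from a synthetic 1 s record and from its 1 min means; the synthetic signal is an assumption for illustration, and the regular-daily-variation removal performed by the FMI algorithm is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# One day of synthetic 1 s horizontal-field disturbance (nT), purely illustrative.
n = 24 * 3600
x = np.cumsum(rng.normal(0.0, 0.05, n))              # slow random-walk variation
x += rng.normal(0.0, 0.3, n)                          # fast fluctuations

def three_hour_ranges(series):
    blocks = series.reshape(8, -1)                    # eight 3 h intervals per day
    return blocks.max(axis=1) - blocks.min(axis=1)

r_1s = three_hour_ranges(x)
x_1min = x.reshape(-1, 60).mean(axis=1)               # 1 min means
r_1min = three_hour_ranges(x_1min)
print("3-hour ranges, 1 s data  :", np.round(r_1s, 2))
print("3-hour ranges, 1 min data:", np.round(r_1min, 2))
```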

  4. Determining diabetic retinopathy screening interval based on time from no retinopathy to laser therapy.

    Science.gov (United States)

    Hughes, Daniel; Nair, Sunil; Harvey, John N

    2017-12-01

    Objectives To determine the necessary screening interval for retinopathy in diabetic patients with no retinopathy based on time to laser therapy and to assess long-term visual outcome following screening. Methods In a population-based community screening programme in North Wales, 2917 patients were followed until death or for approximately 12 years. At screening, 2493 had no retinopathy; 424 had mostly minor degrees of non-proliferative retinopathy. Data on timing of first laser therapy and visual outcome following screening were obtained from local hospitals and ophthalmology units. Results Survival analysis showed that very few of the no-retinopathy-at-screening group required laser therapy in the early years compared with the non-proliferative retinopathy group; by three years the cumulative proportion of the no-retinopathy group requiring laser therapy was 0.2%, a lower rate of treatment than has been suggested by analyses of sight-threatening retinopathy determined photographically. At follow-up (mean 7.8 ± 4.6 years), mild to moderate visual impairment in one or both eyes due to diabetic retinopathy was more common in those with retinopathy at screening (26% vs. 5%); blindness due to diabetes occurred in only 1 in 1000. Conclusions Optimum screening intervals should be determined from time to active treatment. Based on requirement for laser therapy, the screening interval for diabetic patients with no retinopathy can be extended to two to three years. Patients who attend for retinal screening and treatment who have no or non-proliferative retinopathy now have a very low risk of eventual blindness from diabetes.

  5. Interpregnancy interval and risk of autistic disorder.

    Science.gov (United States)

    Gunnes, Nina; Surén, Pål; Bresnahan, Michaeline; Hornig, Mady; Lie, Kari Kveim; Lipkin, W Ian; Magnus, Per; Nilsen, Roy Miodini; Reichborn-Kjennerud, Ted; Schjølberg, Synnve; Susser, Ezra Saul; Øyen, Anne-Siri; Stoltenberg, Camilla

    2013-11-01

    A recent California study reported increased risk of autistic disorder in children conceived within a year after the birth of a sibling. We assessed the association between interpregnancy interval and risk of autistic disorder using nationwide registry data on pairs of singleton full siblings born in Norway. We defined interpregnancy interval as the time from birth of the first-born child to conception of the second-born child in a sibship. The outcome of interest was autistic disorder in the second-born child. Analyses were restricted to sibships in which the second-born child was born in 1990-2004. Odds ratios (ORs) were estimated by fitting ordinary logistic models and logistic generalized additive models. The study sample included 223,476 singleton full-sibling pairs. In sibships with short interpregnancy intervals, the proportion of second-born children with autistic disorder was higher than the 0.13% observed in the reference category (≥ 36 months). For interpregnancy intervals shorter than 9 months, the adjusted OR of autistic disorder in the second-born child was 2.18 (95% confidence interval 1.42-3.26). The risk of autistic disorder in the second-born child was also increased for interpregnancy intervals of 9-11 months in the adjusted analysis (OR = 1.71 [95% CI = 1.07-2.64]). Consistent with a previous report from California, interpregnancy intervals shorter than 1 year were associated with increased risk of autistic disorder in the second-born child. A possible explanation is depletion of micronutrients in mothers with closely spaced pregnancies.

  6. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig

    2014-09-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.
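
    The co-primality condition has a simple number-theoretic reading: taking one sample every P Nyquist periods while the symbol (length N in Nyquist samples) repeats visits the positions kP mod N, and these cover the whole symbol exactly when gcd(N, P) = 1. The sketch below checks this for illustrative values; the paper's actual symbol length is not stated in the abstract, so N = 31 and N = 30 are assumptions.

```python
from math import gcd

def covered(N, P):
    """Distinct Nyquist-grid positions (mod symbol length N) hit when sampling
    once every P Nyquist periods across repeated symbols."""
    return len({(k * P) % N for k in range(N)})

for N, P in [(31, 10), (31, 20), (30, 10)]:
    print(f"symbol length N={N}, undersampling P={P}, gcd={gcd(N, P)}: "
          f"{covered(N, P)}/{N} positions recovered")
```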

  7. Low-sampling-rate ultra-wideband digital receiver using equivalent-time sampling

    KAUST Repository

    Ballal, Tarig; Al-Naffouri, Tareq Y.

    2014-01-01

    In this paper, we propose an all-digital scheme for ultra-wideband symbol detection. In the proposed scheme, the received symbols are sampled many times below the Nyquist rate. It is shown that when the number of symbol repetitions, P, is co-prime with the symbol duration given in Nyquist samples, the receiver can sample the received data P times below the Nyquist rate, without loss of fidelity. The proposed scheme is applied to perform channel estimation and binary pulse position modulation (BPPM) detection. Results are presented for two receivers operating at two different sampling rates that are 10 and 20 times below the Nyquist rate. The feasibility of the proposed scheme is demonstrated in different scenarios, with reasonable bit error rates obtained in most of the cases.

  8. H∞ state estimation of generalised neural networks with interval time-varying delays

    Science.gov (United States)

    Saravanakumar, R.; Syed Ali, M.; Cao, Jinde; Huang, He

    2016-12-01

    This paper focuses on studying the H∞ state estimation of generalised neural networks with interval time-varying delays. The integral terms in the time derivative of the Lyapunov-Krasovskii functional are handled by Jensen's inequality, the reciprocally convex combination approach and a new Wirtinger-based double integral inequality. A delay-dependent criterion is derived under which the estimation error system is globally asymptotically stable with H∞ performance. The proposed conditions are represented by linear matrix inequalities. Optimal H∞ norm bounds are obtained easily by solving convex problems in terms of linear matrix inequalities. The advantage of employing the proposed inequalities is illustrated by numerical examples.

  9. Application of the entropic coefficient for interval number optimization during interval assessment

    Directory of Open Access Journals (Sweden)

    Tynynyka A. N.

    2017-06-01

    Full Text Available In solving many statistical problems, the most precise possible choice of the distribution law of a random variable is required, given only an observed sample of it. This choice requires the construction of an interval series. Therefore, the problem arises of assigning an optimal number of intervals, and this study proposes a number of formulas for solving it. Which of these formulas solves the problem more accurately? In [9], this question is investigated using the Pearson criterion. This article describes the procedure and, on its basis, gives formulas available in the literature and proposes new formulas using the entropy coefficient. A comparison is made with the previously published results of applying Pearson's goodness-of-fit criterion for these purposes. Differences in the estimates of the accuracy of the formulas are found. The proposed new formulas for calculating the number of intervals showed the best results. Calculations have been made to compare the behaviour of the same formulas for sample data distributed according to the normal law and the Rayleigh law.
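
    Several closed-form rules for the number of intervals k are in common use and are easy to compare on simulated samples. The sketch below does this with the rules built into numpy for normally and Rayleigh distributed data; the entropy-coefficient-based formulas proposed in the article are not reproduced here, so the comparison covers only the standard rules.

```python
import numpy as np

rng = np.random.default_rng(5)
samples = {
    "normal": rng.normal(0.0, 1.0, 500),
    "Rayleigh": rng.rayleigh(1.0, 500),
}

# Standard rules implemented by numpy; the article's entropy-coefficient
# formulas are not part of this sketch.
for name, data in samples.items():
    for rule in ("sturges", "scott", "fd", "sqrt"):
        k = len(np.histogram_bin_edges(data, bins=rule)) - 1
        print(f"{name:8s} sample, {rule:7s} rule: k = {k}")
```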

  10. Hospital process intervals, not EMS time intervals, are the most important predictors of rapid reperfusion in EMS Patients with ST-segment elevation myocardial infarction.

    Science.gov (United States)

    Clark, Carol Lynn; Berman, Aaron D; McHugh, Ann; Roe, Edward Jedd; Boura, Judith; Swor, Robert A

    2012-01-01

    To assess the relationship of emergency medical services (EMS) intervals and internal hospital intervals to the rapid reperfusion of patients with ST-segment elevation myocardial infarction (STEMI). We performed a secondary analysis of a prospectively collected database of STEMI patients transported to a large academic community hospital between January 1, 2004, and December 31, 2009. EMS and hospital data intervals included EMS scene time, transport time, hospital arrival to myocardial infarction (MI) team activation (D2Page), page to catheterization laboratory arrival (P2Lab), and catheterization laboratory arrival to reperfusion (L2B). We used two outcomes: EMS scene arrival to reperfusion (S2B) ≤90 minutes and hospital arrival to reperfusion (D2B) ≤90 minutes. Means and proportions are reported. Pearson chi-square and multivariate regression were used for analysis. During the study period, we included 313 EMS-transported STEMI patients with 298 (95.2%) MI team activations. Of these STEMI patients, 295 (94.2%) were taken to the cardiac catheterization laboratory and 244 (78.0%) underwent percutaneous coronary intervention (PCI). For the patients who underwent PCI, 127 (52.5%) had prehospital EMS activation, 202 (82.8%) had D2B ≤90 minutes, and 72 (39%) had S2B ≤90 minutes. In a multivariate analysis, hospital processes (EMS activation [OR 7.1, 95% CI 2.7, 18.4], Page to Lab [OR 6.7, 95% CI 2.3, 19.2] and Lab arrival to Reperfusion [OR 18.5, 95% CI 6.1, 55.6]) were the most important predictors of Scene to Balloon ≤90 minutes. EMS scene and transport intervals also had a modest association with rapid reperfusion (OR 0.85, 95% CI 0.78, 0.93 and OR 0.89, 95% CI 0.83, 0.95, respectively). In a secondary analysis, hospital processes (Door to Page [OR 44.8, 95% CI 8.6, 234.4], Page to Lab [OR 5.4, 95% CI 1.9, 15.3], and Lab arrival to Reperfusion [OR 14.6, 95% CI 2.5, 84.3]), but not EMS scene and transport intervals, were the most important predictors of D2B ≤90

  11. Is the time interval between surgery and radiotherapy important in operable nonsmall cell lung cancer? A retrospective analysis of 340 cases

    International Nuclear Information System (INIS)

    Wuerschmidt, Florian; Buenemann, Henry; Ehnert, Michael; Heilmann, Hans-Peter

    1997-01-01

    Purpose: To evaluate the influence of prognostic factors in postoperative radiotherapy of NSCLC with special emphasis on the time interval between surgery and start of radiotherapy. Methods and Materials: Between January 1976 and December 1993, 340 cases were treated and retrospectively analyzed meeting the following criteria: complete follow-up; complete staging information including pathological confirmation of resection status; maximum interval between surgery (SX) and radiotherapy (RT) of 12 weeks (median 36 days, range 18 to 84 days); minimum dose of 50 Gy (R0), and maximum dose of 70 Gy (R2). Two hundred thirty patients (68%) had N2 disease; 228 patients were completely resected (R0). One hundred six (31%) had adenocarcinoma, 172 (51%) squamous cell carcinoma. Results: In univariate analysis, Karnofsky performance status (90+ > 60-80%; p = 0.019 log rank), resection status stratified for nodal disease (R+ < R0; p = 0.046), and the time interval between SX and RT were of significant importance. Patients with a long interval (37 to 84 days) had higher 5-year survival rates (26%) and a median survival time (MST: 21.9 months, 95% C.I. 17.2 to 28.6 months) than patients with a short interval (18 to 36 days: 15%; 14.9 months, 13 to 19.9 months; p = 0.013). A further subgroup analysis revealed significant higher survival rates in patients with a long interval in N0/1 disease (p = 0.011) and incompletely resected NSCLC (p = 0.012). In multivariate analysis, the time interval had a p-value of 0.009 (nodal disease: p = 0.0083; KPI: p = 0.0037; sex: p = 0.035). Conclusion: Shortening the time interval between surgery and postoperative radiotherapy to less than 6 weeks even in R+ cases is not necessary. Survival of patients with a long interval between surgery and start of radiotherapy was better in this retrospective analysis as compared to patients with a short interval

  12. Perceptual inequality between two neighboring time intervals defined by sound markers: correspondence between neurophysiological and psychological data

    Directory of Open Access Journals (Sweden)

    Takako eMitsudo

    2014-09-01

    Full Text Available Brain activity related to time estimation processes in humans was analyzed using a perceptual phenomenon called auditory temporal assimilation. In a typical stimulus condition, two neighboring time intervals (T1 and T2, in this order) are perceived as equal even when the physical lengths of these time intervals are considerably different. Our previous event-related potential (ERP) study demonstrated that a slow negative component (SNCt) appears in the right-frontal brain area (around the F8 electrode) after T2, which is associated with judgment of the equality/inequality of T1 and T2. In the present study, we conducted two ERP experiments to further confirm the robustness of the SNCt. The stimulus patterns consisted of two neighboring time intervals marked by three successive tone bursts. Thirteen participants only listened to the patterns in the first session, and judged the equality/inequality of T1 and T2 in the next session. Behavioral data showed typical temporal assimilation. The ERP data revealed that three components (N1; contingent negative variation, CNV; and SNCt) emerged in relation to the temporal judgment. The N1 appeared in the central area, and its peak latencies corresponded to the physical timing of each marker onset. The CNV component appeared in the frontal area during T2 presentation, and its amplitude increased as a function of T1. The SNCt appeared in the right-frontal area after the presentation of T1 and T2, and its magnitude was larger for the temporal patterns causing perceptual inequality. The SNCt was also correlated with the perceptual equality/inequality of the same stimulus pattern, and continued up to about 400 ms after the end of T2. These results suggest that the SNCt can be a signature of equality/inequality judgment, which derives from the comparison of the two neighboring time intervals.

  13. Diagnostic Efficiency of MR Imaging of the Knee. Relationship to time Interval between MR and Arthroscopy

    International Nuclear Information System (INIS)

    Barrera, M. C.; Recondo, J. A.; Aperribay, M.; Gervas, C.; Fernandez, E.; Alustiza, J. M.

    2003-01-01

    To evaluate the efficiency of magnetic resonance (MR) in the diagnosis of knee lesions and how the results are influenced by the time interval between MR and arthroscopy. A total of 248 knees studied by MR and subsequently examined by arthroscopy were retrospectively analyzed. Arthroscopy was considered the gold standard, and the diagnostic capacity of MR was evaluated for both meniscal and cruciate ligament lesions. Sensitivity, specificity and Kappa index were calculated for the set of all knees included in the study (248), for those in which the time between MR and arthroscopy was less than or equal to three months (134), and for those in which the time between both procedures was less than or equal to one month. Sensitivity, specificity and Kappa index of the MR had global values of 96.5%, 70% and 71%, respectively. When the interval between MR and arthroscopy was less than or equal to three months, sensitivity, specificity and Kappa index were 95.5%, 75% and 72%, respectively. When it was less than or equal to one month, sensitivity was 100%, specificity was 87.5% and the Kappa index was 91%. MR is an excellent tool for the diagnosis of knee lesions. Higher values of sensitivity, specificity and Kappa index are obtained when the time interval between both procedures is kept to a minimum. (Author) 11 refs
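
    As a minimal illustration of the agreement statistics reported above, the sketch below computes sensitivity, specificity and Cohen's kappa from a 2 × 2 table of MR findings versus arthroscopy; the counts are placeholders, not the study's data.

    ```python
    # Sensitivity, specificity and Cohen's kappa from a 2x2 table of
    # MR findings versus arthroscopy (treated as the gold standard).
    # The counts below are illustrative placeholders, not the study's data.

    def diagnostic_agreement(tp, fp, fn, tn):
        n = tp + fp + fn + tn
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        observed = (tp + tn) / n                      # observed agreement
        expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
        kappa = (observed - expected) / (1 - expected)
        return sensitivity, specificity, kappa

    sens, spec, kappa = diagnostic_agreement(tp=110, fp=6, fn=4, tn=14)
    print(f"sensitivity={sens:.3f} specificity={spec:.3f} kappa={kappa:.3f}")
    ```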

  14. Detecting chaos in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
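
    The spectral step that the modified algorithm relies on can be sketched with SciPy's Lomb-Scargle periodogram; the snippet below only computes the power spectrum of an irregularly sampled toy signal and is not the full chaos-detection procedure of the paper.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    # Irregularly sampled toy signal: a 1.3 Hz sinusoid observed at random times.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 50.0, 400))          # irregular sample times [s]
    y = np.sin(2 * np.pi * 1.3 * t) + 0.3 * rng.standard_normal(t.size)

    # Lomb-Scargle expects angular frequencies; scan 0.05-5 Hz after removing the mean.
    freqs_hz = np.linspace(0.05, 5.0, 2000)
    power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs_hz)

    print("dominant frequency ~", freqs_hz[np.argmax(power)], "Hz")
    ```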

  15. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Science.gov (United States)

    Deitchler, Megan; Deconinck, Hedwig; Bergeron, Gilles

    2008-01-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 designs for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage, which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct an LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design is appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency settings.
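
    The precision differences between the designs follow from the usual design-effect approximation, deff = 1 + (m - 1) * ICC, where m is the number of observations per cluster and ICC the intra-cluster correlation. The sketch below compares illustrative 95% confidence interval half-widths for the three designs; the prevalence and ICC values are assumptions, not the Darfur estimates.

    ```python
    import math

    def cluster_ci_halfwidth(p, clusters, per_cluster, icc, z=1.96):
        """95% CI half-width for a proportion under the design-effect approximation."""
        n = clusters * per_cluster
        deff = 1 + (per_cluster - 1) * icc            # design effect
        se = math.sqrt(p * (1 - p) / n * deff)
        return z * se

    # Illustrative comparison for a 15% prevalence and an assumed ICC of 0.05.
    for label, c, m in [("30 x 30", 30, 30), ("33 x 6", 33, 6), ("67 x 3", 67, 3)]:
        print(label, "+/-", round(cluster_ci_halfwidth(0.15, c, m, icc=0.05), 3))
    ```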

  16. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Directory of Open Access Journals (Sweden)

    Deconinck Hedwig

    2008-05-01

    Full Text Available Abstract The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 designs for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage, which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct an LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design is appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency settings.

  17. Comparing confidence intervals for Goodman and Kruskal’s gamma coefficient

    NARCIS (Netherlands)

    van der Ark, L.A.; van Aert, R.C.M.

    2015-01-01

    This study was motivated by the question of which type of confidence interval (CI) one should use to summarize the sample variance of Goodman and Kruskal's coefficient gamma. In a Monte-Carlo study, we investigated the coverage and computation time of the Goodman–Kruskal CI, the Cliff-consistent CI, the
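
    The record above is truncated, but the quantity under study can be illustrated: the sketch below computes Goodman and Kruskal's gamma from concordant and discordant pairs and attaches a simple percentile bootstrap confidence interval, which is one possible CI method and not necessarily among those compared in the paper.

    ```python
    import numpy as np

    def gk_gamma(x, y):
        """Goodman-Kruskal's gamma from two ordinal variables of equal length."""
        x, y = np.asarray(x), np.asarray(y)
        conc = disc = 0
        for i in range(len(x)):
            s = np.sign(x[i + 1:] - x[i]) * np.sign(y[i + 1:] - y[i])
            conc += np.sum(s > 0)
            disc += np.sum(s < 0)
        total = conc + disc
        return (conc - disc) / total if total else float("nan")

    def bootstrap_ci(x, y, n_boot=2000, alpha=0.05, seed=1):
        """Percentile bootstrap CI for gamma (resampling observation pairs)."""
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x), np.asarray(y)
        stats = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(x), len(x))
            stats.append(gk_gamma(x[idx], y[idx]))
        return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

    x = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
    y = [1, 2, 2, 3, 3, 3, 4, 5, 4, 5]
    print("gamma =", round(gk_gamma(x, y), 3), "95% CI ~", bootstrap_ci(x, y))
    ```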

  18. Time Interval to Initiation of Contraceptive Methods Following ...

    African Journals Online (AJOL)

    2018-01-30

    Jan 30, 2018 ... interval between a woman's last childbirth and the initiation of contraception. Materials and ..... DF=Degree of freedom; χ2=Chi‑square test ..... practice of modern contraception among single women in a rural and urban ...

  19. Impact of Vestibular Lesions on Allocentric Navigation and Interval Timing: The Role of Self-Initiated Motion in Spatial-Temporal Integration

    Czech Academy of Sciences Publication Activity Database

    Dallal, N. L.; Yin, B.; Nekovářová, Tereza; Stuchlík, Aleš; Meck, W. H.

    2015-01-01

    Vol. 3, 3-4 (2015), pp. 269-305 ISSN 2213-445X R&D Projects: GA MŠk(CZ) LH14053 Institutional support: RVO:67985823 Keywords: peak-interval procedure * interval timing * radial-arm maze * magnitude representation * dorsolateral striatum * self-initiated movement * hippocampus * cerebellum * time perception * allocentric navigation Subject RIV: FH - Neurology

  20. An experimental evaluation of electrical skin conductivity changes in postmortem interval and its assessment for time of death estimation.

    Science.gov (United States)

    Cantürk, İsmail; Karabiber, Fethullah; Çelik, Safa; Şahin, M Feyzi; Yağmur, Fatih; Kara, Sadık

    2016-02-01

    In forensic medicine, estimation of the time of death (ToD) is one of the most important and challenging medico-legal problems. Despite the partial accomplishments in ToD estimation to date, the error margin of ToD estimation is still too large. In this study, electrical conductivity changes over the postmortem interval were experimentally investigated in human cases. Electrical conductivity measurements give some promising clues about the postmortem interval. A living human has a natural electrical conductivity; in the postmortem interval, intracellular fluids gradually leak out of cells. These leaked fluids combine with extra-cellular fluids in tissues, and since both fluids are electrolytic, the leaked intracellular fluids help increase conductivity. Thus, the level of electrical conductivity is expected to increase with increasing time after death. In this study, electrical conductivity tests were applied over six hours. The electrical conductivity of the cases increased exponentially during the tested time period, indicating a positive relationship between electrical conductivity and the postmortem interval. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Pumping time required to obtain tube well water samples with aquifer characteristic radon concentrations

    International Nuclear Information System (INIS)

    Ricardo, Carla Pereira; Oliveira, Arno Heeren de

    2011-01-01

    Radon is an inert noble gas, which comes from the natural radioactive decay of uranium and thorium in soil, rock and water. Radon isotopes emanated from radium-bearing grains of a rock or soil are released into the pore space. Radon that reaches the pore space is partitioned between the gaseous and aqueous phases. Thus, the groundwater presents a radon signature from the rock that is characteristic of the aquifer. The characteristic radon concentration of an aquifer, which is mainly related to the emanation, is also influenced by the degree of subsurface degassing, especially in the vicinity of a tube well, where the radon concentration is strongly reduced. To determine the pumping time required to obtain a tube well water sample that presents the characteristic radon concentration of the aquifer, an experiment was conducted in an 80 m deep tube well. In this experiment, after twenty-four hours without extraction, water samples were collected periodically, at about ten-minute intervals, during two hours of pumping. The radon concentrations of the samples were determined by using the RAD7 Electronic Radon Detector from Durridge Company, a solid state alpha spectrometric detector. It was found that the time necessary to reach the maximum radon concentration, that is, the characteristic radon concentration of the aquifer, is about sixty minutes. (author)

  2. Confidence intervals for correlations when data are not normal.

    Science.gov (United States)

    Bishara, Anthony J; Hittner, James B

    2017-02-01

    With nonnormal data, the typical confidence interval of the correlation (Fisher z') may be inaccurate. The literature has been unclear as to which of several alternative methods should be used instead, and how extreme a violation of normality is needed to justify an alternative. Through Monte Carlo simulation, 11 confidence interval methods were compared, including Fisher z', two Spearman rank-order methods, the Box-Cox transformation, rank-based inverse normal (RIN) transformation, and various bootstrap methods. Nonnormality often distorted the Fisher z' confidence interval-for example, leading to a 95 % confidence interval that had actual coverage as low as 68 %. Increasing the sample size sometimes worsened this problem. Inaccurate Fisher z' intervals could be predicted by a sample kurtosis of at least 2, an absolute sample skewness of at least 1, or significant violations of normality hypothesis tests. Only the Spearman rank-order and RIN transformation methods were universally robust to nonnormality. Among the bootstrap methods, an observed imposed bootstrap came closest to accurate coverage, though it often resulted in an overly long interval. The results suggest that sample nonnormality can justify avoidance of the Fisher z' interval in favor of a more robust alternative. R code for the relevant methods is provided in supplementary materials.
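
    Two of the interval methods discussed above can be sketched briefly: the classic Fisher z' interval and a percentile bootstrap interval. The authors provide R code in their supplementary materials; the Python sketch below is only an illustrative stand-in using simulated skewed data.

    ```python
    import numpy as np
    from scipy import stats

    def fisher_z_ci(x, y, alpha=0.05):
        """Classic Fisher z' confidence interval for the Pearson correlation."""
        r = stats.pearsonr(x, y)[0]
        z = np.arctanh(r)
        se = 1.0 / np.sqrt(len(x) - 3)
        zcrit = stats.norm.ppf(1 - alpha / 2)
        return np.tanh([z - zcrit * se, z + zcrit * se])

    def bootstrap_ci(x, y, n_boot=5000, alpha=0.05, seed=0):
        """Percentile bootstrap interval, one of the more robust alternatives."""
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x), np.asarray(y)
        rs = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(x), len(x))
            rs.append(stats.pearsonr(x[idx], y[idx])[0])
        return np.quantile(rs, [alpha / 2, 1 - alpha / 2])

    # Skewed (lognormal) data, where Fisher z' coverage can degrade.
    rng = np.random.default_rng(42)
    x = rng.lognormal(size=200)
    y = 0.5 * x + rng.lognormal(size=200)
    print("Fisher z':", fisher_z_ci(x, y), "bootstrap:", bootstrap_ci(x, y))
    ```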

  3. Is high-intensity interval training a time-efficient exercise strategy to improve health and fitness?

    Science.gov (United States)

    Gillen, Jenna B; Gibala, Martin J

    2014-03-01

    Growing research suggests that high-intensity interval training (HIIT) is a time-efficient exercise strategy to improve cardiorespiratory and metabolic health. "All out" HIIT models such as Wingate-type exercise are particularly effective, but this type of training may not be safe, tolerable or practical for many individuals. Recent studies, however, have revealed the potential for other models of HIIT, which may be more feasible but are still time-efficient, to stimulate adaptations similar to more demanding low-volume HIIT models and high-volume endurance-type training. As little as 3 HIIT sessions per week, involving ≤10 min of intense exercise within a time commitment of ≤30 min per session, including warm-up, recovery between intervals and cool down, has been shown to improve aerobic capacity, skeletal muscle oxidative capacity, exercise tolerance and markers of disease risk after only a few weeks in both healthy individuals and people with cardiometabolic disorders. Additional research is warranted, as studies conducted have been relatively short-term, with a limited number of measurements performed on small groups of subjects. However, given that "lack of time" remains one of the most commonly cited barriers to regular exercise participation, low-volume HIIT is a time-efficient exercise strategy that warrants consideration by health practitioners and fitness professionals.

  4. Discrete- vs. Continuous-Time Modeling of Unequally Spaced Experience Sampling Method Data

    Directory of Open Access Journals (Sweden)

    Silvia de Haan-Rietdijk

    2017-10-01

    Full Text Available The Experience Sampling Method is a common approach in psychological research for collecting intensive longitudinal data with high ecological validity. One characteristic of ESM data is that it is often unequally spaced, because the measurement intervals within a day are deliberately varied, and measurement continues over several days. This poses a problem for discrete-time (DT) modeling approaches, which are based on the assumption that all measurements are equally spaced. Nevertheless, DT approaches such as (vector) autoregressive modeling are often used to analyze ESM data, for instance in the context of affective dynamics research. There are equivalent continuous-time (CT) models, but they are more difficult to implement. In this paper we take a pragmatic approach and evaluate the practical relevance of the violated model assumption in DT AR(1) and VAR(1) models, for the N = 1 case. We use simulated data under an ESM measurement design to investigate the bias in the parameters of interest under four different model implementations, ranging from the true CT model that accounts for all the exact measurement times, to the crudest possible DT model implementation, where even the nighttime is treated as a regular interval. An analysis of empirical affect data illustrates how the differences between DT and CT modeling can play out in practice. We find that the size and the direction of the bias in DT (V)AR models for unequally spaced ESM data depend quite strongly on the true parameter in addition to data characteristics. Our recommendation is to use CT modeling whenever possible, especially now that new software implementations have become available.
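
    The core issue can be illustrated with the continuous-time analogue of the AR(1) model (an Ornstein-Uhlenbeck process), for which the autoregressive weight over an interval dt is exp(theta*dt): unequal ESM intervals therefore imply unequal discrete-time autoregressive parameters, which a fixed-interval DT model ignores. The simulation below is a hedged sketch of this relationship, not the authors' models or software.

    ```python
    import numpy as np

    # Continuous-time AR(1) (Ornstein-Uhlenbeck) process observed at irregular times.
    # The discrete-time autoregressive weight over an interval dt is phi(dt) = exp(theta * dt),
    # which a fixed-interval DT AR(1) model ignores when intervals vary.
    rng = np.random.default_rng(3)
    theta, sigma = -0.5, 1.0                     # drift (negative) and diffusion
    dts = rng.uniform(0.5, 4.0, 500)             # unequal measurement intervals (e.g. hours)

    x = [0.0]
    for dt in dts:
        phi = np.exp(theta * dt)                 # interval-specific AR weight
        innov_sd = sigma * np.sqrt((1 - np.exp(2 * theta * dt)) / (-2 * theta))
        x.append(phi * x[-1] + innov_sd * rng.standard_normal())
    x = np.array(x)

    # Naive DT AR(1) estimate that treats all intervals as equal:
    phi_naive = np.polyfit(x[:-1], x[1:], 1)[0]
    print("implied phi at mean dt:", np.exp(theta * dts.mean()).round(3),
          "naive DT estimate:", phi_naive.round(3))
    ```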

  5. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    Science.gov (United States)

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.
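
    The estimators proposed above are semiparametric and avoid the baseline cumulative hazard; purely as a contrast, the sketch below writes down the interval-censored likelihood under a fully parametric Weibull proportional hazards model, where each subject known to fail in (L, R] contributes S(L|x) - S(R|x). All names and the toy data are illustrative assumptions, not the paper's method.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_loglik(params, L, R, X):
        """Negative log-likelihood for interval-censored data under a Weibull PH model.
        Each subject contributes log[ S(L|x) - S(R|x) ] (event known to lie in (L, R])."""
        lam, k = np.exp(params[0]), np.exp(params[1])        # baseline scale and shape
        hr = np.exp(X @ params[2:])                          # proportional-hazards factor
        S = lambda t: np.exp(-((t / lam) ** k) * hr)         # conditional survival function
        return -np.sum(np.log(np.clip(S(L) - S(R), 1e-12, None)))

    # Toy data: 200 subjects, one binary covariate, true log hazard ratio beta = 0.7,
    # events only observed to lie between successive 0.5-unit inspection times.
    rng = np.random.default_rng(7)
    X = rng.integers(0, 2, size=(200, 1)).astype(float)
    T = 2.0 * rng.weibull(1.5, 200) * np.exp(-0.7 * X[:, 0] / 1.5)
    L = np.floor(T / 0.5) * 0.5
    R = L + 0.5

    res = minimize(neg_loglik, x0=np.zeros(3), args=(L, R, X),
                   method="Nelder-Mead", options={"maxiter": 5000})
    print("estimated log hazard ratio:", res.x[2:].round(2))
    ```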

  6. An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files

    Directory of Open Access Journals (Sweden)

    Anthony Chan

    2008-01-01

    Full Text Available A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
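
    The specific hierarchical format is described in the paper; as a loose illustration of why indexing events by time makes a window query cost roughly proportional to the events it contains, the sketch below groups events into fixed-duration frames and touches only the frames overlapping the requested window. The frame size and event payloads are arbitrary assumptions.

    ```python
    from collections import defaultdict

    # Illustrative two-level index over time-stamped events (not the paper's actual
    # trace format): events are grouped into fixed-duration frames, and a small
    # directory maps each frame to its events, so fetching a time window touches
    # only the frames that overlap the window.
    FRAME = 10.0   # frame duration, chosen to hold a modest number of events

    def build_index(events):
        """events: iterable of (timestamp, payload), not necessarily sorted."""
        frames = defaultdict(list)
        for ts, payload in events:
            frames[int(ts // FRAME)].append((ts, payload))
        for frame in frames.values():
            frame.sort()
        return frames

    def query(frames, t0, t1):
        """Return all events with t0 <= timestamp < t1."""
        out = []
        for fid in range(int(t0 // FRAME), int(t1 // FRAME) + 1):
            for ts, payload in frames.get(fid, []):
                if t0 <= ts < t1:
                    out.append((ts, payload))
        return out

    events = [(i * 0.37, f"event{i}") for i in range(100000)]
    index = build_index(events)
    print(len(query(index, 120.0, 130.0)), "events in [120, 130)")
    ```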

  7. Poststimulation time interval-dependent effects of motor cortex anodal tDCS on reaction-time task performance.

    Science.gov (United States)

    Molero-Chamizo, Andrés; Alameda Bailén, José R; Garrido Béjar, Tamara; García López, Macarena; Jaén Rodríguez, Inmaculada; Gutiérrez Lérida, Carolina; Pérez Panal, Silvia; González Ángel, Gloria; Lemus Corchero, Laura; Ruiz Vega, María J; Nitsche, Michael A; Rivera-Urbina, Guadalupe N

    2018-02-01

    Anodal transcranial direct current stimulation (tDCS) induces long-term potentiation-like plasticity, which is associated with long-lasting effects on different cognitive, emotional, and motor performances. Specifically, tDCS applied over the motor cortex is considered to improve reaction time in simple and complex tasks. The timing of tDCS relative to task performance could determine the efficacy of tDCS to modulate performance. The aim of this study was to compare the effects of a single session of anodal tDCS (1.5 mA, for 15 min) applied over the left primary motor cortex (M1) versus sham stimulation on performance of a go/no-go simple reaction-time task carried out at three different time points after tDCS, namely 0, 30, or 60 min after stimulation. Performance zero min after anodal tDCS was improved during the whole course of the task. Performance 30 min after anodal tDCS was improved only in the last block of the reaction-time task. Performance 60 min after anodal tDCS was not significantly different throughout the entire task. These findings suggest that the motor cortex excitability changes induced by tDCS can improve motor responses, and these effects critically depend on the time interval between stimulation and task performance.

  8. Improving Delay-Range-Dependent Stability Condition for Systems with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Wei Qian

    2013-01-01

    Full Text Available This paper discusses delay-range-dependent stability for systems with interval time-varying delay. By defining a new Lyapunov-Krasovskii functional (LKF) and estimating its derivative through the introduction of new vectors, free matrices and the reciprocally convex approach, new delay-range-dependent stability conditions are obtained. Two well-known examples are given to illustrate the reduced conservatism of the proposed theoretical results.

  9. Association between prehospital time interval and short-term outcome in acute heart failure patients.

    Science.gov (United States)

    Takahashi, Masashi; Kohsaka, Shun; Miyata, Hiroaki; Yoshikawa, Tsutomu; Takagi, Atsutoshi; Harada, Kazumasa; Miyamoto, Takamichi; Sakai, Tetsuo; Nagao, Ken; Sato, Naoki; Takayama, Morimasa

    2011-09-01

    Acute heart failure (AHF) is one of the most frequently encountered cardiovascular conditions that can seriously affect the patient's prognosis. However, the importance of early triage and treatment initiation in the setting of AHF has not been recognized. The Tokyo Cardiac Care Unit Network Database prospectively collected information of emergency admissions to acute cardiac care facilities in 2005-2007 from 67 participating hospitals in the Tokyo metropolitan area. We analyzed records of 1,218 AHF patients transported to medical centers via emergency medical services (EMS). AHF was defined as rapid onset or change in the signs and symptoms of heart failure, resulting in the need for urgent therapy. Patients with acute coronary syndrome were excluded from this analysis. Logistic regression analysis was performed to calculate the risk-adjusted in-hospital mortality. A majority of the patients were elderly (76.1 ± 11.5 years old) and male (54.1%). The overall in-hospital mortality rate was 6.0%. The median time interval between symptom onset and EMS arrival (response time) was 64 minutes (interquartile range [IQR] 26-205 minutes), and that between EMS arrival and ER arrival (transportation time) was 27 minutes (IQR 9-78 minutes). The risk-adjusted mortality increased with transportation time, but did not correlate with the response time. Those who took >45 minutes to arrive at the medical centers were at a higher risk for in-hospital mortality (odds ratio 2.24, 95% confidence interval 1.17-4.31; P = .015). Transportation time correlated with risk-adjusted mortality, and steps should be taken to reduce the EMS transfer time to improve the outcome in AHF patients. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. National Survey of Adult and Pediatric Reference Intervals in Clinical Laboratories across Canada: A Report of the CSCC Working Group on Reference Interval Harmonization.

    Science.gov (United States)

    Adeli, Khosrow; Higgins, Victoria; Seccombe, David; Collier, Christine P; Balion, Cynthia M; Cembrowski, George; Venner, Allison A; Shaw, Julie

    2017-11-01

    Reference intervals are widely used decision-making tools in laboratory medicine, serving as health-associated standards to interpret laboratory test results. Numerous studies have shown wide variation in reference intervals, even between laboratories using assays from the same manufacturer. Lack of consistency in either sample measurement or reference intervals across laboratories challenges the expectation of standardized patient care regardless of testing location. Here, we present data from a national survey conducted by the Canadian Society of Clinical Chemists (CSCC) Reference Interval Harmonization (hRI) Working Group that examines variation in laboratory reference sample measurements, as well as pediatric and adult reference intervals currently used in clinical practice across Canada. Data on reference intervals currently used by 37 laboratories were collected through a national survey to examine the variation in reference intervals for seven common laboratory tests. Additionally, 40 clinical laboratories participated in a baseline assessment by measuring six analytes in a reference sample. Of the seven analytes examined, alanine aminotransferase (ALT), alkaline phosphatase (ALP), and creatinine reference intervals were most variable. As expected, reference interval variation was more substantial in the pediatric population and varied between laboratories using the same instrumentation. Reference sample results differed between laboratories, particularly for ALT and free thyroxine (FT4). Reference interval variation was greater than test result variation for the majority of analytes. It is evident that there is a critical lack of harmonization in laboratory reference intervals, particularly for the pediatric population. Furthermore, the observed variation in reference intervals across instruments cannot be explained by the bias between the results obtained on instruments by different manufacturers. Copyright © 2017 The Canadian Society of Clinical Chemists

  11. Time Interval to Initiation of Contraceptive Methods Following ...

    African Journals Online (AJOL)

    Objectives: The objectives of the study were to determine factors affecting the interval between a woman's last childbirth and the initiation of contraception. Materials and Methods: This was a retrospective study. Family planning clinic records of the Barau Dikko Teaching Hospital Kaduna from January 2000 to March 2014 ...

  12. Economic Statistical Design of Variable Sampling Interval $\overline{X}$ Control Chart Based on Surrogate Variable Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Lee Tae-Hoon

    2016-12-01

    Full Text Available In many cases, an $\overline{X}$ control chart based on a performance variable is used in industrial fields. Typically, the control chart monitors the measurements of a performance variable itself. However, if the performance variable is too costly or impossible to measure, and a less expensive surrogate variable is available, the process may be more efficiently controlled using surrogate variables. In this paper, we present a model for the economic statistical design of a VSI (variable sampling interval) $\overline{X}$ control chart using a surrogate variable that is linearly correlated with the performance variable. We derive the total average profit model from an economic viewpoint, apply the model to a Very High Temperature Reactor (VHTR) nuclear fuel measurement system, and derive the optimal result using genetic algorithms. Compared with the control chart based on a performance variable, the proposed model gives a larger expected net income per unit of time in the long run if the correlation between the performance variable and the surrogate variable is relatively high. The proposed model was confined to the sample mean control chart under the assumption that a single assignable cause occurs according to the Poisson process. However, the model may also be extended to other types of control charts using single or multiple assignable cause assumptions, such as the VSS (variable sample size) $\overline{X}$ control chart, EWMA and CUSUM charts, and so on.
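
    The economic-statistical optimization itself is beyond a short example, but the variable sampling interval idea can be sketched as a simple rule: sample again after a long interval while the chart statistic stays near the center line, and after a short interval once it enters a warning region. The thresholds and interval lengths below are illustrative assumptions, not the paper's optimized values.

    ```python
    def next_sampling_interval(z, warning=1.0, long_h=2.0, short_h=0.25):
        """Variable sampling interval rule for an X-bar type chart (illustration only).

        z: standardized distance of the current subgroup mean from the center line.
        Returns the time to wait before the next sample: the long interval while the
        statistic stays in the central region, the short interval once it enters the
        warning region (|z| > warning) but is still inside the control limits."""
        if abs(z) <= warning:
            return long_h
        return short_h

    # Example: a drifting process triggers progressively tighter sampling.
    for z in [0.2, 0.8, 1.4, 2.1]:
        print(f"|z| = {abs(z):.1f}  ->  next sample in {next_sampling_interval(z)} h")
    ```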

  13. Model for the respiratory modulation of the heart beat-to-beat time interval series

    Science.gov (United States)

    Capurro, Alberto; Diambra, Luis; Malta, C. P.

    2005-09-01

    In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation of the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embeddings of pre-meditation and control cases have a roughly circular shape, they acquire a polygonal shape during meditation, triangular for the Kundalini Yoga data and quadrangular in the case of Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.

  14. Time interval between infective endocarditis first symptoms and diagnosis: relationship to infective endocarditis characteristics, microorganisms and prognosis.

    Science.gov (United States)

    N'Guyen, Yohan; Duval, Xavier; Revest, Matthieu; Saada, Matthieu; Erpelding, Marie-Line; Selton-Suty, Christine; Bouchiat, Coralie; Delahaye, François; Chirouze, Catherine; Alla, François; Strady, Christophe; Hoen, Bruno

    2017-03-01

    To analyze the characteristics and outcome of infective endocarditis (IE) according to the time interval between IE first symptoms and diagnosis. Among the IE cases of a French population-based epidemiological survey, patients having early-diagnosed IE (diagnosis of IE within 1 month of first symptoms) were compared with those having late-diagnosed IE (diagnosis of IE more than 1 month after first symptoms). Among the 486 definite IE, 124 (25%) had late-diagnosed IE whereas the others had early-diagnosed IE. Early-diagnosed IE was independently associated with female gender (OR = 1.8; 95% CI [1.0-3.0]), prosthetic valve (OR = 2.6; 95% CI [1.4-5.0]) and staphylococci as causative pathogen (OR = 3.7; 95% CI [2.2-6.2]). Cardiac surgery theoretical indication rates were not different between early- and late-diagnosed IE (56.3% vs 58.9%), whereas valve surgery performance was lower in early-diagnosed IE (41% vs 53%; p = .03). In-hospital mortality rates were higher in early-diagnosed IE than in late-diagnosed IE (25.1% vs 16.1%). Infective endocarditis cases in which the time interval between first symptoms and diagnosis was less than one month were mainly due to Staphylococcus aureus in France. Staphylococcus aureus infective endocarditis was associated with septic shock, transient ischemic attack or stroke, and higher mortality rates than infective endocarditis due to other bacteria or infective endocarditis in which the time interval between first symptoms and diagnosis was more than one month. Infective endocarditis cases in which the time interval between first symptoms and diagnosis was more than one month accounted for one quarter of all infective endocarditis in our study and were associated with vertebral osteomyelitis and a higher rate of cardiac surgery performed for hemodynamic indication than other infective endocarditis.

  15. Effects of varied doses of psilocybin on time interval reproduction in human subjects.

    Science.gov (United States)

    Wackermann, Jirí; Wittmann, Marc; Hasler, Felix; Vollenweider, Franz X

    2008-04-11

    Action of a hallucinogenic substance, psilocybin, on internal time representation was investigated in two double-blind, placebo-controlled studies: Experiment 1 with 12 subjects and graded doses, and Experiment 2 with 9 subjects and a very low dose. The task consisted in repeated reproductions of time intervals in the range from 1.5 to 5 s. The effects were assessed by the parameter kappa of the 'dual klepsydra' model of internal time representation, fitted to individual response data and intra-individually normalized with respect to initial values. The kappa estimates were of the same order of magnitude as in earlier studies. In both experiments, kappa was significantly increased by psilocybin at 90 min after drug intake, indicating a higher loss rate of the internal duration representation. These findings are tentatively linked to qualitative alterations of subjective time in altered states of consciousness.

  16. Postprandial oxidative losses of dietary leucine depend on the time interval between consecutive meals

    NARCIS (Netherlands)

    Myszkowska-Ryciak, J.; Keller, J.S.; Bujko, J.; Stankiewicz-Ciupa, J.; Koopmanschap, R.E.; Schreurs, V.V.A.M.

    2015-01-01

    Postprandial oxidative losses of egg white-bound [1-13C]-leucine were studied as 13C recovery in the breath of rats in relation to different time intervals between two meals. Male Wistar rats (n = 48; 68.3 ±5.9 g) divided into 4 groups (n = 12) were fed two meals a day (9:00

  17. Generalized Confidence Intervals and Fiducial Intervals for Some Epidemiological Measures

    Directory of Open Access Journals (Sweden)

    Ionut Bebu

    2016-06-01

    Full Text Available For binary outcome data from epidemiological studies, this article investigates the interval estimation of several measures of interest in the absence or presence of categorical covariates. When covariates are present, the logistic regression model as well as the log-binomial model are investigated. The measures considered include the common odds ratio (OR) from several studies, the number needed to treat (NNT), and the prevalence ratio. For each parameter, confidence intervals are constructed using the concepts of generalized pivotal quantities and fiducial quantities. Numerical results show that the confidence intervals so obtained exhibit satisfactory performance in terms of maintaining the coverage probabilities even when the sample sizes are not large. An appealing feature of the proposed solutions is that they are not based on maximization of the likelihood, and hence are free from convergence issues associated with the numerical calculation of the maximum likelihood estimators, especially in the context of the log-binomial model. The results are illustrated with a number of examples. The overall conclusion is that the proposed methodologies based on generalized pivotal quantities and fiducial quantities provide an accurate and unified approach for the interval estimation of the various epidemiological measures in the context of binary outcome data with or without covariates.

  18. Importance Sampling for Stochastic Timed Automata

    DEFF Research Database (Denmark)

    Jegourel, Cyrille; Larsen, Kim Guldstrand; Legay, Axel

    2016-01-01

    We present an importance sampling framework that combines symbolic analysis and simulation to estimate the probability of rare reachability properties in stochastic timed automata. By means of symbolic exploration, our framework first identifies states that cannot reach the goal. A state-wise cha...

  19. Effect of a data buffer on the recorded distribution of time intervals for random events

    Energy Technology Data Exchange (ETDEWEB)

    Barton, J C [Polytechnic of North London (UK)

    1976-03-15

    The use of a data buffer enables the distribution of the time intervals between events to be studied for times less than the recording system dead-time, but the usual negative exponential distribution for random events has to be modified. The theory for this effect is developed for an n-stage buffer followed by an asynchronous recorder. Results are evaluated for values of n from 1 to 5. In the language of queueing theory the system studied is of type M/D/1/n+1, i.e. with constant service time and a finite number of places.
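
    A Monte Carlo sketch of the system described above (Poisson events, an n-stage buffer, and a recorder with constant dead time) illustrates how the recorded inter-event distribution departs from the pure negative exponential; the rate, dead time and buffer depth used below are arbitrary illustrative values, not those of the paper.

    ```python
    import numpy as np

    def recorded_intervals(rate, dead_time, n_buffer, n_events=50000, seed=0):
        """Simulate Poisson events feeding an n-stage buffer ahead of a recorder
        with constant processing (dead) time; events arriving when the buffer is
        full are lost. Returns the intervals between successive recorded events."""
        rng = np.random.default_rng(seed)
        arrivals = np.cumsum(rng.exponential(1.0 / rate, n_events))
        capacity = n_buffer + 1                 # buffer stages plus the recorder itself
        departures = []                         # completion times of recorded events
        for t in arrivals:
            # events still in the system are those not yet finished at time t
            in_system = sum(1 for d in departures[-capacity:] if d > t)
            if in_system < capacity:
                start = max(t, departures[-1]) if departures else t
                departures.append(start + dead_time)
        return np.diff(departures)

    gaps = recorded_intervals(rate=1.0, dead_time=0.5, n_buffer=2)
    print("fraction of recorded intervals equal to the dead time:",
          np.mean(np.isclose(gaps, 0.5)).round(3))
    ```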

  20. Biostratigraphic analysis of core samples from wells drilled in the Devonian shale interval of the Appalachian and Illinois Basins

    Energy Technology Data Exchange (ETDEWEB)

    Martin, S.J.; Zielinski, R.E.

    1978-07-14

    A palynological investigation was performed on 55 samples of core material from four wells drilled in the Devonian Shale interval of the Appalachian and Illinois Basins. Using a combination of spores and acritarchs, it was possible to divide the Middle Devonian from the Upper Devonian and to make subdivisions within the Middle and Upper Devonian. The age of the palynomorphs encountered in this study is Upper Devonian.

  1. Effect of insertion method and postinsertion time interval prior to force application on the removal torque of orthodontic miniscrews.

    Science.gov (United States)

    Sharifi, Maryam; Ghassemi, Amirreza; Bayani, Shahin

    2015-01-01

    Success of orthodontic miniscrews in providing stable anchorage is dependent on their stability. The purpose of this study was to assess the effect of insertion method and postinsertion time interval on the removal torque of miniscrews as an indicator of their stability. Seventy-two miniscrews (Jeil Medical) were inserted into the femoral bones of three male German Shepherd dogs and assigned to nine groups of eight miniscrews. Three insertion methods, including hand-driven, motor-driven with 5.0-Ncm insertion torque, and motor-driven with 20.0-Ncm insertion torque, were tested. Three time intervals of 0, 2, and 6 weeks between miniscrew insertion and removal were tested as well. Removal torque values were measured in newton centimeters by a removal torque tester (IMADA). Data were analyzed by one-way analysis of variance (ANOVA) followed by the Bonferroni post hoc test at a .05 level of significance. A miniscrew survival rate of 93% was observed in this study. The highest mean removal torque among the three postinsertion intervals (2.4 ± 0.59 Ncm) was obtained immediately after miniscrew insertion, with a statistically significant difference from the other two time intervals. Thus, the highest removal torque values were obtained immediately after insertion.

  2. Evaluation of the Trail Making Test and interval timing as measures of cognition in healthy adults: comparisons by age, education, and gender.

    Science.gov (United States)

    Płotek, Włodzimierz; Łyskawa, Wojciech; Kluzik, Anna; Grześkowiak, Małgorzata; Podlewski, Roland; Żaba, Zbigniew; Drobnik, Leon

    2014-02-03

    Human cognitive functioning can be assessed using different methods of testing. Age, level of education, and gender may influence the results of cognitive tests. The well-known Trail Making Test (TMT), which is often used to measure frontal lobe function, and an experimental test of Interval Timing (IT) were compared. The IT tasks included reproduction of auditory and visual stimuli, with subsequent production of time intervals of 1, 2, 5, and 7 seconds duration with no pattern. Subjects included 64 healthy adult volunteers aged 18-63 (33 women, 31 men). Comparisons were made based on age, education, and gender. The TMT was performed quickly and was influenced by age, education, and gender. All reproduced visual and produced intervals were shortened, and the reproduction of auditory stimuli was more complex. Age, education, and gender had a more pronounced impact on the cognitive test than on the interval timing test. The reproduction of short auditory stimuli was more accurate in comparison to the other modalities used in the IT test. Interval timing, when compared to the TMT, offers an interesting testing possibility. Further studies are necessary to confirm this initial observation.

  3. Measurement of the ecological flow of the Acaponeta river, Nayarit, comparing different time intervals

    Directory of Open Access Journals (Sweden)

    Guadalupe de la Lanza Espino

    2012-07-01

    Full Text Available The management of river water in Mexico has been uneven due to different human activities, and it is associated with inter-annual changes in climate and runoff patterns, leading to a loss of ecosystem integrity. Nowadays, however, there are different methods to assess the water volume necessary to conserve the environment, among which are hydrological methods such as those applied here, which are based on water volumes recorded over decades, records that are not always available in the country. For this reason, this study compares runoff records for different time ranges (a minimum of 10 years, a medium range of 20 years, and more than 50 years) to quantify the environmental flow. These time intervals provided similar results, which means that not only for the Acaponeta river, but possibly for other lotic systems as well, a 10-year interval may be used satisfactorily. In this river, the runoff that must be kept for environmental purposes is: for 10 years, 70.1%; for 20 years, 78.1%; and for >50 years, 68.8%, with an average of 72.3% of the total water volume or of the average annual runoff.

  4. A comparison between brachial and echocardiographic systolic time intervals.

    Directory of Open Access Journals (Sweden)

    Ho-Ming Su

    Full Text Available Systolic time interval (STI) is an established noninvasive technique for the assessment of cardiac function. Brachial STIs can be automatically determined by an ankle-brachial index (ABI)-form device. The aims of this study are to evaluate whether the STIs measured from the ABI-form device can represent those measured from echocardiography and to compare the diagnostic values of brachial and echocardiographic STIs in the prediction of left ventricular ejection fraction (LVEF) <50%. A total of 849 patients were included in the study. Brachial pre-ejection period (bPEP) and brachial ejection time (bET) were measured using an ABI-form device, and pre-ejection period (PEP) and ejection time (ET) were measured from echocardiography. Agreement was assessed by correlation coefficient and Bland-Altman plot. Brachial STIs had a significant correlation with echocardiographic STIs (r = 0.644, P<0.001 for bPEP and PEP; r = 0.850, P<0.001 for bET and ET; r = 0.708, P<0.001 for bPEP/bET and PEP/ET). The disagreement between brachial and echocardiographic STIs (brachial STIs minus echocardiographic STIs) was 28.55 ms for bPEP and PEP, -4.15 ms for bET and ET and -0.11 for bPEP/bET and PEP/ET. The areas under the curve for bPEP/bET and PEP/ET in the prediction of LVEF <50% were 0.771 and 0.765, respectively. Brachial STIs were good alternatives to STIs obtained from echocardiography and also helpful in prediction of LVEF <50%. Brachial STIs automatically obtained from an ABI-form device may be helpful for evaluation of left ventricular systolic dysfunction.
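
    The agreement analysis described above (correlation plus Bland-Altman plot) can be sketched as a bias and 95% limits of agreement computation; the values below are made up for illustration, not the study's measurements.

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bias and 95% limits of agreement between two measurement methods
        (e.g. brachial vs echocardiographic pre-ejection period)."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias = diff.mean()
        spread = 1.96 * diff.std(ddof=1)
        return bias, (bias - spread, bias + spread)

    # Illustrative values (ms); not the study's data.
    bPEP = [118, 125, 131, 140, 122, 128, 135, 119]
    PEP  = [ 90,  95, 104, 112,  93, 101, 108,  92]
    bias, limits = bland_altman(bPEP, PEP)
    print(f"bias = {bias:.1f} ms, limits of agreement = {limits[0]:.1f} to {limits[1]:.1f} ms")
    ```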

  5. Time interval between stroke onset and hospital arrival in acute ischemic stroke patients in Shanghai, China.

    Science.gov (United States)

    Fang, Jing; Yan, Weihong; Jiang, Guo-Xin; Li, Wei; Cheng, Qi

    2011-02-01

    To observe the time interval between stroke onset and hospital arrival (time-to-hospital) in acute ischemic stroke patients and analyze its putatively associated factors. During the period from November 1, 2006 to August 31, 2008, patients with acute ischemic stroke admitted consecutively to the Department of Neurology, Ninth Hospital, Shanghai, were enrolled in the study. Information on the patients was registered, including the time-to-hospital, demographic data, history of stroke, season at attack, neurological symptom at onset, etc. Characteristics of the patients were analyzed and logistic regression analyses were conducted to identify factors associated with the time-to-hospital. There were 536 patients in the study, 290 (54.1%) males and 246 (45.9%) females. The median time-to-hospital was 8 h (range 0.1 to 300 h) for all patients. Within 3 h after the onset of stroke, 162 patients (30.2%) arrived at our hospital; within 6 h, 278 patients (51.9%). Patients with a history of stroke, unconsciousness at onset, or a high NIHSS score at admission had a significantly shorter time-to-hospital. The time interval between stroke onset and hospital arrival was long in many cases; awareness among patients and their relatives of the importance of seeking immediate medical help after stroke onset could significantly influence their actions. Copyright © 2010 Elsevier B.V. All rights reserved.

  6. Time interval measurement between two emissions: a systematic study; Mesure de l'intervalle de temps entre deux emissions: une systematique

    Energy Technology Data Exchange (ETDEWEB)

    Bizard, G.; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Kerambrun, A.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lopez, O.; Louvel, M.; Mahi, M.; Meslin, C.; Steckmeyer, J.C.; Tamain, B.; Wieloch, A. [Lab. de Physique Corpusculaire, Caen Univ., 14 (France); LPC (Caen) - CRN (Strasbourg) Collaboration

    1998-04-01

    A systematic study of the evolution of fragment emission time intervals as a function of the energy deposited in the compound system was performed. Several measurements are presented: Ne at 60 MeV/u, Ar at 30 and 60 MeV/u, and two measurements for Kr at 60 MeV/u (central and semi-peripheral collisions). In all the experiments the target was Au and the mass of the compound system was around A = 200. The excitation energies per nucleon reached in the case of these heavy systems cover the range of 3 to 5.5 MeV/u. The method used to determine the emission time intervals is based on the correlation functions associated with the relative-angle distributions. The differences between the data and the simulations allow the emission times to be evaluated. A rapid decrease of these time intervals was observed when the excitation energy increased. This variation starts at 500 fm/c, which corresponds to a sequential emission. This relatively long time, which indicates a weak interaction between fragments, corresponds practically to the measurement threshold. The shortest intervals (about 50 fm/c) are associated with a spontaneous multifragmentation and were observed in the case of central collisions in Ar+Au and Kr+Au at 60 MeV/u. Two interpretations are possible. The multifragmentation process might be viewed as a sequential process with very short time separations, or else one can distinguish two regimes, bearing in mind that multifragmentation is predominant from 4.5 MeV/u excitation energy upwards. This question is still open and its study is under way at LPC. An answer could come from the study of the rupture process of an excited nucleus, notably through the determination of its lifetime

  7. Cardiac time intervals and the association with 2D-speckle-tracking, tissue Doppler and conventional echocardiography

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Jensen, Jan Skov; Andersen, Henrik Ullits

    2016-01-01

    Cardiac time intervals (CTI) are prognostic above and beyond conventional echocardiographic measures. The explanation may be that CTI contain information about both systolic and diastolic measures; this is, however, unknown. The relationship between the CTI and systolic and diastolic function...

  8. Using hemoglobin A1C as a predicting model for time interval from pre-diabetes progressing to diabetes.

    Directory of Open Access Journals (Sweden)

    Chen-Ling Huang

    Full Text Available The early identification of subjects at high risk for diabetes is essential; thus, random rather than fasting plasma glucose is more useful. We aim to evaluate the time interval from pre-diabetes to diabetes treated with anti-diabetic drugs, using HbA1C as a diagnostic tool, and to predict it using a mathematical model. We used the Taipei Medical University Affiliated Hospital Patient Profile Database (AHPPD) from January 2007 to June 2011. The patients who progressed and were prescribed anti-diabetic drugs were selected from the AHPPD. The mathematical model was used to predict the time interval over which the HbA1C value progresses from 5.7% to 6.5%. We predicted an average overall time interval between 5.7% and 6.5% for all participants of 907 days (standard error, 103 days). For each group, we determined 1169.3 days (i.e. 3.2 years) for the low-risk group, 1080.5 days (i.e. 2.96 years) for the increased-risk group and 729.4 days (i.e. 1.99 years) for the diabetes group. This indicates the patients will take an average of 2.49 years to reach 6.5%. This prediction model is very useful to help prioritize diagnosis at an early stage and to target individuals at risk of diabetes. Using patients' HbA1C values before anti-diabetic drugs were used, we predicted the time interval from pre-diabetes progression to diabetes to be 2.49 years, without any influence of age and gender. Additional studies are needed to support this model for long-term prediction.

  9. Time-Scale and Time-Frequency Analyses of Irregularly Sampled Astronomical Time Series

    Directory of Open Access Journals (Sweden)

    S. Roques

    2005-09-01

    Full Text Available We evaluate the quality of spectral restoration in the case of irregularly sampled signals in astronomy. We study in detail a time-scale method leading to a global wavelet spectrum comparable to the Fourier period, and a time-frequency matching pursuit allowing us to identify the frequencies and to control the error propagation. In both cases, the signals are first resampled with a linear interpolation. Both results are compared with those obtained using Lomb's periodogram and using the weighted wavelet Z-transform developed in astronomy for unevenly sampled variable star observations. These approaches are applied to simulations and to light variations of four variable stars. This leads to the conclusion that the matching pursuit is more efficient for recovering the spectral content of a pulsating star, even with a preliminary resampling. In particular, the results are almost independent of the quality of the initial irregular sampling.

  10. Entropy Analysis of RR and QT Interval Variability during Orthostatic and Mental Stress in Healthy Subjects

    Directory of Open Access Journals (Sweden)

    Mathias Baumert

    2014-12-01

    Full Text Available Autonomic activity affects beat-to-beat variability of heart rate and QT interval. The aim of this study was to explore whether entropy measures are suitable to detect changes in neural outflow to the heart elicited by two different stress paradigms. We recorded short-term ECG in 11 normal subjects during an experimental protocol that involved head-up tilt and mental arithmetic stress and computed sample entropy, cross-sample entropy and causal interactions based on conditional entropy from RR and QT interval time series. Head-up tilt resulted in a significant reduction in sample entropy of RR intervals and cross-sample entropy, while mental arithmetic stress resulted in a significant reduction in coupling directed from RR to QT. In conclusion, measures of entropy are suitable to detect changes in neural outflow to the heart and decoupling of repolarisation variability from heart rate variability elicited by orthostatic or mental arithmetic stress.
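
    As a minimal, illustrative implementation of one of the measures above, the sketch below computes sample entropy for a short synthetic RR series; the cross-sample entropy and conditional-entropy coupling analyses in the study are more involved and are not reproduced here.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """Sample entropy of a 1-D series: -ln(A/B), where B counts template matches of
        length m and A of length m+1 (Chebyshev distance <= r * SD, self-matches excluded)."""
        x = np.asarray(x, float)
        tol = r * x.std(ddof=0)

        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
            count = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(dist <= tol)
            return count

        B, A = count_matches(m), count_matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    # Illustrative RR series (ms): slow oscillation plus noise, not real ECG data.
    rng = np.random.default_rng(5)
    rr = 800 + 25 * np.sin(np.arange(300) * 0.3) + 10 * rng.standard_normal(300)
    print("SampEn(2, 0.2) =", round(sample_entropy(rr), 3))
    ```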

  11. Predicting fecal coliform using the interval-to-interval approach and SWAT in the Miyun watershed, China.

    Science.gov (United States)

    Bai, Jianwen; Shen, Zhenyao; Yan, Tiezhu; Qiu, Jiali; Li, Yangyang

    2017-06-01

    Pathogens in manure can cause waterborne-disease outbreaks, serious illness, and even death in humans. Therefore, information about the transformation and transport of bacteria is crucial for determining their source. In this study, the Soil and Water Assessment Tool (SWAT) was applied to simulate fecal coliform bacteria load in the Miyun Reservoir watershed, China. The data for the fecal coliform were obtained at three sampling sites, Chenying (CY), Gubeikou (GBK), and Xiahui (XH). The calibration processes of the fecal coliform were conducted using the CY and GBK sites, and validation was conducted at the XH site. An interval-to-interval approach was designed and incorporated into the processes of fecal coliform calibration and validation. The 95% confidence interval of the predicted values and the 95% confidence interval of measured values were considered during calibration and validation in the interval-to-interval approach. Compared with the traditional point-to-point comparison, this method can improve simulation accuracy. The results indicated that the simulation of fecal coliform using the interval-to-interval approach was reasonable for the watershed. This method could provide a new research direction for future model calibration and validation studies.

  12. Perceptions of Time and Long Time Intervals

    International Nuclear Information System (INIS)

    Drottz-Sjoeberg, Britt-Marie

    2006-01-01

    There are certainly many perspectives presented in the literature on time and time perception. This contribution has focused on perceptions of the time frames related to risk and danger of radiation from a planned Swedish repository for spent nuclear fuel. Respondents from two municipalities judged SSI's reviews of the entrepreneur's plans and work of high importance, and more important the closer to our time the estimate was given. Similarly were the consequences of potential leakage from a repository perceived as more serious the closer it would be to our time. Judgements of risks related to the storage of spent nuclear fuel were moderately large on the used measurement scales. Experts are experts because they have more knowledge, and in this context they underlined e.g. the importance of reviews of the radiation situation of time periods up to 100,000 years. It was of interest to note that 55% of the respondents from the municipalities did not believe that the future repository would leak radioactivity. They were much more pessimistic with respect to world politics, i.e. a new world war. However, with respect to the seriousness of the consequences given a leakage from the repository, the public group consistently gave high risk estimates, often significantly higher than those of the expert group. The underestimations of time estimates, as seen in the tasks of pinpointing historic events, provide examples of the difficulty of making estimations involving long times. Similar results showed that thinking of 'the future' most often involved about 30 years. On average, people reported memories of about 2.5 generations back in time, and emotional relationships stretching approximately 2.5 generations into the future; 94% of the responses, with respect to how many future generations one had an emotional relationship, were given in the range of 1-5 generations. Similarly, Svenson and Nilsson found the opinion that the current generations' general responsibility for

  13. Deficits in Interval Timing Measured by the Dual-Task Paradigm among Children and Adolescents with Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Hwang, Shoou-Lian; Gau, Susan Shur-Fen; Hsu, Wen-Yau; Wu, Yu-Yu

    2010-01-01

    Background: The underlying mechanism of time perception deficit in long time intervals in attention-deficit/hyperactivity disorder (ADHD) is still unclear. This study used the time reproduction dual task to explore the role of the attentional resource in time perception deficits among children and adolescents with ADHD. Methods: Participants…

  14. Proceedings of Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting (23rd) held in Pasadena, California on December 3-5, 1991

    Science.gov (United States)

    1991-12-05

This record preserves only fragments of the proceedings' table of contents and references, including a contribution by P. Daly (University of Leeds), N. B. Koshelyaevsky (VNIIFTRI) and W. Lewandowski on time comparisons between two Western European time laboratories and VNIIFTRI, laboratories equipped with GPS time receivers and contributing to TAI; it also notes that the last GPS antenna position determined by the BIPM was installed near Moscow at VNIIFTRI, and cites the corresponding paper as accepted in Proc. 23rd Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting, 1991.

  15. Sample-interpolation timing: an optimized technique for the digital measurement of time of flight for γ rays and neutrons at relatively low sampling rates

    International Nuclear Information System (INIS)

    Aspinall, M D; Joyce, M J; Mackin, R O; Jarrah, Z; Boston, A J; Nolan, P J; Peyton, A J; Hawkes, N P

    2009-01-01

A unique digital time pick-off method, known as sample-interpolation timing (SIT), is described. This method demonstrates the possibility of improved timing resolution for the digital measurement of time of flight compared with digital replica-analogue time pick-off methods for signals sampled at relatively low rates. Three analogue timing methods have been replicated in the digital domain (leading-edge, crossover and constant-fraction timing) for pulse data sampled at 8 GSa s⁻¹. Events arising from the ⁷Li(p,n)⁷Be reaction have been detected with an EJ-301 organic liquid scintillator and recorded with a fast digital sampling oscilloscope. Sample-interpolation timing was developed solely for the digital domain and thus performs more efficiently on digital signals compared with analogue time pick-off methods replicated digitally, especially for fast signals that are sampled at rates that current affordable and portable devices can achieve. Sample interpolation can be applied to any analogue timing method replicated digitally and thus also has the potential to exploit the generic capabilities of analogue techniques with the benefits of operating in the digital domain. A threshold in sampling rate with respect to the signal pulse width is observed beyond which further improvements in timing resolution are not attained. This advance is relevant to many applications in which time-of-flight measurement is essential.
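    The paper's own SIT algorithm is not reproduced in this record, but the general idea of recovering a sub-sample arrival time by interpolating between digitized samples can be sketched as follows. This is a minimal illustration, not the authors' method; the Gaussian test pulse, the 1 GSa/s rate and the 50% threshold fraction are arbitrary choices for the example.

```python
import numpy as np

def interpolated_threshold_time(samples, dt, fraction=0.5):
    """Sub-sample pulse arrival time via linear interpolation at a fractional threshold.

    samples  : baseline-subtracted, digitized pulse amplitudes
    dt       : sampling period in seconds
    fraction : threshold expressed as a fraction of the pulse peak
    """
    threshold = fraction * samples.max()
    idx = int(np.argmax(samples >= threshold))   # first sample at or above threshold
    if idx == 0:
        return 0.0
    y0, y1 = samples[idx - 1], samples[idx]
    frac = (threshold - y0) / (y1 - y0)          # position of the crossing between samples
    return (idx - 1 + frac) * dt

# Example: a synthetic fast pulse sampled at 1 GSa/s (dt = 1 ns)
t = np.arange(0, 64e-9, 1e-9)
pulse = np.exp(-((t - 20e-9) / 4e-9) ** 2)
print(interpolated_threshold_time(pulse, 1e-9))  # arrival estimate in seconds
```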

  16. Perceptions of Time and Long Time Intervals

    Energy Technology Data Exchange (ETDEWEB)

    Drottz-Sjoeberg, Britt-Marie [Norwegian Univ. of Science and Technology, Trondheim (Norway). Dept. of Psychology

    2006-09-15

There are certainly many perspectives presented in the literature on time and time perception. This contribution has focused on perceptions of the time frames related to the risk and danger of radiation from a planned Swedish repository for spent nuclear fuel. Respondents from two municipalities judged SSI's reviews of the entrepreneur's plans and work to be of high importance, and the more so the closer to our time the estimate was given. Similarly, the consequences of potential leakage from a repository were perceived as more serious the closer they would be to our time. Judgements of risks related to the storage of spent nuclear fuel were moderately large on the measurement scales used. Experts are experts because they have more knowledge, and in this context they underlined, for example, the importance of reviews of the radiation situation for time periods up to 100,000 years. It was of interest to note that 55% of the respondents from the municipalities did not believe that the future repository would leak radioactivity. They were much more pessimistic with respect to world politics, i.e. a new world war. However, with respect to the seriousness of the consequences given a leakage from the repository, the public group consistently gave high risk estimates, often significantly higher than those of the expert group. The underestimations of time estimates, as seen in the tasks of pinpointing historic events, provide examples of the difficulty of making estimations involving long times. Similar results showed that thinking of 'the future' most often involved about 30 years. On average, people reported memories of about 2.5 generations back in time, and emotional relationships stretching approximately 2.5 generations into the future; 94% of the responses concerning how many future generations one had an emotional relationship with were given in the range of 1-5 generations. Similarly, Svenson and Nilsson found the opinion that the current generations

  17. An actual load forecasting methodology by interval grey modeling based on the fractional calculus.

    Science.gov (United States)

    Yang, Yang; Xue, Dingyü

    2017-07-17

The operating processes of a thermal power plant are measured as real-time data, and a large number of historical interval data can be obtained from the dataset. Within defined periods of time, this interval information can provide important input for decision making and equipment maintenance. Actual load is one of the most important parameters, and the trends hidden in the historical data reflect the overall operating status of the equipment. However, modeling and prediction with interval grey numbers is more complicated than with real numbers. In order not to lose any information, this paper represents each interval by its geometric coordinate features, namely the area and the middle-point line, which are proved to carry the same information as the original interval data. A grey prediction model for interval grey numbers based on fractional-order accumulation is proposed. Compared with the integer-order model, the proposed method has more degrees of freedom and better modeling and prediction performance, and it can be widely applied to modeling and prediction for small samples of historical interval sequences from industry. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
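    The fractional-order accumulation that underlies this family of grey models can be illustrated with a short sketch. The weights below follow the usual r-order accumulated generating operation (r-AGO) found in the fractional grey modelling literature; the sample sequence and the order r = 0.5 are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.special import gamma

def fractional_accumulation(x, r):
    """r-order accumulated generating operation (r-AGO) of a sequence.

    For r = 1 this reduces to the ordinary cumulative sum used in GM(1,1);
    fractional r gives the modeller the extra freedom discussed above.
    """
    x = np.asarray(x, dtype=float)
    out = np.zeros(len(x))
    for k in range(len(x)):
        for i in range(k + 1):
            # binomial-type weight C(k - i + r - 1, k - i) written with gamma functions
            w = gamma(r + k - i) / (gamma(k - i + 1) * gamma(r))
            out[k] += w * x[i]
    return out

series = [2.1, 2.3, 2.2, 2.6, 2.8]
print(fractional_accumulation(series, 1.0))  # identical to np.cumsum(series)
print(fractional_accumulation(series, 0.5))  # fractional-order accumulation
```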

  18. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    Science.gov (United States)

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We
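    The better-performing strategy described above, fitting a parametric CDF to the cumulative distribution of interval-censored observations, can be sketched in a few lines. The lognormal choice, the 2-hour sampling intervals and the least-squares fit below are illustrative assumptions, not the study's exact protocol.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Simulated "true" retention times (hours) with a known lognormal distribution
true_times = rng.lognormal(mean=1.0, sigma=0.5, size=500)

# Discretize into pre-established sampling intervals (here: every 2 h up to 24 h)
edges = np.arange(0.0, 26.0, 2.0)
counts, _ = np.histogram(true_times, bins=edges)

# Empirical cumulative proportion at the upper bound of each interval
upper = edges[1:]
ecdf = np.cumsum(counts) / counts.sum()

# Fit a lognormal CDF to the (upper bound, cumulative proportion) points
def lognorm_cdf(t, mu, sigma):
    return stats.lognorm.cdf(t, s=sigma, scale=np.exp(mu))

(mu_hat, sigma_hat), _ = curve_fit(lognorm_cdf, upper, ecdf, p0=(0.5, 1.0))
print(mu_hat, sigma_hat)  # should recover values close to (1.0, 0.5)
```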

  19. Rapid determination of long-lived artificial alpha radionuclides using time interval analysis

    International Nuclear Information System (INIS)

    Uezu, Yasuhiro; Koarashi, Jun; Sanada, Yukihisa; Hashimoto, Tetsuo

    2003-01-01

It is important to monitor long-lived alpha radionuclides such as plutonium (²³⁸Pu, ²³⁹⁺²⁴⁰Pu) in working areas and in the environment of nuclear fuel cycle facilities, because the potential cancer risk from alpha radiation is known to be higher than that from gamma radiation. Such monitoring therefore requires high sensitivity, high resolution and rapid determination in order to measure very low concentrations of plutonium isotopes. In such high-sensitivity monitoring, natural radionuclides, including radon (²²²Rn or ²²⁰Rn) and their progenies, should be suppressed as far as possible. For this purpose, a discrimination method between Pu and the progenies of ²²²Rn or ²²⁰Rn using time interval analysis (TIA) was designed and developed; TIA subtracts short-lived radionuclides by computing the time-interval distributions of successive alpha and beta decay events on millisecond or microsecond scales. In this system, alpha rays from ²¹⁴Po, ²¹⁶Po and ²¹²Po can be extracted. The TIA measuring system consists of a silicon surface barrier detector (SSD), an amplifier, an analog-to-digital converter (ADC), a multi-channel analyzer (MCA), a high-resolution timer (TIMER), a multi-parameter collector and a personal computer. Incident alpha and beta pulses from the ADC are sent to the MCA and the TIMER simultaneously, and the pulses are combined by the multi-parameter collector. After measurement, the natural radionuclides are subtracted. Airborne particles were collected on a membrane filter for 60 minutes at 100 L/min, and small Pu particles were added to its surface. Alpha and beta rays were measured, and the natural radionuclides were subtracted by TIA within 5 × 145 ms. As a result, Pu hidden in the natural background could be recognized clearly. The lower limit of determination of ²³⁹Pu is calculated as 6×10⁻⁹ Bq/cm³, which satisfies the derived air concentration (DAC) for ²³⁹Pu (8×10⁻⁹ Bq/cm³).
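    The discrimination principle behind TIA (short-lived parent-daughter chains produce an excess of very short intervals between successive events, while uncorrelated background events do not) can be illustrated with a simplified sketch. The event rates, the 145 ms mean delay and the acceptance window below are illustrative and do not reproduce the authors' electronics or subtraction procedure.

```python
import numpy as np

def correlated_pair_fraction(event_times, window):
    """Fraction of successive event pairs separated by less than `window` seconds."""
    dt = np.diff(np.sort(np.asarray(event_times)))
    return float(np.mean(dt < window))

rng = np.random.default_rng(0)

# Uncorrelated background at roughly 0.1 counts per second over one hour
background = np.cumsum(rng.exponential(scale=10.0, size=360))

# Correlated chains: each parent decay is followed ~145 ms later by a daughter decay
parents = rng.uniform(0.0, 3600.0, size=100)
daughters = parents + rng.exponential(scale=0.145, size=100)
events = np.concatenate([background, parents, daughters])

window = 5 * 0.145   # accept pairs within about five mean parent-daughter delays
print(correlated_pair_fraction(background, window))  # low: chance coincidences only
print(correlated_pair_fraction(events, window))      # clearly elevated by the correlated chains
```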

  20. Sampled-data and discrete-time H2 optimal control

    NARCIS (Netherlands)

    Trentelman, Harry L.; Stoorvogel, Anton A.

    1993-01-01

    This paper deals with the sampled-data H2 optimal control problem. Given a linear time-invariant continuous-time system, the problem of minimizing the H2 performance over all sampled-data controllers with a fixed sampling period can be reduced to a pure discrete-time H2 optimal control problem. This

  1. Time interval between cover crop termination and planting influences corn seedling disease, plant growth, and yield

    Science.gov (United States)

    Experiments were established in controlled and field environment to evaluate the effect of time intervals between cereal rye cover crop termination and corn planting on corn seedling disease, corn growth, and grain yield in 2014 and 2015. Rye termination dates ranged from 25 days before planting (DB...

  2. On solving wave equations on fixed bounded intervals involving Robin boundary conditions with time-dependent coefficients

    Science.gov (United States)

    van Horssen, Wim T.; Wang, Yandong; Cao, Guohua

    2018-06-01

In this paper, it is shown how characteristic coordinates, or equivalently how the well-known formula of d'Alembert, can be used to solve initial-boundary value problems for wave equations on fixed, bounded intervals involving Robin-type boundary conditions with time-dependent coefficients. A Robin boundary condition is a condition that specifies a linear combination of the dependent variable and its first-order space derivative on a boundary of the interval. Analytical methods, such as the method of separation of variables (SOV) or the Laplace transform method, are not applicable to these types of problems. The analytical results obtained by applying the proposed method are in complete agreement with those obtained by using the numerical finite difference method. For problems with time-independent coefficients in the Robin boundary condition(s), the results of the proposed method also agree completely with those obtained, for instance, by the method of separation of variables or by the finite difference method.
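    Written out, a Robin condition of the kind described above, imposed at the two ends of a fixed interval 0 ≤ x ≤ L with time-dependent coefficients, takes a form such as the following (the coefficient symbols are illustrative, not the paper's notation):

```latex
\alpha(t)\,u(0,t) + \beta(t)\,u_x(0,t) = f_0(t), \qquad
\gamma(t)\,u(L,t) + \delta(t)\,u_x(L,t) = f_L(t).
```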

  3. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    Science.gov (United States)

    Tao, Guohua; Miller, William H

    2011-07-14

An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be applied generally to sample rare events efficiently while avoiding becoming trapped in a local region of the phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
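    The SC-IVR machinery itself is beyond a short example, but the underlying importance-sampling idea (draw samples from a biased proposal and reweight them so the estimator still targets the original average) can be sketched generically. The rare-event indicator and the shifted-normal proposal below are illustrative stand-ins, not the authors' sampling function.

```python
import numpy as np

rng = np.random.default_rng(42)

# Target: E[f(x)] under a standard normal, where f is nonzero only in a rare region
f = lambda x: (x > 4.0).astype(float)            # P(x > 4) is about 3.2e-5

# Plain Monte Carlo almost never visits the important region
x_plain = rng.normal(0.0, 1.0, size=100_000)
print("plain MC  :", f(x_plain).mean())

# Importance sampling: draw from a proposal shifted into the important region
mu_prop = 4.0
x_is = rng.normal(mu_prop, 1.0, size=100_000)
log_w = -0.5 * x_is**2 + 0.5 * (x_is - mu_prop) ** 2    # target density / proposal density
print("importance:", np.mean(f(x_is) * np.exp(log_w)))  # close to 3.2e-5 with far less noise
```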

  4. Interval selection with machine-dependent intervals

    OpenAIRE

    Bohmova K.; Disser Y.; Mihalak M.; Widmayer P.

    2013-01-01

We study an offline interval scheduling problem where every job has exactly one associated interval on every machine. To schedule a set of jobs, exactly one of the intervals associated with each job must be selected, and the intervals selected on the same machine must not intersect. We show that deciding whether all jobs can be scheduled is NP-complete already in various simple cases. In particular, by showing the NP-completeness for the case when all the intervals associated with the same job...

  5. Experimental and numerical investigation of low-drag intervals in turbulent boundary layer

    Science.gov (United States)

    Park, Jae Sung; Ryu, Sangjin; Lee, Jin

    2017-11-01

Wall-bounded shear flows are known to exhibit substantial intermittency between high- and low-drag states. Recent experimental and computational studies in a turbulent channel flow have identified low-drag time intervals based on wall shear stress measurements. These intervals are a weak turbulence state characterized by low-speed streaks and weak streamwise vortices. In this study, the spatiotemporal dynamics of low-drag intervals in a turbulent boundary layer is investigated using experiments and simulations. The low-drag intervals are monitored based on the wall shear stress measurement. We show that, near the wall, conditionally sampled mean velocity profiles during low-drag intervals closely approach that of a low-drag nonlinear traveling wave solution as well as that of the so-called maximum drag reduction asymptote. This observation is consistent with the channel flow studies. Interestingly, a large spatial stretching of the streaks in the wall-normal direction is very evident during low-drag intervals. Lastly, a possible connection between the mean velocity profile during the low-drag intervals and the Blasius profile will be discussed. This work was supported by startup funds from the University of Nebraska-Lincoln.

  6. The Effect of Fluoridated and Non Fluoridated Mouth Washes on Color Stability of Different Aesthetic Arch Wires At Different Time Intervals (An in Vitro Study

    Directory of Open Access Journals (Sweden)

    Lubna Maky Hussein

    2018-01-01

Background: The color stability of aesthetic arch wires is an important factor in the success of aesthetic orthodontic treatment, but the color of these arch wires tends to change with time. This study was performed to assess the effect of two types of mouth washes on the color stability of different types of aesthetic arch wires at different time intervals. Materials and methods: Four brands of nickel-titanium coated aesthetic arch wires were used: epoxy coated (Orthotechnology and G&H) and Teflon coated (Dany and Hubit). Thirty-six samples were prepared, each containing ten halves of the aesthetic arch wires. They were divided into three groups according to the immersion media (distilled water as a control medium, Listerine with fluoride, and Listerine without fluoride) and immersed for 30 seconds twice daily according to the manufacturer's instructions. Color change was measured after 1 week, 3 weeks and 6 weeks using a VITA Easyshade Compact spectrophotometer according to the Commission Internationale de l'Eclairage L*a*b* color space system. Results: There were highly significant differences in the color change values of the aesthetic arch wires among all immersion media at the different time intervals, and the color change value increased as the immersion time increased. Additionally, Listerine with fluoride caused higher color change values than Listerine without fluoride; Hubit aesthetic arch wires were the least color stable, while Orthotechnology aesthetic arch wires were the most color stable. Conclusions: The daily use of Listerine mouth washes could affect the color stability of aesthetic arch wires. All tested aesthetic arch wires showed color changes to varying degrees, but some of these changes were not visible, others were clinically acceptable, and the remainder were clinically unacceptable.
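    The color-change values referred to above are conventionally derived from two L*a*b* readings; assuming the standard CIE76 color difference (the record does not state which ΔE formula the spectrophotometer software applies), the quantity compared across time intervals is:

```latex
\Delta E^{*}_{ab} \;=\; \sqrt{(\Delta L^{*})^{2} + (\Delta a^{*})^{2} + (\Delta b^{*})^{2}}
```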

  7. Complete Blood Count Reference Intervals for Healthy Han Chinese Adults

    Science.gov (United States)

    Mu, Runqing; Guo, Wei; Qiao, Rui; Chen, Wenxiang; Jiang, Hong; Ma, Yueyun; Shang, Hong

    2015-01-01

    Background Complete blood count (CBC) reference intervals are important to diagnose diseases, screen blood donors, and assess overall health. However, current reference intervals established by older instruments and technologies and those from American and European populations are not suitable for Chinese samples due to ethnic, dietary, and lifestyle differences. The aim of this multicenter collaborative study was to establish CBC reference intervals for healthy Han Chinese adults. Methods A total of 4,642 healthy individuals (2,136 males and 2,506 females) were recruited from six clinical centers in China (Shenyang, Beijing, Shanghai, Guangzhou, Chengdu, and Xi’an). Blood samples collected in K2EDTA anticoagulant tubes were analyzed. Analysis of variance was performed to determine differences in consensus intervals according to the use of data from the combined sample and selected samples. Results Median and mean platelet counts from the Chengdu center were significantly lower than those from other centers. Red blood cell count (RBC), hemoglobin (HGB), and hematocrit (HCT) values were higher in males than in females at all ages. Other CBC parameters showed no significant instrument-, region-, age-, or sex-dependent difference. Thalassemia carriers were found to affect the lower or upper limit of different RBC profiles. Conclusion We were able to establish consensus intervals for CBC parameters in healthy Han Chinese adults. RBC, HGB, and HCT intervals were established for each sex. The reference interval for platelets for the Chengdu center should be established independently. PMID:25769040

  8. Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats

    Science.gov (United States)

    Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.

    2012-01-01

    This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCP from moving boats from three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
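    The dependence of the random error on sampling time follows the standard result for the variance of a time average of a correlated (turbulent) signal; a commonly used form, which may differ in detail from the authors' model, is:

```latex
\operatorname{Var}(\bar{u}) \;\approx\; \frac{2\,T_I\,\sigma_u^{2}}{T}, \qquad T \gg T_I ,
```

    where T is the exposure (averaging) time, T_I is the integral time scale of the sampled flow field and σ_u² is the velocity variance; longer exposure times therefore reduce the random error of the discharge estimate roughly in proportion to 1/T.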

  9. Sampling returns for realized variance calculations: tick time or transaction time?

    NARCIS (Netherlands)

    Griffin, J.E.; Oomen, R.C.A.

    2008-01-01

    This article introduces a new model for transaction prices in the presence of market microstructure noise in order to study the properties of the price process on two different time scales, namely, transaction time where prices are sampled with every transaction and tick time where prices are

  10. 33 CFR 150.503 - What are the time interval requirements for maintenance on survival craft falls?

    Science.gov (United States)

    2010-07-01

33 CFR 150.503 (Navigation and Navigable Waters): What are the time interval requirements for maintenance on survival craft falls? (a) Each fall used in a launching device for survival craft or rescue...

  11. Reference intervals and variation for urinary epinephrine, norepinephrine and cortisol in healthy men and women in Denmark

    DEFF Research Database (Denmark)

    Hansen, Åse Marie; Garde, A H; Christensen, J M

    2001-01-01

Reference intervals for urinary epinephrine, norepinephrine and cortisol in 120 healthy individuals performing their routine work were established according to the International Union of Pure and Applied Chemistry (IUPAC) and the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) for use in the risk assessment of exposure to occupational stress. Reference intervals were established for three different times of the day: in morning samples (05.45-07.15) the limit of detection (LOD) was 2.10 micromol epinephrine/mol creatinine (82 women) and 2.86 micromol epinephrine/mol creatinine (37 men), and the reference interval was 3.6-29.1 micromol norepinephrine/mol creatinine and 2.3-52.8 micromol cortisol/mol creatinine (119 women and men); in afternoon samples (15.30-18.30) the reference interval was 0.64-10.8 micromol epinephrine/mol creatinine (82 women), 1.20-11.2 micromol

  12. Accurate Sample Time Reconstruction of Inertial FIFO Data

    Directory of Open Access Journals (Sweden)

    Sebastian Stieber

    2017-12-01

In the context of modern cyber-physical systems, the accuracy of the underlying sensor data plays an increasingly important role in sensor data fusion and feature extraction. The raw events of multiple sensors have to be aligned in time to enable high-quality sensor fusion results. However, the growing number of simultaneously connected sensor devices makes energy-saving data acquisition and processing more and more difficult. Hence, most modern sensors offer a first-in-first-out (FIFO) interface to store multiple data samples and to relax timing constraints when handling multiple sensor devices. However, using the FIFO interface increases the negative influence of individual clock drifts, introduced by fabrication inaccuracies, temperature changes and wear-out effects, on the reconstruction of sample times. Furthermore, additional timing offset errors due to communication and software latencies increase with a growing number of sensor devices. In this article, we present an approach for accurate sample time reconstruction, independent of the actual clock drift, with the help of an internal sensor timer. Such timers are already available in modern sensors manufactured in micro-electromechanical systems (MEMS) technology. The presented approach focuses on calculating accurate time stamps using the sensor FIFO interface in a forward-only processing manner as a robust and energy-saving solution. The proposed algorithm is able to lower the overall standard deviation of the reconstructed sampling periods below 40 µs, while run-time savings of up to 42% are achieved compared to single-sample acquisition.
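    A minimal sketch of this kind of forward-only reconstruction is given below: the sensor's internal timer fixes the relative spacing of the samples in a FIFO batch, while two host-side read timestamps rescale that spacing so the sensor's clock drift is absorbed. The function name, arguments and the 2% drift in the example are illustrative assumptions, not the article's algorithm.

```python
def reconstruct_sample_times(host_prev, host_now, timer_ticks, tick_seconds):
    """Assign a timestamp to each sample read from a sensor FIFO batch.

    host_prev, host_now : host-clock times (s) of the previous and current FIFO reads
    timer_ticks         : internal sensor-timer ticks elapsed for each sample in the batch
    tick_seconds        : nominal duration of one sensor timer tick (s)
    """
    sensor_elapsed = [ticks * tick_seconds for ticks in timer_ticks]
    total_sensor = sum(sensor_elapsed)
    # Rescale the sensor's notion of elapsed time onto the host clock interval,
    # which absorbs the sensor's (drifting) clock rate.
    scale = (host_now - host_prev) / total_sensor if total_sensor > 0 else 1.0

    times, t = [], host_prev
    for dt in sensor_elapsed:
        t += dt * scale
        times.append(t)
    return times

# Five samples nominally 10 ms apart, but the sensor clock is ~2% off nominal
print(reconstruct_sample_times(0.0, 0.051, [1000] * 5, 10e-6))
```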

  13. Determination of the postmortem interval by Laser Induced Breakdown Spectroscopy using swine skeletal muscles

    International Nuclear Information System (INIS)

    Marín-Roldan, A.; Manzoor, S.; Moncayo, S.; Navarro-Villoslada, F.; Izquierdo-Hornillos, R.C.; Caceres, J.O.

    2013-01-01

    Skin and muscle samples are useful to discriminate individuals as well as their postmortem interval (PMI) in crime scenes and natural or caused disasters. In this study, a simple and fast method based on Laser Induced Breakdown Spectroscopy (LIBS) has been developed to estimate PMI using swine skeletal muscle samples. Environmental conditions (moisture, temperature, fauna, etc.) having strong influence on the PMI determination were considered. Time-dependent changes in the emission intensity ratio for Mg, Na, Hα and K were observed, as a result of the variations in their concentration due to chemical reactions in tissues and were correlated with PMI. This relationship, which has not been reported previously in the forensic literature, offers a simple and potentially valuable means of estimating the PMI. - Highlights: • LIBS has been applied for Postmortem Interval estimation. • Environmental and sample storage conditions have been considered. • Significant correlation of elemental emission intensity with PMI has been observed. • Pig skeletal muscle samples have been used

  14. Determination of the postmortem interval by Laser Induced Breakdown Spectroscopy using swine skeletal muscles

    Energy Technology Data Exchange (ETDEWEB)

    Marín-Roldan, A.; Manzoor, S.; Moncayo, S.; Navarro-Villoslada, F.; Izquierdo-Hornillos, R.C.; Caceres, J.O., E-mail: jcaceres@quim.ucm.es

    2013-10-01

    Skin and muscle samples are useful to discriminate individuals as well as their postmortem interval (PMI) in crime scenes and natural or caused disasters. In this study, a simple and fast method based on Laser Induced Breakdown Spectroscopy (LIBS) has been developed to estimate PMI using swine skeletal muscle samples. Environmental conditions (moisture, temperature, fauna, etc.) having strong influence on the PMI determination were considered. Time-dependent changes in the emission intensity ratio for Mg, Na, Hα and K were observed, as a result of the variations in their concentration due to chemical reactions in tissues and were correlated with PMI. This relationship, which has not been reported previously in the forensic literature, offers a simple and potentially valuable means of estimating the PMI. - Highlights: • LIBS has been applied for Postmortem Interval estimation. • Environmental and sample storage conditions have been considered. • Significant correlation of elemental emission intensity with PMI has been observed. • Pig skeletal muscle samples have been used.

  15. Synchronization of Markovian jumping stochastic complex networks with distributed time delays and probabilistic interval discrete time-varying delays

    International Nuclear Information System (INIS)

    Li Hongjie; Yue Dong

    2010-01-01

The paper investigates the synchronization stability problem for a class of complex dynamical networks with Markovian jumping parameters and mixed time delays. The complex networks consist of m modes, and the networks switch from one mode to another according to a Markov chain with known transition probabilities. The mixed time delays are composed of discrete and distributed delays; the discrete time delay is assumed to be random and its probability distribution is known a priori. In terms of the probability distribution of the delays, a new type of system model with probability-distribution-dependent parameter matrices is proposed. Based on stochastic analysis techniques and the properties of the Kronecker product, delay-dependent synchronization stability criteria in the mean square are derived in the form of linear matrix inequalities, which can be readily solved using the LMI toolbox in MATLAB; the solvability of the derived conditions depends not only on the size of the delay but also on the probability of the delay taking values in certain intervals. Finally, a numerical example is given to illustrate the feasibility and effectiveness of the proposed method.

  16. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    Science.gov (United States)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to model formally a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending of what is true in a system.

  17. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    Science.gov (United States)

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
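    One widely used nonuniform sampling strategy in the indirect dimension is an exponentially biased schedule that samples early evolution times, where the decaying signal is strongest, more densely. The sketch below only generates such a schedule; the 25% coverage and decay constant are illustrative choices, and processing the resulting data would require one of the non-Fourier methods discussed in the review.

```python
import numpy as np

def exponential_nus_schedule(n_total, n_sampled, decay=2.0, seed=0):
    """Choose `n_sampled` of `n_total` indirect-dimension increments, biased
    toward early evolution times where the decaying signal is strongest."""
    rng = np.random.default_rng(seed)
    idx = np.arange(n_total)
    weights = np.exp(-decay * idx / n_total)
    chosen = rng.choice(idx, size=n_sampled, replace=False, p=weights / weights.sum())
    return np.sort(chosen)

# Keep 25% of 256 increments
print(exponential_nus_schedule(256, 64))
```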

  18. Analysis of the Factors Affecting the Interval between Blood Donations Using Log-Normal Hazard Model with Gamma Correlated Frailties.

    Science.gov (United States)

    Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

The time until blood donation plays a major role in a regular donor becoming a continuing one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 first-time donors at the Shahrekord Blood Transfusion Center, in the capital of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and followed up for five years. Among these, a subset of 424 donors who had at least two successful blood donations was chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, place of residence and job were recorded as independent variables. Data analysis was based on a log-normal hazard model with gamma correlated frailties; in this model, the frailties are the sum of two independent components assumed to follow a gamma distribution. The analysis was carried out via a Bayesian approach using a Markov chain Monte Carlo algorithm in OpenBUGS, and convergence was checked via the Gelman-Rubin criterion using the BOA package in R. Age, job and education had significant effects on the chance of donating blood. The chance of donation was higher for older donors, clerical workers, labourers, the self-employed, students and more educated donors, and correspondingly the time intervals between their blood donations were shorter. Given the significant effects of some variables in the log-normal correlated frailty model, educational and cultural programmes should be planned to encourage people with longer inter-donation intervals to donate more frequently.

  19. Irrigation Water Sources and Time Intervals as Variables on the Presence of Campylobacter spp. and Listeria monocytogenes on Romaine Lettuce Grown in Muck Soil.

    Science.gov (United States)

    Guévremont, Evelyne; Lamoureux, Lisyanne; Généreux, Mylène; Côté, Caroline

    2017-07-01

    Irrigation water has been identified as a possible source of vegetable contamination by foodborne pathogens. Risk management for pathogens such as Campylobacter spp. and Listeria monocytogenes in fields can be influenced by the source of the irrigation water and the time interval between last irrigation and harvest. Plots of romaine lettuce were irrigated with manure-contaminated water or aerated pond water 21, 7, or 3 days prior to harvesting, and water and muck soil samples were collected at each irrigation treatment. Lettuce samples were collected at the end of the trials. The samples were tested for the presence of Campylobacter spp. and L. monocytogenes. Campylobacter coli was isolated from 33% of hog manure samples (n = 9) and from 11% of the contaminated water samples (n = 27), but no lettuce samples were positive (n = 288). L. monocytogenes was not found in manure, and only one sample of manure-contaminated irrigation water (n = 27) and one lettuce sample (n = 288) were positive. No Campylobacter or L. monocytogenes was recovered from the soil samples (n = 288). Because of the low incidence of pathogens, it was not possible to link the contamination of either soil or lettuce with the type of irrigation water. Nevertheless, experimental field trials mimicking real conditions provide new insights into the survival of two significant foodborne pathogens on romaine lettuce.

  20. Sample acceptance time criteria, electronic issue and alloimmunisation in thalassaemia.

    Science.gov (United States)

    Trompeter, S; Baxter, L; McBrearty, M; Zatkya, E; Porter, J

    2015-12-01

To determine the safety of a 1-week acceptance criterion, from sample receipt in the laboratory to transfusion commencement, in transfusion-dependent thalassaemia with respect to alloimmunisation, and to determine the safety of electronic issue of blood components in such a setting. Retrospective audit of alloimmunisation (1999-2012) and blood exposure in registered thalassaemia patients at a central London thalassaemia centre, where the acceptance criterion for the group-and-save sample, from arrival in the laboratory to the time of issue of blood for transfusion for someone transfused in the last 28 days, was 1 week, and where there was an electronic issue protocol for patients who have always had a negative antibody screen (other than temporary positivity in pregnant women receiving prophylactic anti-D, or anti-Le-a, anti-Le-b and anti-P1 that are no longer detectable). There were 133 patients with thalassaemia variants regularly attending UCLH for review. A total of 105 patients had transfusion-dependent thalassaemia (TDT) (7 E-beta thalassaemia, 98 beta thalassaemia major). Ten of the 84 patients who received their transfusions at UCLH were alloimmunised; seven of them had been alloimmunised prior to arrival at UCLH, and only two patients developed antibodies at UCLH during this period. The prevalence of alloantibody formation of 2% in UCLH-transfused patients, with a presumptive incidence of 0.01 alloantibodies per 100 units or 0.001 immunisations per person per year, compares favourably with other reported series and suggests that a 1-week interval with appropriate electronic issue is acceptable practice. © 2015 British Blood Transfusion Society.

  1. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    Science.gov (United States)

    Lo, Y C; Armbruster, David A

    2012-04-01

Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Intervals cited from the historical literature are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for the genders combined, and gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adapt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
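    The 95% central range mentioned above is, in its simplest nonparametric form, just the 2.5th-97.5th percentile band of the reference sample, computed per gender where a gender difference is found. The sketch below illustrates that calculation on synthetic data; it is not the EP Evaluator procedure and the numbers are invented.

```python
import numpy as np

def reference_interval(values, low=2.5, high=97.5):
    """Nonparametric reference interval: the central 95% of the reference sample."""
    return np.percentile(np.asarray(values, dtype=float), [low, high])

rng = np.random.default_rng(7)
# Synthetic reference samples for a hypothetical analyte, split by gender
male = rng.normal(loc=140.0, scale=8.0, size=240)
female = rng.normal(loc=125.0, scale=7.0, size=260)

print("male  :", reference_interval(male))
print("female:", reference_interval(female))
```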

  2. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

The objective of this study is to develop a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods, which can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system demonstrates the feasibility of this method

  3. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul Nationl Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

The objective of this study is to develop a methodology for assessing and optimizing the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods, which can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results produced by domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system demonstrates the feasibility of this method.

  4. Digital timing: sampling frequency, anti-aliasing filter and signal interpolation filter dependence on timing resolution

    International Nuclear Information System (INIS)

    Cho, Sanghee; Grazioso, Ron; Zhang Nan; Aykac, Mehmet; Schmand, Matthias

    2011-01-01

The main focus of our study is to investigate how the performance of digital timing methods is affected by the sampling rate, anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions, such as: what is the minimum sampling frequency? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, the aliasing effect produces artifacts in the timing resolution estimates: the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally above the Nyquist rate, proper signal interpolation is important. A sharp roll-off (higher-order) filter is required to separate the baseband signal from its replicates to avoid aliasing, but in return the computational cost is higher. We demonstrated the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed that there is no significant timing resolution degradation down to a 1.3 GHz sampling frequency, and the computational requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool to check that a given timing pick-off method maintains constant timing resolution regardless of the source location. Lastly, a performance comparison for several digital timing methods is also shown.

  5. Behavioural sampling techniques and activity pattern of Indian Pangolin Manis crassicaudata (Mammalia: Manidae in captivity

    Directory of Open Access Journals (Sweden)

    R.K. Mohapatra

    2013-12-01

The study presents data on six Indian Pangolins Manis crassicaudata observed in captivity at the Pangolin Conservation Breeding Centre, Nandankanan, Odisha, India, over 1377 hours of video recordings for each pangolin between 15:00 hr and 08:00 hr on 81 consecutive observational days. Video recordings were made through digital systems assisted by infrared-enabled CCTV cameras. The data highlight patterns relating to 12 different behaviours and to enclosure utilization. Different interval periods for sampling instantaneous behaviour from the video recordings were evaluated to develop optimal study methods for the future. The activity budgets of the pangolins displayed natural patterns of nocturnal activity, with a peak between 20:00-21:00 hr. When out of their burrow, they spent about 59% of the time walking in the enclosure and 14% of the time feeding. The repeatability of the behaviours has a significant negative correlation with the mean time spent in each behaviour. Focal behavioural samples correlated significantly with instantaneous samples up to a 15-minute sampling interval, and the correlation values gradually decreased as the sampling interval increased. The results indicate that results obtained from focal sampling and instantaneous sampling with relatively short intervals (≤5 minutes) are about equally reliable. The study suggests the use of focal sampling, instead of instantaneous sampling, to record behaviour relating to social interactions.
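    The comparison of instantaneous-sampling intervals against the continuous (focal) record can be mimicked with a small simulation: generate a minute-resolution activity record for each observation session, subsample it at different scan intervals, and correlate the two estimates of time spent in the behaviour. The bout durations and session length below are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_session(minutes=17 * 60, mean_bout=6.0):
    """Minute-resolution record: 1 while the focal behaviour occurs, 0 otherwise."""
    state, rec = 0, []
    while len(rec) < minutes:
        bout = max(1, int(rng.exponential(mean_bout)))
        rec.extend([state] * bout)
        state = 1 - state
    return np.array(rec[:minutes])

sessions = [simulate_session() for _ in range(81)]      # 81 observation days
focal = np.array([s.mean() for s in sessions])          # continuous (focal) proportion

for interval in (5, 15, 30, 60):                        # minutes between instantaneous scans
    scans = np.array([s[::interval].mean() for s in sessions])
    r = np.corrcoef(focal, scans)[0, 1]
    print(f"{interval:>2}-min scans: r = {r:.2f}")      # r typically drops as the interval grows
```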

  6. Multiple-step fault estimation for interval type-II T-S fuzzy system of hypersonic vehicle with time-varying elevator faults

    Directory of Open Access Journals (Sweden)

    Jin Wang

    2017-03-01

This article proposes a multiple-step fault estimation algorithm for hypersonic flight vehicles that uses an interval type-II Takagi–Sugeno fuzzy model. First, an interval type-II Takagi–Sugeno fuzzy model is developed to approximate the nonlinear dynamic system and to handle the parameter uncertainties of the hypersonic vehicle. Then, a multiple-step time-varying additive fault estimation algorithm is designed to estimate time-varying additive elevator faults of hypersonic flight vehicles. Finally, simulations are conducted for both modeling and fault estimation, and the validity and effectiveness of the method are verified by a series of comparisons of numerical simulation results.

  7. Confidence interval procedures for Monte Carlo transport simulations

    International Nuclear Information System (INIS)

    Pederson, S.P.

    1997-01-01

The problem of obtaining valid confidence intervals based on estimates from sampled distributions using Monte Carlo particle transport simulation codes such as MCNP is examined. Such intervals can cover the true parameter of interest at a lower than nominal rate if the sampled distribution is extremely right-skewed by large tallies. Modifications to the standard theory of confidence intervals are discussed and compared with some existing heuristics, including batched means normality tests. Two new types of diagnostics are introduced to assess whether the conditions of central limit theorem-type results are satisfied: the relative variance of the variance determines whether the sample size is sufficiently large, and estimators of the slope of the right tail of the distribution are used to indicate the number of moments that exist. A simulation study is conducted to quantify the relationship between various diagnostics and coverage rates and to find sample-based quantities useful in indicating when intervals are expected to be valid. Simulated tally distributions are chosen to emulate behavior seen in difficult particle transport problems. Measures of variation in the sample variance s² are found to be much more effective than existing methods in predicting when coverage will be near nominal rates. Batched means tests are found to be overly conservative in this regard. A simple but pathological MCNP problem is presented as an example of false convergence using existing heuristics. The new methods readily detect the false convergence and show that the results of the problem, which are a factor of 4 too small, should not be used. Recommendations are made for applying these techniques in practice, using the statistical output currently produced by MCNP
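    One of the diagnostics named above, the relative variance of the variance, can be sketched with a simple moment-based estimator of the form commonly associated with MCNP-style statistical checks (the exact estimator used in the paper may differ):

```python
import numpy as np

def relative_variance_of_variance(x):
    """Moment-based estimate of the relative variance of the sample variance."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return (d**4).sum() / (d**2).sum() ** 2 - 1.0 / len(x)

rng = np.random.default_rng(11)
well_behaved = rng.normal(1.0, 0.1, size=10_000)
right_skewed = rng.pareto(1.8, size=10_000) + 1.0   # heavy right tail, like rare large tallies

print(relative_variance_of_variance(well_behaved))  # small: variance estimate is trustworthy
print(relative_variance_of_variance(right_skewed))  # much larger: interval coverage is suspect
```

    A large value signals that the estimate of the mean's variance is itself dominated by a few large tallies, which is exactly the situation in which the resulting confidence interval tends to under-cover.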

  8. Adaptive Sampling of Time Series During Remote Exploration

    Science.gov (United States)

    Thompson, David R.

    2012-01-01

This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models
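    A stripped-down version of the uncertainty-driven sampling idea can be sketched with an off-the-shelf Gaussian process: fit the GP to the datapoints collected so far and schedule the next measurement at the earliest future time at which the predictive uncertainty exceeds a tolerance. This uses a stationary scikit-learn kernel, an invented test signal and an arbitrary tolerance, so it illustrates the control loop rather than the nonstationary model described above.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(5)
signal = lambda t: np.sin(0.3 * t) + (t > 60.0) * np.exp(-0.1 * (t - 60.0))  # anomaly near t = 60

t_obs = [0.0, 2.0, 4.0]
y_obs = [signal(t) for t in t_obs]

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(1e-3), normalize_y=True)
    gp.fit(np.array(t_obs).reshape(-1, 1), np.array(y_obs))

    now = max(t_obs)
    candidates = np.linspace(now + 1.0, now + 20.0, 50).reshape(-1, 1)
    _, std = gp.predict(candidates, return_std=True)

    # Sample at the earliest future time where predictive uncertainty exceeds the budget
    exceed = np.nonzero(std > 0.2)[0]
    t_next = float(candidates[exceed[0], 0] if exceed.size else candidates[-1, 0])
    t_obs.append(t_next)
    y_obs.append(signal(t_next) + rng.normal(0.0, 0.05))

print(np.round(t_obs, 1))  # the chosen sampling times; spacing adapts to model confidence
```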

  9. Assessment of time interval between tramadol intake and seizure and second drug-induced attack

    Directory of Open Access Journals (Sweden)

    Bahareh Abbasi

    2015-11-01

Background: Tramadol is a synthetic drug prescribed for moderate and severe pain. Tramadol overdose can induce severe complications such as impaired consciousness and convulsions. This study was done to determine the incidence of convulsions after tramadol use, up to one week after hospital discharge. Methods: This prospective study was done in tramadol overdose patients without a history of uncontrolled epilepsy or head injury. All cases admitted to Loghman and Rasol Akram Hospitals, Tehran, Iran from 1 April 2011 to 1 April 2012 were included and observed for at least 12 hours. The time interval between tramadol intake and the first seizure was recorded. Then, patients with a second drug-induced seizure were identified and the time between the first and second seizures was analyzed. The patients were transferred to the intensive care unit (ICU) if clinical worsening was observed. One week after hospital discharge, a telephone follow-up was conducted. Results: A total of 150 patients with a history of tramadol-induced seizures (141 men, 9 women; age: 23.23±5.94 years) were enrolled in this study. Convulsion was seen in 104 patients (69.3%). In 8 out of 104 patients (7.6%), two or more convulsions were seen. The time intervals between tramadol use and the onset of the first and second seizures were 0.93±0.17 and 2.5±0.75 hours, respectively. Tramadol-induced seizures are more likely to occur in males and in patients with a history of drug abuse. Finally, one hundred and forty-nine patients (99.3%) were discharged in good condition, and only one patient died from tramadol overdose. Conclusion: The results of the study showed that tramadol-induced seizures most frequently occurred within the first 4 hours of tramadol intake. The chance of experiencing a second seizure exists in the susceptible population. Thus, 4 hours after drug intake is the best time for patients to be discharged from hospital.

  10. The Acute Effects of Interval-Type Exercise on Glycemic Control in Type 2 Diabetes Subjects: Importance of Interval Length. A Controlled, Counterbalanced, Crossover Study.

    Directory of Open Access Journals (Sweden)

    Ida Jakobsen

    Full Text Available Interval-type exercise is effective for improving glycemic control, but the optimal approach is unknown. The purpose of this study was to determine the importance of the interval length on changes in postprandial glycemic control following a single exercise bout. Twelve subjects with type 2 diabetes completed a cross-over study with three 1-hour interventions performed in a non-randomized but counter-balanced order: 1 Interval walking consisting of repeated cycles of 3 min slow (aiming for 54% of Peak oxygen consumption rate [VO2peak] and 3 min fast (aiming for 89% of VO2peak walking (IW3; 2 Interval walking consisting of repeated cycles of 1 min slow and 1 min fast walking (IW1 and 3 No walking (CON. The exercise interventions were matched with regards to walking speed, and VO2 and heart rate was assessed throughout all interventions. A 4-hour liquid mixed meal tolerance test commenced 30 min after each intervention, with blood samples taken regularly. IW3 and IW1 resulted in comparable mean VO2 and heart rates. Overall mean postprandial blood glucose levels were lower after IW3 compared to CON (10.3±3.0 vs. 11.1±3.3 mmol/L; P 0.05 for both. Conversely blood glucose levels at specific time points during the MMTT differed significantly following both IW3 and IW1 as compared to CON. Our findings support the previously found blood glucose lowering effect of IW3 and suggest that reducing the interval length, while keeping the walking speed and time spend on fast and slow walking constant, does not result in additional improvements.ClinicalTrials.gov NCT02257190.

  11. Delay-Dependent Stability Criterion for Bidirectional Associative Memory Neural Networks with Interval Time-Varying Delays

    Science.gov (United States)

    Park, Ju H.; Kwon, O. M.

In this letter, the global asymptotic stability of bidirectional associative memory (BAM) neural networks with delays is investigated. The delay is assumed to be time-varying and to belong to a given interval. A novel criterion for stability is presented based on the Lyapunov method. The criterion is expressed in terms of a linear matrix inequality (LMI), which can be solved easily by various optimization algorithms. Two numerical examples are given to illustrate the effectiveness of the new result.

  12. Optimal Data Interval for Estimating Advertising Response

    OpenAIRE

    Gerard J. Tellis; Philip Hans Franses

    2006-01-01

    The abundance of highly disaggregate data (e.g., at five-second intervals) raises the question of the optimal data interval to estimate advertising carryover. The literature assumes that (1) the optimal data interval is the interpurchase time, (2) too disaggregate data causes a disaggregation bias, and (3) recovery of true parameters requires assumption of the underlying advertising process. In contrast, we show that (1) the optimal data interval is what we call , (2) too disaggregate data do...

  13. The Influence of Pretreatment Characteristics and Radiotherapy Parameters on Time Interval to Development of Radiation-Associated Meningioma

    International Nuclear Information System (INIS)

    Paulino, Arnold C.; Ahmed, Irfan M.; Mai, Wei Y.; Teh, Bin S.

    2009-01-01

    Purpose: To identify pretreatment characteristics and radiotherapy parameters which may influence time interval to development of radiation-associated meningioma (RAM). Methods and Materials: A Medline/PUBMED search of articles dealing with RAM yielded 66 studies between 1981 and 2006. Factors analyzed included patient age and gender, type of initial tumor treated, radiotherapy (RT) dose and volume, and time interval from RT to development of RAM. Results: A total of 143 patients with a median age at RT of 12 years form the basis of this report. The most common initial tumors or conditions treated with RT were medulloblastoma (n = 27), pituitary adenoma (n = 20), acute lymphoblastic leukemia (n = 20), low-grade astrocytoma (n = 19), and tinea capitis (n = 14). In the 116 patients whose RT fields were known, 55 (47.4%) had a portion of the brain treated, whereas 32 (27.6%) and 29 (25.0%) had craniospinal and whole-brain fields. The median time from RT to develop a RAM or latent time (LT) was 19 years (range, 1-63 years). Male gender (p = 0.001), initial diagnosis of leukemia (p = 0.001), and use of whole brain or craniospinal field (p ≤ 0.0001) were associated with a shorter LT, whereas patients who received lower doses of RT had a longer LT (p < 0.0001). Conclusions: The latent time to develop a RAM was related to gender, initial tumor type, radiotherapy volume, and radiotherapy dose.

  14. Effect of the time interval between fusion and activation on epigenetic reprogramming and development of bovine somatic cell nuclear transfer embryos.

    Science.gov (United States)

    Liu, Jun; Wang, Yongsheng; Su, Jianmin; Wang, Lijun; Li, Ruizhe; Li, Qian; Wu, Yongyan; Hua, Song; Quan, Fusheng; Guo, Zekun; Zhang, Yong

    2013-04-01

    Previous studies have shown that the time interval between fusion and activation (FA interval) plays an important role in nuclear remodeling and in vitro development of somatic cell nuclear transfer (SCNT) embryos. However, the effects of the FA interval on the epigenetic reprogramming and in vivo developmental competence of SCNT embryos remain unknown. In the present study, the effects of different FA intervals (0 h, 2 h, and 4 h) on the epigenetic reprogramming and developmental competence of bovine SCNT embryos were assessed. The results demonstrated that H3 lysine 9 acetylation (H3K9ac) levels decreased rapidly after fusion in all three groups. H3K9ac was practically undetectable 2 h after fusion in the 2-h and 4-h FA interval groups. However, H3K9ac was still evidently detectable in the 0-h FA interval group. The H3K9ac levels increased 10 h after fusion in all three groups, but were higher in the 2-h and 4-h FA interval groups than in the 0-h FA interval group. The methylation levels of the satellite I region in day-7 blastocysts derived from the 2-h or 4-h FA interval groups were similar to those of in vitro fertilization blastocysts and significantly lower than those of the 0-h FA interval group. SCNT embryos derived from the 2-h FA interval group showed higher developmental competence than those from the 0-h and 4-h FA interval groups in terms of cleavage rate, blastocyst formation rate, apoptosis index, and pregnancy and calving rates. Hence, the FA interval is an important factor influencing the epigenetic reprogramming and developmental competence of bovine SCNT embryos.

  15. Time interval between maternal electrocardiogram and venous Doppler waves in normal pregnancy and preeclampsia: a pilot study.

    Science.gov (United States)

    Tomsin, K; Mesens, T; Molenberghs, G; Peeters, L; Gyselaers, W

    2012-12-01

    To evaluate the time interval between maternal electrocardiogram (ECG) and venous Doppler waves at different stages of uncomplicated pregnancy (UP) and in preeclampsia (PE). Cross-sectional pilot study in 40 uncomplicated singleton pregnancies, categorized in four groups of ten according to gestational age: 10 - 14 weeks (UP1), 18 - 23 weeks (UP2), 28 - 33 weeks (UP3) and ≥ 37 weeks (UP4) of gestation. A fifth group of ten women with PE was also included. A Doppler flow examination at the level of renal interlobar veins (RIV) and hepatic veins (HV) was performed according to a standard protocol, in association with a maternal ECG. The time interval between the ECG P-wave and the corresponding A-deflection of the venous Doppler waves was measured (PA), and expressed relative to the duration of the cardiac cycle (RR), and labeled PA/RR. In hepatic veins, the PA/RR is longer in UP 4 than in UP 1 (0.48 ± 0.15 versus 0.29 ± 0.09, p ≤ 0.001). When all UP groups were compared, the PA/RR increased gradually with gestational age. In PE, the HV PA/RR is shorter than in UP 3 (0.25 ± 0.09 versus 0.42 ± 0.14, p advanced gestational stages are consistent with known features of maternal cardiovascular adaptation. Shorter values in preeclampsia are consistent with maternal cardiovascular maladaptation mechanisms. Our pilot study invites more research of the relevance of the time interval between maternal ECG and venous Doppler waves as a new parameter for studying the gestational cardiovascular (patho)physiology of the maternal venous compartment by duplex sonography. © Georg Thieme Verlag KG Stuttgart · New York.

  16. OSL response bleaching of BeO samples, using fluorescent light and blue LEDs

    International Nuclear Information System (INIS)

    Groppo, Daniela Piai; Caldas, Linda V.E.

    2015-01-01

    The optically stimulated luminescence (OSL) is widely used as a dosimetric technique for many applications. In this work, the OSL response bleaching of BeO samples was studied. The samples were irradiated using a beta radiation source (90Sr+90Y); the bleaching treatments (fluorescent light and blue LEDs) were performed, and the results were compared. Various optical treatment time intervals were tested until reaching the complete bleaching of the OSL response. The best combination of the time interval and bleaching type was analyzed. (author)

  17. OSL response bleaching of BeO samples, using fluorescent light and blue LEDs

    International Nuclear Information System (INIS)

    Groppo, D P; Caldas, L V E

    2016-01-01

    The optically stimulated luminescence (OSL) is widely used as a dosimetric technique for many applications. In this work, the OSL response bleaching of BeO samples was studied. The samples were irradiated using a beta radiation source (90Sr+90Y); the bleaching treatments (fluorescent light and blue LEDs) were performed, and the results were compared. Various optical treatment time intervals were tested until reaching the complete bleaching of the OSL response. The best combination of the time interval and bleaching type was analyzed. (paper)

  18. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    Science.gov (United States)

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

    Development of reference intervals is difficult, time consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECTci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices and those deemed unsuitable after clinical evaluation were removed from the database. Reference intervals were partitioned based on the method of Harris and Boyd into three scenarios, combined gender, males and females and age and gender. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturer's analytical methods using method transference.

  19. Long QT interval in Turner syndrome: a high prevalence of LQTS gene mutations

    DEFF Research Database (Denmark)

    Trolle, Christian

    Objective: QT interval prolongation of unknown aetiology is common in Turner syndrome (TS). This study set out to explore the presence of known pathogenic long QT (LQT) mutations in TS and to examine the corrected QT interval (QTc) over time and relate the findings to the TS phenotype. Methods......QTc). The prevalence of mutations in genes related to Long QT syndrome (LQTS) was determined in females with TS and a QTc >432.0 milliseconds (ms). Echocardiographic assessment of aortic valve morphology, 24-hour blood pressures and blood samples were done. Results: The mean hQTc in females with TS (414.0±25.5 ms...

  20. Comparison of serum pools and oral fluid samples for detection of porcine circovirus type 2 by quantitative real-time PCR in finisher pigs

    DEFF Research Database (Denmark)

    Nielsen, Gitte Blach; Nielsen, Jens Peter; Haugegaard, John

    2018-01-01

    Porcine circovirus type 2 (PCV2) diagnostics in live pigs often involves pooled serum and/or oral fluid samples for group-level determination of viral load by quantitative real-time polymerase chain reaction (qPCR). The purpose of the study was to compare the PCV2 viral load determined by q......PCR of paired samples at the pen level of pools of sera (SP) from 4 to 5 pigs and the collective oral fluid (OF) from around 30 pigs corresponding to one rope put in the same pen. Pigs in pens of 2 finishing herds were sampled by cross-sectional (Herd 1) and cross-sectional with follow-up (Herd 2) study designs....... In Herd 1, 50 sample pairs consisting of SP from 4 to 5 pigs and OF from around 23 pigs were collected. In Herd 2, 65 sample pairs consisting of 4 (SP) and around 30 (OF) pigs were collected 4 times at 3-week intervals. A higher proportion of PCV2-positive pens (86% vs. 80% and 100% vs. 91%) and higher...

  1. Socioeconomic position and the primary care interval

    DEFF Research Database (Denmark)

    Vedsted, Anders

    2018-01-01

    to the easiness to interpret the symptoms of the underlying cancer. Methods. We conducted a population-based cohort study using survey data on time intervals linked at an individual level to routinely collected data on demographics from Danish registries. Using logistic regression we estimated the odds......Introduction. Diagnostic delays affect cancer survival negatively. Thus, the time interval from symptomatic presentation to a GP until referral to secondary care (i.e. primary care interval (PCI)), should be as short as possible. Lower socioeconomic position seems associated with poorer cancer...... younger than 45 years of age and older than 54 years of age had a longer primary care interval than patients aged ‘45-54’ years. No other associations for SEP characteristics were observed. The findings may imply that GPs are referring patients regardless of SEP, although some room for improvement prevails...

  2. Identification of continuous-time systems from samples of input ...

    Indian Academy of Sciences (India)

    Abstract. This paper presents an introductory survey of the methods that have been developed for identification of continuous-time systems from samples of input-output data. The two basic approaches may be described as (i) the indirect method, where first a discrete-time model is estimated from the sampled data and then ...

  3. Initial Systolic Time Interval (ISTI) as a Predictor of Intradialytic Hypotension (IDH)

    International Nuclear Information System (INIS)

    Biesheuvel, J D; Verdaasdonk, R M; Meijer, JH; Vervloet, M G

    2013-01-01

    In haemodialysis treatment the clearance and volume control by the kidneys of a patient are partially replaced by intermittent haemodialysis. Because this artificial process is performed on a limited time scale, unphysiological imbalances in the fluid compartments of the body occur that can lead to intradialytic hypotension (IDH). An IDH endangers the efficacy of the haemodialysis session and is associated with dismal clinical endpoints, including mortality. A diagnostic method that predicts the occurrence of these drops in blood pressure could facilitate timely measures for the prevention of IDH. The present study investigates whether the Initial Systolic Time Interval (ISTI) can provide such a diagnostic method. The ISTI is defined as the time difference between the R-peak in the electrocardiogram (ECG) and the C-wave in the impedance cardiogram (ICG) and is considered to be a non-invasive assessment of the time delay between the electrical and mechanical activity of the heart. This time delay has previously been found to depend on autonomic nervous function as well as preload of the heart. Therefore, it can be expected that ISTI may predict an imminent IDH caused by a low circulating blood volume. This ongoing observational clinical study investigates the relationship between changes in ISTI and subsequent drops in blood pressure during haemodialysis. A recording of a complicated dialysis session showed a significant correlation between a drop in blood pressure, a decrease in relative blood volume and a substantial increase in ISTI. An uncomplicated dialysis, in which a considerable amount of fluid was also removed, showed no such correlations. Both blood pressure and ISTI remained stable. In conclusion, the preliminary results of the present study show a substantial response of ISTI to haemodynamic instability, indicating an application in optimization and individualisation of the dialysis process.

  4. Decoupling of modeling and measuring interval in groundwater time series analysis based on response characteristics

    NARCIS (Netherlands)

    Berendrecht, W.L.; Heemink, A.W.; Geer, F.C. van; Gehrels, J.C.

    2003-01-01

    A state-space representation of the transfer function-noise (TFN) model allows the choice of a modeling (input) interval that is smaller than the measuring interval of the output variable. Since in geohydrological applications the interval of the available input series (precipitation excess) is

  5. Adjusted Wald Confidence Interval for a Difference of Binomial Proportions Based on Paired Data

    Science.gov (United States)

    Bonett, Douglas G.; Price, Robert M.

    2012-01-01

    Adjusted Wald intervals for binomial proportions in one-sample and two-sample designs have been shown to perform about as well as the best available methods. The adjusted Wald intervals are easy to compute and have been incorporated into introductory statistics courses. An adjusted Wald interval for paired binomial proportions is proposed here and…

  6. Modification of a two blood sample method used for measurement of GFR with 99mTc-DTPA.

    Science.gov (United States)

    Surma, Marian J; Płachcińska, Anna; Kuśmierek, Jacek

    2018-01-01

    Measurements of GFR may be performed with a slope/intercept method (S/I), using only two blood samples taken in strictly defined time points. The aim of the study was to modify this method in order to extend time intervals suitable for blood sampling. Modification was based on a variation of a Russel et al. model parameter, selection of time intervals suitable for blood sampling and assessment of uncertainty of calculated results. Archived values of GFR measurements of 169 patients with different renal function, from 5.5 to 179 mL/min, calculated with a multiple blood sample method were used. Concentrations of a radiopharmaceutical in consecutive minutes, from 60th to 190th after injection, were calculated theoretically, using archived parameters of biexponential functions describing a decrease in 99mTc-DTPA concentration in blood plasma with time. These values, together with injected activities, were treated as measurements and used for S/I clearance calculations. Next, values of S/I clearance were compared with the multiple blood sample method in order to calculate suitable values of exponent present in a Russel's model, for every combination of two blood sampling time points. A model was considered accurately fitted to measured values when SEE ≤ 3.6 mL/min. Assessments of uncertainty of obtained results were based on law of error superposition, taking into account mean square prediction error and also errors introduced by pipetting, time measurement and stochastic radioactive decay. The accepted criteria resulted in extension of time intervals suitable for blood sampling to: between 60 and 90 minutes after injection for the first sample and between 150 and 180 minutes for the second sample. Uncertainty of results was assessed as between 4 mL/min for GFR = 5-10 mL/min and 8 mL/min for GFR = 180 mL/min. Time intervals accepted for blood sampling fully satisfy nuclear medicine staff and ensure proper determination of GFR. Uncertainty of results is entirely
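
    The slope/intercept approach described above amounts to fitting a single exponential through the two plasma samples. The sketch below shows only that uncorrected one-compartment calculation; the Russel-type correction whose exponent the authors re-fit for each pair of sampling times is not reproduced here, and all numeric values are hypothetical.

```python
import math

def slope_intercept_gfr(dose, c1, c2, t1_min, t2_min):
    """Uncorrected one-compartment slope/intercept clearance from two plasma samples.

    dose           : injected activity (same activity units as the concentrations)
    c1, c2         : plasma concentrations (activity per mL) at t1_min and t2_min
    t1_min, t2_min : sampling times in minutes after injection (t1_min < t2_min)
    Returns clearance in mL/min.  A Russel/Brochner-Mortensen-type correction,
    not shown here, is normally applied on top of this value.
    """
    k = math.log(c1 / c2) / (t2_min - t1_min)   # elimination rate constant (1/min)
    c0 = c1 * math.exp(k * t1_min)              # back-extrapolated concentration at t = 0
    v = dose / c0                               # apparent distribution volume (mL)
    return k * v                                # slope/intercept clearance (mL/min)

# Hypothetical numbers for illustration only
cl_si = slope_intercept_gfr(dose=40.0, c1=0.012, c2=0.0075, t1_min=75, t2_min=165)
print(f"uncorrected slope/intercept clearance: {cl_si:.1f} mL/min")
```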

  7. Adrenal Hormones in Common Bottlenose Dolphins (Tursiops truncatus): Influential Factors and Reference Intervals.

    Directory of Open Access Journals (Sweden)

    Leslie B Hart

    Full Text Available Inshore common bottlenose dolphins (Tursiops truncatus) are exposed to a broad spectrum of natural and anthropogenic stressors. In response to these stressors, the mammalian adrenal gland releases hormones such as cortisol and aldosterone to maintain physiological and biochemical homeostasis. Consequently, adrenal gland dysfunction results in disruption of hormone secretion and an inappropriate stress response. Our objective herein was to develop diagnostic reference intervals (RIs) for adrenal hormones commonly associated with the stress response (i.e., cortisol, aldosterone) that account for the influence of intrinsic (e.g., age, sex) and extrinsic (e.g., time) factors. Ultimately, these reference intervals will be used to gauge an individual's response to chase-capture stress and could indicate adrenal abnormalities. Linear mixed models (LMMs) were used to evaluate demographic and sampling factors contributing to differences in serum cortisol and aldosterone concentrations among bottlenose dolphins sampled in Sarasota Bay, Florida, USA (2000-2012). Serum cortisol concentrations were significantly associated with elapsed time from initial stimulation to sample collection (p<0.05), and RIs were constructed using nonparametric methods based on elapsed sampling time for dolphins sampled in less than 30 minutes following net deployment (95% RI: 0.91-4.21 µg/dL) and following biological sampling aboard a research vessel (95% RI: 2.32-6.68 µg/dL). To examine the applicability of the pre-sampling cortisol RI across multiple estuarine stocks, data from three additional southeast U.S. sites were compared, revealing that all of the dolphins sampled from the other sites (N = 34) had cortisol concentrations within the 95th percentile RI. Significant associations between serum concentrations of aldosterone and variables reported in previous studies (i.e., age, elapsed sampling time) were not observed in the current project (p<0.05). Also, approximately 16% of

  8. Adrenal Hormones in Common Bottlenose Dolphins (Tursiops truncatus): Influential Factors and Reference Intervals.

    Science.gov (United States)

    Hart, Leslie B; Wells, Randall S; Kellar, Nick; Balmer, Brian C; Hohn, Aleta A; Lamb, Stephen V; Rowles, Teri; Zolman, Eric S; Schwacke, Lori H

    2015-01-01

    Inshore common bottlenose dolphins (Tursiops truncatus) are exposed to a broad spectrum of natural and anthropogenic stressors. In response to these stressors, the mammalian adrenal gland releases hormones such as cortisol and aldosterone to maintain physiological and biochemical homeostasis. Consequently, adrenal gland dysfunction results in disruption of hormone secretion and an inappropriate stress response. Our objective herein was to develop diagnostic reference intervals (RIs) for adrenal hormones commonly associated with the stress response (i.e., cortisol, aldosterone) that account for the influence of intrinsic (e.g., age, sex) and extrinsic (e.g., time) factors. Ultimately, these reference intervals will be used to gauge an individual's response to chase-capture stress and could indicate adrenal abnormalities. Linear mixed models (LMMs) were used to evaluate demographic and sampling factors contributing to differences in serum cortisol and aldosterone concentrations among bottlenose dolphins sampled in Sarasota Bay, Florida, USA (2000-2012). Serum cortisol concentrations were significantly associated with elapsed time from initial stimulation to sample collection (p<0.05), and RIs were constructed using nonparametric methods based on elapsed sampling time for dolphins sampled in less than 30 minutes following net deployment (95% RI: 0.91-4.21 µg/dL) and following biological sampling aboard a research vessel (95% RI: 2.32-6.68 µg/dL). To examine the applicability of the pre-sampling cortisol RI across multiple estuarine stocks, data from three additional southeast U.S. sites were compared, revealing that all of the dolphins sampled from the other sites (N = 34) had cortisol concentrations within the 95th percentile RI. Significant associations between serum concentrations of aldosterone and variables reported in previous studies (i.e., age, elapsed sampling time) were not observed in the current project (p<0.05). Also, approximately 16% of Sarasota Bay

  9. Conversion rate of laparoscopic cholecystectomy after endoscopic retrograde cholangiography in the treatment of choledocholithiasis - Does the time interval matter?

    NARCIS (Netherlands)

    de Vries, A.; Donkervoort, S. C.; van Geloven, A. A. W.; Pierik, E. G. J. M.

    2005-01-01

    Background: Preceding endoscopic retrograde cholangiography (ERC) in patients with choledochocystolithiasis impedes laparoscopic cholecystectomy (LC) and increases risk of conversion. We studied the influence of time interval between ERC and LC on the course of LC. Methods: All patients treated for

  10. Time without clocks - an attempt

    International Nuclear Information System (INIS)

    Karpman, G.

    1978-01-01

    A definition of time intervals separating two states of systems of elementary particles and observers is attempted. The definition is founded on the notion of instant state of the system and uses no information connected with the use of a clock. Applying the definition to a classical clock and to a sample of unstable particles, results are obtained in agreement with experiment. However, if the system contains 'few' elementary particles, the properties of the time interval present some different features. (author)

  11. Algae viability over time in a ballast water sample

    Science.gov (United States)

    Gollasch, Stephan; David, Matej

    2018-03-01

    The biology of vessels' ballast water needs to be analysed for several reasons, one of these being performance tests of ballast water management systems. This analysis includes a viability assessment of phytoplankton. To overcome the logistical problems of getting algae sample processing gear on board a vessel to document algae viability, samples may be transported to land-based laboratories. Concerns were raised about how the storage conditions of the sample may impact algae viability over time and what the most appropriate storage conditions are. Here we answer these questions with a long-term algae viability study with daily sample analysis using Pulse-Amplitude Modulated (PAM) fluorometry. The sample was analysed over 79 days. We tested different storage conditions: fridge and room temperature with and without light. It seems that during the first two weeks of the experiment the viability remains almost unchanged with a slight downwards trend. In the continuing period, before the sample was split, a slightly stronger downwards viability trend was observed, which occurred at a similar rate towards the end of the experiment. After the sample was split, the strongest viability reduction was measured for the sample stored without light at room temperature. We concluded that the storage conditions, especially regarding temperature and light exposure, have a stronger impact on algae viability compared to the storage duration and that inappropriate storage conditions reduce algal viability. A sample storage time of up to two weeks in a dark and cool environment has little influence on the organism viability. This indicates that a two-week duration between sample taking on board a vessel and the viability measurement in a land-based laboratory may not be very critical.

  12. A modified Wald interval for the area under the ROC curve (AUC) in diagnostic case-control studies.

    Science.gov (United States)

    Kottas, Martina; Kuss, Oliver; Zapf, Antonia

    2014-02-19

    The area under the receiver operating characteristic (ROC) curve, referred to as the AUC, is an appropriate measure for describing the overall accuracy of a diagnostic test or a biomarker in early phase trials without having to choose a threshold. There are many approaches for estimating the confidence interval for the AUC. However, all are relatively complicated to implement. Furthermore, many approaches perform poorly for large AUC values or small sample sizes. The AUC is actually a probability. So we propose a modified Wald interval for a single proportion, which can be calculated on a pocket calculator. We performed a simulation study to compare this modified Wald interval (without and with continuity correction) with other intervals regarding coverage probability and statistical power. The main result is that the proposed modified Wald intervals maintain and exploit the type I error much better than the intervals of Agresti-Coull, Wilson, and Clopper-Pearson. The interval suggested by Bamber, the Mann-Whitney interval without transformation and also the interval of the binormal AUC are very liberal. For small sample sizes the Wald interval with continuity has a comparable coverage probability as the LT interval and higher power. For large sample sizes the results of the LT interval and of the Wald interval without continuity correction are comparable. If individual patient data is not available, but only the estimated AUC and the total sample size, the modified Wald intervals can be recommended as confidence intervals for the AUC. For small sample sizes the continuity correction should be used.
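
    Because the AUC is itself a probability, the proposed interval can be computed like a Wald interval for a proportion. The sketch below illustrates that idea only: the choice of effective sample size n and the form of the continuity correction are assumptions made here for illustration, not the authors' exact published modification.

```python
from statistics import NormalDist

def wald_auc_ci(auc, n, alpha=0.05, continuity=False):
    """Illustrative Wald-type confidence interval for an AUC treated as a proportion.

    auc : estimated area under the ROC curve
    n   : sample size used as the effective number of 'trials' (an assumption here;
          the published method defines its own effective n and correction)
    """
    z = NormalDist().inv_cdf(1 - alpha / 2)
    se = (auc * (1 - auc) / n) ** 0.5
    cc = 1 / (2 * n) if continuity else 0.0   # optional continuity correction
    lower = max(0.0, auc - z * se - cc)
    upper = min(1.0, auc + z * se + cc)
    return lower, upper

# Hypothetical values: AUC = 0.90 estimated from 60 subjects
print(wald_auc_ci(0.90, n=60))
```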

  13. On the sample transport time of a pneumatic transfer system

    International Nuclear Information System (INIS)

    Kondo, Yoshihide

    1983-01-01

    The counts accumulated in the measuring system are affected by variations in the transport time of the sample in cyclic activation experiments with a mechanical sample transfer system. When using the pneumatic transfer system that has been set up, the transport time varies according to differences such as the form, size and weight of the samples, the pneumatic pressure and so on. Comprehending the relationships between the transport time and these variable factors is essentially important for making experiments with this transfer system. (author)

  14. Confidence intervals for distinguishing ordinal and disordinal interactions in multiple regression.

    Science.gov (United States)

    Lee, Sunbok; Lei, Man-Kit; Brody, Gene H

    2015-06-01

    Distinguishing between ordinal and disordinal interaction in multiple regression is useful in testing many interesting theoretical hypotheses. Because the distinction is made based on the location of a crossover point of 2 simple regression lines, confidence intervals of the crossover point can be used to distinguish ordinal and disordinal interactions. This study examined 2 factors that need to be considered in constructing confidence intervals of the crossover point: (a) the assumption about the sampling distribution of the crossover point, and (b) the possibility of abnormally wide confidence intervals for the crossover point. A Monte Carlo simulation study was conducted to compare 6 different methods for constructing confidence intervals of the crossover point in terms of the coverage rate, the proportion of true values that fall to the left or right of the confidence intervals, and the average width of the confidence intervals. The methods include the reparameterization, delta, Fieller, basic bootstrap, percentile bootstrap, and bias-corrected accelerated bootstrap methods. The results of our Monte Carlo simulation study suggest that statistical inference using confidence intervals to distinguish ordinal and disordinal interaction requires sample sizes more than 500 to be able to provide sufficiently narrow confidence intervals to identify the location of the crossover point. (c) 2015 APA, all rights reserved).
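
    For a regression of the form y = b0 + b1*x + b2*m + b3*x*m, the simple regression lines implied by any two values of the moderator m cross at x = -b2/b3, so the interaction is disordinal within the observed range of x only if that point falls inside it. The sketch below computes the crossover point and a percentile-bootstrap interval for it (one of the six methods compared); the data, a binary moderator, and all parameter values are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y = 1 + 0.5*x + 0.8*m - 0.4*x*m + noise,
# so the simple slopes cross at x* = -b2/b3 = -0.8 / (-0.4) = 2.0
n = 400
x = rng.normal(size=n)
m = rng.integers(0, 2, size=n).astype(float)        # binary moderator
y = 1 + 0.5 * x + 0.8 * m - 0.4 * x * m + rng.normal(scale=1.0, size=n)

def crossover(x, m, y):
    """OLS fit of y on (1, x, m, x*m); returns the crossover point -b2/b3."""
    X = np.column_stack([np.ones_like(x), x, m, x * m])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return -b[2] / b[3]

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)                 # resample cases with replacement
    boot.append(crossover(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"crossover: {crossover(x, m, y):.2f}  95% percentile CI: ({lo:.2f}, {hi:.2f})")
```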

  15. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, The National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', industry-defined as the valve opening at greater than or equal to 1.5 times the cold set pressure. Actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging sufficiently to estimate risk when basing proof test intervals on proof test results?

  16. Discussion of “Prediction intervals for short-term wind farm generation forecasts” and “Combined nonparametric prediction intervals for wind power generation”

    DEFF Research Database (Denmark)

    Pinson, Pierre; Tastu, Julija

    2014-01-01

    A new score for the evaluation of interval forecasts, the so-called coverage width-based criterion (CWC), was proposed and utilized. This score has been used for the tuning (in-sample) and genuine evaluation (out-of-sample) of prediction intervals for various applications, e.g., electric load [1], electricity prices [2], general purpose prediction [3], and wind power generation [4], [5]. Indeed, two papers by the same authors appearing in the IEEE Transactions on Sustainable Energy employ that score and use it to conclude on the comparative quality of alternative approaches to interval forecasting...

  17. An efficient sampling approach for variance-based sensitivity analysis based on the law of total variance in the successive intervals without overlapping

    Science.gov (United States)

    Yun, Wanying; Lu, Zhenzhou; Jiang, Xian

    2018-06-01

    To efficiently execute the variance-based global sensitivity analysis, the law of total variance in the successive intervals without overlapping is proved at first, on which an efficient space-partition sampling-based approach is subsequently proposed in this paper. Through partitioning the sample points of output into different subsets according to different inputs, the proposed approach can efficiently evaluate all the main effects concurrently by one group of sample points. In addition, there is no need for optimizing the partition scheme in the proposed approach. The maximum length of subintervals is decreased by increasing the number of sample points of model input variables in the proposed approach, which guarantees the convergence condition of the space-partition approach well. Furthermore, a new interpretation on the thought of partition is illuminated from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
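
    A minimal sketch of the underlying idea: with a single set of samples, the first-order (main-effect) index S_i = Var(E[Y|X_i]) / Var(Y) can be estimated by partitioning the sample points according to non-overlapping intervals of X_i and taking the between-interval variance of the bin means. The equal-probability partition used here is an assumption for illustration; the paper's partition scheme and its convergence argument are more refined.

```python
import numpy as np

def first_order_indices(X, y, n_bins=30):
    """Binning estimator of first-order sensitivity indices from one common sample.

    Uses the law of total variance: S_i = Var(E[Y | X_i]) / Var(Y), with the
    conditional expectation approximated by bin means over non-overlapping
    intervals of X_i (equal-probability bins assumed here for illustration).
    """
    n, d = X.shape
    var_y = y.var()
    y_bar = y.mean()
    indices = []
    for i in range(d):
        edges = np.quantile(X[:, i], np.linspace(0, 1, n_bins + 1))
        which = np.clip(np.searchsorted(edges, X[:, i], side="right") - 1, 0, n_bins - 1)
        between = 0.0
        for k in range(n_bins):
            mask = which == k
            if mask.any():
                between += mask.mean() * (y[mask].mean() - y_bar) ** 2
        indices.append(between / var_y)
    return np.array(indices)

# Analytic check: Y = X1 + 0.5*X2 with independent U(0,1) inputs gives S = (0.8, 0.2)
rng = np.random.default_rng(1)
X = rng.uniform(size=(20000, 2))
y = X[:, 0] + 0.5 * X[:, 1]
print(first_order_indices(X, y).round(3))
```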

  18. Multichannel interval timer

    International Nuclear Information System (INIS)

    Turko, B.T.

    1983-10-01

    A CAMAC based modular multichannel interval timer is described. The timer comprises twelve high resolution time digitizers with a common start enabling twelve independent stop inputs. Ten time ranges from 2.5 μs to 1.3 μs can be preset. Time can be read out in twelve 24-bit words either via CAMAC Crate Controller or an external FIFO register. LSB time calibration is 78.125 ps. An additional word reads out the operational status of twelve stop channels. The system consists of two modules. The analog module contains a reference clock and 13 analog time stretchers. The digital module contains counters, logic and interface circuits. The timer has an excellent differential linearity, thermal stability and crosstalk free performance

  19. Adaptive control of theophylline therapy: importance of blood sampling times.

    Science.gov (United States)

    D'Argenio, D Z; Khakmahd, K

    1983-10-01

    A two-observation protocol for estimating theophylline clearance during a constant-rate intravenous infusion is used to examine the importance of blood sampling schedules with regard to the information content of resulting concentration data. Guided by a theory for calculating maximally informative sample times, population simulations are used to assess the effect of specific sampling times on the precision of resulting clearance estimates and subsequent predictions of theophylline plasma concentrations. The simulations incorporated noise terms for intersubject variability, dosing errors, sample collection errors, and assay error. Clearance was estimated using Chiou's method, least squares, and a Bayesian estimation procedure. The results of these simulations suggest that clinically significant estimation and prediction errors may result when using the above two-point protocol for estimating theophylline clearance if the time separating the two blood samples is less than one population mean elimination half-life.
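
    Of the three estimators compared, Chiou's method has a closed form. The sketch below uses the commonly cited two-sample equation for clearance during a constant-rate infusion together with an assumed population distribution volume (roughly 0.5 L/kg for theophylline); the exact form and units should be checked against the original method, and the patient numbers are hypothetical.

```python
def chiou_clearance(r0_mg_per_h, c1, c2, t1_h, t2_h, vd_l):
    """Two-sample (Chiou) clearance estimate during a constant-rate IV infusion.

    Commonly cited form (a sketch; consult the original method for assumptions):
        CL = 2*R0/(C1+C2) + 2*Vd*(C1-C2)/((C1+C2)*(t2-t1))
    r0_mg_per_h : infusion rate (mg/h)
    c1, c2      : plasma concentrations (mg/L) at times t1_h < t2_h (hours)
    vd_l        : assumed distribution volume (L), e.g. about 0.5 L/kg for theophylline
    """
    return (2 * r0_mg_per_h / (c1 + c2)
            + 2 * vd_l * (c1 - c2) / ((c1 + c2) * (t2_h - t1_h)))

# Hypothetical 70-kg patient: 40 mg/h infusion, samples drawn at 2 h and 10 h
print(f"CL \u2248 {chiou_clearance(40.0, 8.0, 11.0, 2.0, 10.0, vd_l=35.0):.2f} L/h")
```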

  20. The effect of chorionicity and twin-to-twin delivery time interval on short-term outcome of the second twin

    DEFF Research Database (Denmark)

    Hjortø, Sofie; Nickelsen, Carsten; Petersen, Janne

    2013-01-01

    Abstract Objectives: To investigate the effect of chorionicity and twin-to-twin delivery time interval on short-term outcome in the second twin. Additionally, to investigate predictors of adverse outcome in both twins. Methods: Data included vaginally delivered twins (≥ 36 weeks) from Copenhagen ...

  1. INTERVAL OBSERVER FOR A BIOLOGICAL REACTOR MODEL

    Directory of Open Access Journals (Sweden)

    T. A. Kharkovskaia

    2014-05-01

    Full Text Available The method of an interval observer design for nonlinear systems with parametric uncertainties is considered. The interval observer synthesis problem for systems with varying parameters consists in the following. If there is the uncertainty restraint for the state values of the system, limiting the initial conditions of the system and the set of admissible values for the vector of unknown parameters and inputs, the interval existence condition for the estimations of the system state variables, containing the actual state at a given time, needs to be held valid over the whole considered time segment as well. Conditions of the interval observers design for the considered class of systems are shown. They are: limitation of the input and state, the existence of a majorizing function defining the uncertainty vector for the system, Lipschitz continuity or finiteness of this function, the existence of an observer gain with the suitable Lyapunov matrix. The main condition for design of such a device is cooperativity of the interval estimation error dynamics. An individual observer gain matrix selection problem is considered. In order to ensure the property of cooperativity for interval estimation error dynamics, a static transformation of coordinates is proposed. The proposed algorithm is demonstrated by computer modeling of the biological reactor. Possible applications of these interval estimation systems are the spheres of robust control, where the presence of various types of uncertainties in the system dynamics is assumed, biotechnology and environmental systems and processes, mechatronics and robotics, etc.

  2. A delay-dependent approach to robust control for neutral uncertain neural networks with mixed interval time-varying delays

    International Nuclear Information System (INIS)

    Lu, Chien-Yu

    2011-01-01

    This paper considers the problem of delay-dependent global robust stabilization for discrete, distributed and neutral interval time-varying delayed neural networks described by nonlinear delay differential equations of the neutral type. The parameter uncertainties are norm bounded. The activation functions are assumed to be bounded and globally Lipschitz continuous. Using a Lyapunov functional approach and linear matrix inequality (LMI) techniques, the stability criteria for the uncertain neutral neural networks with interval time-varying delays are established in the form of LMIs, which can be readily verified using the standard numerical software. An important feature of the result reported is that all the stability conditions are dependent on the upper and lower bounds of the delays. Another feature of the results lies in the fact that the approach involves fewer free weighting matrices, and upper bounds on the inner product between two vectors are not introduced to reduce the conservatism of the criteria. Two illustrative examples are provided to demonstrate the effectiveness and the reduced conservatism of the proposed method

  3. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
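
    As a reminder of the regularly sampled baseline, the sketch below symbolizes a series into Bandt-Pompe ordinal patterns of order d and counts how many of the d! possible patterns never occur; a deterministic map leaves several patterns forbidden, whereas i.i.d. noise of the same length typically does not. The irregular-sampling experiments in the paper build on this count; the logistic-map example and pattern order here are chosen only for illustration.

```python
from itertools import permutations
import numpy as np

def ordinal_patterns(x, d=4):
    """Symbolize a series into Bandt-Pompe ordinal patterns of order d."""
    x = np.asarray(x)
    return [tuple(np.argsort(x[i:i + d]).tolist()) for i in range(len(x) - d + 1)]

def forbidden_pattern_count(x, d=4):
    """Number of the d! possible patterns that never occur in the series."""
    observed = set(ordinal_patterns(x, d))
    return sum(1 for p in permutations(range(d)) if p not in observed)

rng = np.random.default_rng(2)
logistic = [0.4]
for _ in range(5000):                              # deterministic logistic map, r = 4
    logistic.append(4 * logistic[-1] * (1 - logistic[-1]))
noise = rng.uniform(size=5001)                     # i.i.d. noise: no true forbidden patterns

print("logistic map :", forbidden_pattern_count(logistic), "of 24 patterns forbidden")
print("white noise  :", forbidden_pattern_count(noise), "of 24 patterns forbidden")
```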

  4. High-intensity interval training: Modulating interval duration in overweight/obese men.

    Science.gov (United States)

    Smith-Ryan, Abbie E; Melvin, Malia N; Wingfield, Hailee L

    2015-05-01

    High-intensity interval training (HIIT) is a time-efficient strategy shown to induce various cardiovascular and metabolic adaptations. Little is known about the optimal tolerable combination of intensity and volume necessary for adaptations, especially in clinical populations. In a randomized controlled pilot design, we evaluated the effects of two types of interval training protocols, varying in intensity and interval duration, on clinical outcomes in overweight/obese men. Twenty-five men [body mass index (BMI) > 25 kg·m⁻²] completed baseline body composition measures: fat mass (FM), lean mass (LM) and percent body fat (%BF) and fasting blood glucose, lipids and insulin (IN). A graded exercise cycling test was completed for peak oxygen consumption (VO2peak) and power output (PO). Participants were randomly assigned to high-intensity short interval (1MIN-HIIT), high-intensity interval (2MIN-HIIT) or control groups. 1MIN-HIIT and 2MIN-HIIT completed 3 weeks of cycling interval training, 3 days/week, consisting of either 10 × 1 min bouts at 90% PO with 1 min rests (1MIN-HIIT) or 5 × 2 min bouts with 1 min rests at undulating intensities (80%-100%) (2MIN-HIIT). There were no significant training effects on FM (Δ1.06 ± 1.25 kg) or %BF (Δ1.13% ± 1.88%), compared to CON. Increases in LM were not significant but increased by 1.7 kg and 2.1 kg for 1MIN and 2MIN-HIIT groups, respectively. Increases in VO2peak were also not significant for 1MIN (3.4 ml·kg⁻¹·min⁻¹) or 2MIN groups (2.7 ml·kg⁻¹·min⁻¹). IN sensitivity (HOMA-IR) improved for both training groups (Δ-2.78 ± 3.48 units; p < 0.05) compared to CON. HIIT may be an effective short-term strategy to improve cardiorespiratory fitness and IN sensitivity in overweight males.

  5. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
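
    For statistics that are monotone in each observation, such as the mean and the median, the tightest possible bounds under interval uncertainty are obtained simply by evaluating the statistic at all lower endpoints and at all upper endpoints; bounds for non-monotone statistics such as the variance are harder and, in general, computationally expensive, which is part of what the report analyzes. A minimal sketch with hypothetical data:

```python
import statistics

def interval_mean(intervals):
    """Tight bounds on the sample mean when each observation is an interval [lo, hi]."""
    lows, highs = zip(*intervals)
    return statistics.mean(lows), statistics.mean(highs)

def interval_median(intervals):
    """Tight bounds on the sample median (monotone in each endpoint)."""
    lows, highs = zip(*intervals)
    return statistics.median(lows), statistics.median(highs)

# Hypothetical measurements known only to within the instrument resolution
data = [(1.2, 1.4), (0.9, 1.1), (1.5, 1.9), (1.0, 1.2), (1.3, 1.5)]
print("mean bounds  :", interval_mean(data))
print("median bounds:", interval_median(data))
```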

  6. Interval Size and Affect: An Ethnomusicological Perspective

    Directory of Open Access Journals (Sweden)

    Sarha Moore

    2013-08-01

    Full Text Available This commentary addresses Huron and Davis's question of whether "The Harmonic Minor Provides an Optimum Way of Reducing Average Melodic Interval Size, Consistent with Sad Affect Cues" within any non-Western musical cultures. The harmonic minor scale and other semitone-heavy scales, such as Bhairav raga and Hicaz makam, are featured widely in the musical cultures of North India and the Middle East. Do melodies from these genres also have a preponderance of semitone intervals and low incidence of the augmented second interval, as in Huron and Davis's sample? Does the presence of more semitone intervals in a melody affect its emotional connotations in different cultural settings? Are all semitone intervals equal in their effect? My own ethnographic research within these cultures reveals comparable connotations in melodies that linger on semitone intervals, centered on concepts of tension and metaphors of falling. However, across different musical cultures there may also be neutral or lively interpretations of these same pitch sets, dependent on context, manner of performance, and tradition. Small pitch movement may also be associated with social functions such as prayer or lullabies, and may not be described as "sad." "Sad," moreover may not connote the same affect cross-culturally.

  7. Extension of a chaos control method to unstable trajectories on infinite- or finite-time intervals: Experimental verification

    International Nuclear Information System (INIS)

    Yagasaki, Kazuyuki

    2007-01-01

    In experiments for single and coupled pendula, we demonstrate the effectiveness of a new control method based on dynamical systems theory for stabilizing unstable aperiodic trajectories defined on infinite- or finite-time intervals. The basic idea of the method is similar to that of the OGY method, which is a well-known, chaos control method. Extended concepts of the stable and unstable manifolds of hyperbolic trajectories are used here

  8. Novel global robust stability criteria for interval neural networks with multiple time-varying delays

    International Nuclear Information System (INIS)

    Xu Shengyuan; Lam, James; Ho, Daniel W.C.

    2005-01-01

    This Letter is concerned with the problem of robust stability analysis for interval neural networks with multiple time-varying delays and parameter uncertainties. The parameter uncertainties are assumed to be bounded in given compact sets and the activation functions are supposed to be bounded and globally Lipschitz continuous. A sufficient condition is obtained by means of Lyapunov functionals, which guarantees the existence, uniqueness and global asymptotic stability of the delayed neural network for all admissible uncertainties. This condition is in terms of a linear matrix inequality (LMI), which can be easily checked by using recently developed algorithms in solving LMIs. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed method

  9. Differentially Private Confidence Intervals for Empirical Risk Minimization

    OpenAIRE

    Wang, Yue; Kifer, Daniel; Lee, Jaewoo

    2018-01-01

    The process of data mining with differential privacy produces results that are affected by two types of noise: sampling noise due to data collection and privacy noise that is designed to prevent the reconstruction of sensitive information. In this paper, we consider the problem of designing confidence intervals for the parameters of a variety of differentially private machine learning models. The algorithms can provide confidence intervals that satisfy differential privacy (as well as the mor...

  10. Method for Hot Real-Time Sampling of Gasification Products

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-29

    The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot scale plant capable of demonstrating industrially relevant thermochemical technologies from lignocellulosic biomass conversion, including gasification. Gasification creates primarily Syngas (a mixture of Hydrogen and Carbon Monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass derived gasification products are a very complex mixture of chemical components that typically contain Sulfur and Nitrogen species that can act as catalysis poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and Gas Chromatographs with Sulfur and Nitrogen specific detectors can provide real-time analysis providing operational indicators for performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Sample line Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistries. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products. They include minimizing sampling distance, effective filtering as close to source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.

  11. Influence Of Inspection Intervals On Mechanical System Reliability

    International Nuclear Information System (INIS)

    Zilberman, B.

    1998-01-01

    In this paper a methodology for the reliability analysis of mechanical systems with latent failures is described. Reliability analysis of such systems must include appropriate usage of check intervals for latent failure detection. The methodology suggests that, based on system logic, the analyst decides at the beginning whether a system can fail actively or latently and propagates this approach through all system levels. All inspections are assumed to be perfect (all failures are detected and repaired, and no new failures are introduced as a result of the maintenance). Additional assumptions are that the mission time is much smaller than the check intervals and that all components have constant failure rates. Analytical expressions for reliability calculations are provided, based on fault tree and Markov modeling techniques (for two- and three-redundant systems with inspection intervals). The proposed methodology yields more accurate results than are obtained by not using check intervals or by using half the check interval times. The conventional analysis, which assumes that at the beginning of each mission the system is as new, gives an optimistic prediction of system reliability. Some examples of reliability calculations of mechanical systems with latent failures and establishing optimum check intervals are provided.
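
    The effect of the check interval can be illustrated with the standard constant-failure-rate result for a single latent (unrevealed) failure under perfect periodic inspection; the fault-tree and Markov expressions in the paper refine this for redundant configurations. The failure rate used below is hypothetical.

```python
import math

def mean_unavailability(failure_rate_per_h, inspection_interval_h):
    """Average probability that a periodically inspected component sits in a latent
    failed state, assuming a constant failure rate and perfect inspections/repairs.

        q_avg = 1 - (1 - exp(-lambda*T)) / (lambda*T)  ~  lambda*T / 2  for small lambda*T
    """
    lt = failure_rate_per_h * inspection_interval_h
    return 1.0 - (1.0 - math.exp(-lt)) / lt

lam = 1.0e-5                                       # hypothetical failure rate, per hour
for t_months in (3, 6, 12, 24):
    T = t_months * 730.0                           # approximate hours per month
    print(f"T = {t_months:2d} months -> mean unavailability = {mean_unavailability(lam, T):.4f}")
```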

  12. Profile-likelihood Confidence Intervals in Item Response Theory Models.

    Science.gov (United States)

    Chalmers, R Philip; Pek, Jolynn; Liu, Yang

    2017-01-01

    Confidence intervals (CIs) are fundamental inferential devices which quantify the sampling variability of parameter estimates. In item response theory, CIs have been primarily obtained from large-sample Wald-type approaches based on standard error estimates, derived from the observed or expected information matrix, after parameters have been estimated via maximum likelihood. An alternative approach to constructing CIs is to quantify sampling variability directly from the likelihood function with a technique known as profile-likelihood confidence intervals (PL CIs). In this article, we introduce PL CIs for item response theory models, compare PL CIs to classical large-sample Wald-type CIs, and demonstrate important distinctions among these CIs. CIs are then constructed for parameters directly estimated in the specified model and for transformed parameters which are often obtained post-estimation. Monte Carlo simulation results suggest that PL CIs perform consistently better than Wald-type CIs for both non-transformed and transformed parameters.
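
    The construction can be illustrated on the simplest possible case: inverting the likelihood-ratio statistic with the chi-square(1) cutoff. In a one-parameter model the profile likelihood coincides with the ordinary likelihood, but the same cutoff is applied to each profiled parameter in larger (e.g., IRT) models, where the other parameters are re-maximized at every candidate value. The binomial example and its data are hypothetical.

```python
import math

def loglik(p, k, n):
    """Binomial log-likelihood (constant terms dropped)."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

def profile_likelihood_ci(k, n, crit=3.841458820694124 / 2):
    """Likelihood-ratio (profile-likelihood) CI for a binomial proportion.

    The interval is the set of p with loglik(p) >= loglik(p_hat) - chi2_{1,0.95}/2.
    """
    p_hat = k / n
    target = loglik(p_hat, k, n) - crit

    def solve(lo, hi):
        # Bisection on loglik(p) - target; 'lo' is kept below target, 'hi' above it.
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if loglik(mid, k, n) < target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    lower = solve(1e-9, p_hat)
    upper = solve(1 - 1e-9, p_hat)
    return lower, upper

print(profile_likelihood_ci(k=7, n=20))            # hypothetical data: 7 successes in 20 trials
```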

  13. Sampling of ore

    International Nuclear Information System (INIS)

    Boehme, R.C.; Nicholas, B.L.

    1987-01-01

    This invention relates to a method of and apparatus for ore sampling. The method includes the steps of periodically removing a sample of the output material of a sorting machine, weighing each sample so that each is of the same weight, measuring a characteristic such as the radioactivity, magnetivity or the like of each sample, subjecting at least an equal portion of each sample to chemical analysis to determine the mineral content of the sample, and comparing the characteristic measurement with the desired mineral content of the chemically analysed portion of the sample to determine the characteristic/mineral ratio of the sample. The apparatus includes an ore sample collector, a deflector for deflecting a sample of ore particles from the output of an ore sorter into the collector, and means for moving the deflector from a first position, in which it is clear of the particle path from the sorter, to a second position, in which it is in the particle path, at predetermined time intervals and for predetermined time periods to deflect the sample particles into the collector. The apparatus conveniently includes an ore crusher for comminuting the sample particles, a sample hopper, means for weighing the hopper, a detector in the hopper for measuring a characteristic such as radioactivity, magnetivity or the like of particles in the hopper, a discharge outlet from the hopper, and means for feeding the particles from the collector to the crusher and then to the hopper

  14. A New Time Calibration Method for Switched-capacitor-array-based Waveform Samplers.

    Science.gov (United States)

    Kim, H; Chen, C-T; Eclov, N; Ronzhin, A; Murat, P; Ramberg, E; Los, S; Moses, W; Choong, W-S; Kao, C-M

    2014-12-11

    We have developed a new time calibration method for the DRS4 waveform sampler that enables us to precisely measure the non-uniform sampling interval inherent in the switched-capacitor cells of the DRS4. The method uses the proportionality between the differential amplitude and sampling interval of adjacent switched-capacitor cells responding to a sawtooth-shape pulse. In the experiment, a sawtooth-shape pulse with a 40 ns period generated by a Tektronix AWG7102 is fed to a DRS4 evaluation board for calibrating the sampling intervals of all 1024 cells individually. The electronic time resolution of the DRS4 evaluation board with the new time calibration is measured to be ~2.4 ps RMS by using two simultaneous Gaussian pulses with 2.35 ns full-width at half-maximum and applying a Gaussian fit. The time resolution dependencies on the time difference with the new time calibration are measured and compared to results obtained by another method. The new method could be applicable for other switched-capacitor-array technology-based waveform samplers for precise time calibration.
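
    The idea can be sketched as follows: on the linear part of a sawtooth of known slope, the amplitude step recorded between two adjacent capacitor cells is proportional to the time that elapsed between them, so averaging the per-cell amplitude differences over many triggers yields the individual cell widths. The demo below uses a purely synthetic rising ramp; real DRS4 data additionally require rotating each waveform by its trigger cell and masking the sawtooth return, and the slope, jitter and noise figures are hypothetical.

```python
import numpy as np

def calibrate_cell_widths(waveforms, ramp_slope_v_per_ns):
    """Estimate per-cell sampling intervals of a switched-capacitor array.

    waveforms : 2D array (n_triggers, n_cells) of sawtooth amplitudes, already
                aligned so that column j always corresponds to physical cell j.
    On the linear part of the ramp, dt_j = dV_j / slope for adjacent cells.
    """
    dv = np.diff(waveforms, axis=1)                # adjacent-cell amplitude differences
    dv = np.where(dv > 0, dv, np.nan)              # drop steps on the sawtooth's falling edge
    return np.nanmean(dv, axis=0) / ramp_slope_v_per_ns   # one interval per cell (ns)

# Synthetic demonstration: true widths jitter around the nominal 1/(5 GS/s) = 0.2 ns
rng = np.random.default_rng(3)
true_dt = 0.2 + rng.normal(scale=0.02, size=1023)
slope = 0.025                                      # hypothetical ramp slope, V/ns
ramp = np.cumsum(np.concatenate([[0.0], true_dt])) * slope
waveforms = ramp + rng.normal(scale=0.0005, size=(2000, 1024))   # add electronics noise
est_dt = calibrate_cell_widths(waveforms, slope)
print("max |error| per cell:", np.max(np.abs(est_dt - true_dt)), "ns")
```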

  15. Prediction Interval: What to Expect When You're Expecting … A Replication.

    Directory of Open Access Journals (Sweden)

    Jeffrey R Spence

    Full Text Available A challenge when interpreting replications is determining whether the results of a replication "successfully" replicate the original study. Looking for consistency between two studies is challenging because individual studies are susceptible to many sources of error that can cause study results to deviate from each other and the population effect in unpredictable directions and magnitudes. In the current paper, we derive methods to compute a prediction interval, a range of results that can be expected in a replication due to chance (i.e., sampling error), for means and commonly used indexes of effect size: correlations and d-values. The prediction interval is calculable based on objective study characteristics (i.e., effect size of the original study and sample sizes of the original study and planned replication), even when sample sizes across studies are unequal. The prediction interval provides an a priori method for assessing if the difference between an original and replication result is consistent with what can be expected due to sample error alone. We provide open-source software tools that allow researchers, reviewers, replicators, and editors to easily calculate prediction intervals.
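
    For correlations the interval can be written down directly with Fisher's z transformation: the original r is transformed, widened by the combined sampling error of the original and planned replication, and transformed back. The sketch below follows that standard construction (the paper also covers means and d-values, which are not shown here); the example numbers are hypothetical.

```python
import math
from statistics import NormalDist

def replication_prediction_interval(r_original, n_original, n_replication, level=0.95):
    """Prediction interval for a replication correlation, via Fisher's z.

    Standard Fisher-z construction: transform r, widen by the combined sampling
    error of both studies, then back-transform to the correlation scale.
    """
    z = NormalDist().inv_cdf(0.5 + level / 2)
    zr = math.atanh(r_original)
    se = math.sqrt(1 / (n_original - 3) + 1 / (n_replication - 3))
    return math.tanh(zr - z * se), math.tanh(zr + z * se)

# Hypothetical example: original r = .45 with n = 80, planned replication with n = 120
print(replication_prediction_interval(0.45, 80, 120))
```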

  16. The influence of the anesthesia-to-stimulation time interval on seizure quality parameters in electroconvulsive therapy

    DEFF Research Database (Denmark)

    Jorgensen, A; Christensen, S J; Jensen, A E K

    2018-01-01

    BACKGROUND: Electroconvulsive therapy (ECT) continues to be the most efficacious treatment for severe depression and other life-threatening acute psychiatric conditions. Treatment efficacy is dependent upon the induced seizure quality, which may be influenced by a range of treatment related factors....... Recently, the time interval from anesthesia to the electrical stimulation (ASTI) has been suggested to be an important determinant of seizure quality. METHODS: We measured ASTI in 73 ECT sessions given to 22 individual patients, and analyzed its influence on five seizure quality parameters (EEG seizure...

  17. APTIMA assay on SurePath liquid-based cervical samples compared to endocervical swab samples facilitated by a real time database

    Directory of Open Access Journals (Sweden)

    Khader Samer

    2010-01-01

    samples transferred to APTIMA specimen transfer medium within seven days is sufficiently sensitive and specific to be used to screen for CT and GC. CT sensitivity may be somewhat reduced in samples from patients over 25 years. SP specimens retained in the original SP fixative for longer time intervals also may have decreased sensitivity, due to deterioration of RNA, but this was not assessed in this study. The ability to tap the live pathology database is a valuable tool that can be used to conduct clinical studies without a costly prospective clinical trial.

  18. Bootstrap confidence intervals for three-way methods

    NARCIS (Netherlands)

    Kiers, Henk A.L.

    Results from exploratory three-way analysis techniques such as CANDECOMP/PARAFAC and Tucker3 analysis are usually presented without giving insight into uncertainties due to sampling. Here a bootstrap procedure is proposed that produces percentile intervals for all output parameters. Special

  19. Transformation-cost time-series method for analyzing irregularly sampled data.

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations-with associated costs-to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  20. Transformation-cost time-series method for analyzing irregularly sampled data

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  1. Sampled-data-based vibration control for structural systems with finite-time state constraint and sensor outage.

    Science.gov (United States)

    Weng, Falu; Liu, Mingxin; Mao, Weijie; Ding, Yuanchun; Liu, Feifei

    2018-05-10

    The problem of sampled-data-based vibration control for structural systems with finite-time state constraint and sensor outage is investigated in this paper. The objective of designing controllers is to guarantee the stability and anti-disturbance performance of the closed-loop systems even when some sensor outages occur. Firstly, based on matrix transformation, the state-space model of structural systems with sensor outages and uncertainties appearing in the mass, damping and stiffness matrices is established. Secondly, considering that most earthquakes and strong winds act over a very short time, and that it is often the peak responses that damage structures, the finite-time stability analysis method is introduced to constrain the state responses within a given time interval, and H-infinity stability is adopted in the controller design to ensure that the closed-loop system has a prescribed level of disturbance attenuation performance during the whole control process. Furthermore, all stabilization conditions are expressed in the forms of linear matrix inequalities (LMIs), whose feasibility can be easily checked by using the LMI Toolbox. Finally, numerical examples are given to demonstrate the effectiveness of the proposed theorems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  2. The impact of the time interval on in-vitro fertilisation success after failure of the first attempt.

    Science.gov (United States)

    Bayoglu Tekin, Y; Ceyhan, S T; Kilic, S; Korkmaz, C

    2015-05-01

    The aim of this study was to identify the optimal time interval for in-vitro fertilisation that would increase treatment success after failure of the first attempt. This retrospective study evaluated 454 consecutive cycles of 227 infertile women who had two consecutive attempts within a 6-month period at an IVF centre. Data were collected on duration of stimulation, consumption of gonadotropin, numbers of retrieved oocytes, mature oocytes, fertilised eggs, good quality embryos on day 3/5 following oocyte retrieval and clinical and ongoing pregnancy. There were significant increases in clinical pregnancy rates at 2-, 3- and 4-month intervals. The maximum increase was after two menstrual cycles (p = 0.001). The highest rate of ongoing pregnancy was in women that had the second attempt after the next menstrual cycle following failure of IVF (27.2%). After IVF failure, initiating the next attempt within 2-4 months increases the clinical pregnancy rates.

  3. Sampling and Timing: A Task for the Environmental Process

    NARCIS (Netherlands)

    Hilderink, G.H.; Broenink, Johannes F.

    2003-01-01

    Sampling and timing is considered a responsibility of the environment of controller software. In this paper we will illustrate a concept whereby an environmental process and multi-way events play an important role in applying timing for untimed CSP software architectures. We use this timing concept

  4. Six Sessions of Sprint Interval Training Improves Running Performance in Trained Athletes.

    Science.gov (United States)

    Koral, Jerome; Oranchuk, Dustin J; Herrera, Roberto; Millet, Guillaume Y

    2018-03-01

    Koral, J, Oranchuk, DJ, Herrera, R, and Millet, GY. Six sessions of sprint interval training improves running performance in trained athletes. J Strength Cond Res 32(3): 617-623, 2018-Sprint interval training (SIT) is gaining popularity with endurance athletes. Various studies have shown that SIT allows for similar or greater endurance, strength, and power performance improvements than traditional endurance training but demands less time and volume. One of the main limitations in SIT research is that most studies were performed in a laboratory using expensive treadmills or ergometers. The aim of this study was to assess the performance effects of a novel short-term and highly accessible training protocol based on maximal shuttle runs in the field (SIT-F). Sixteen (12 male, 4 female) trained trail runners completed a 2-week procedure consisting of 4-7 bouts of 30 seconds at maximal intensity interspersed by 4 minutes of recovery, 3 times a week. Maximal aerobic speed (MAS), time to exhaustion at 90% of MAS before test (Tmax at 90% MAS), and 3,000-m time trial (TT3000m) were evaluated before and after training. Data were analyzed using a paired samples t-test, and Cohen's d effect sizes were calculated. Maximal aerobic speed improved by 2.3% (p = 0.01, d = 0.22), whereas peak power (PP) and mean power (MP) increased by 2.4% (p = 0.009, d = 0.33) and 2.8% (p = 0.002, d = 0.41), respectively. TT3000m was 6% shorter after training. Sprint interval training in the field significantly improved the 3,000-m run, time to exhaustion, PP, and MP in trained trail runners. Sprint interval training in the field is a time-efficient and cost-free means of improving both endurance and power performance in trained athletes.

  5. Reconstruction of dynamical systems from interspike intervals

    International Nuclear Information System (INIS)

    Sauer, T.

    1994-01-01

    Attractor reconstruction from interspike interval (ISI) data is described, in rough analogy with Takens' theorem for attractor reconstruction from time series. Assuming a generic integrate-and-fire model coupling the dynamical system to the spike train, there is a one-to-one correspondence between the system states and interspike interval vectors of sufficiently large dimension. The correspondence has an important implication: interspike intervals can be forecast from past history. We show that deterministically driven ISI series can be distinguished from stochastically driven ISI series on the basis of prediction error.
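
    The practical implication stated here, that interspike intervals can be forecast from past history, can be illustrated with a delay embedding of the ISI sequence and a nearest-neighbour predictor. The sketch below is a generic illustration under assumed choices (embedding dimension, logistic-map toy data, train/test split), not the construction used in the paper.

        import numpy as np

        def embed(isis, dim):
            """Build delay vectors [isi[t-dim+1], ..., isi[t]] from an ISI sequence."""
            return np.column_stack([isis[i:len(isis) - dim + 1 + i] for i in range(dim)])

        def nn_forecast(train_isis, history, dim):
            """Predict the next ISI as the successor of the nearest training delay vector."""
            vecs = embed(train_isis, dim)[:-1]                 # vectors with a known successor
            targets = train_isis[dim:]                         # the successor ISIs
            query = np.asarray(history[-dim:])
            nearest = np.argmin(np.linalg.norm(vecs - query, axis=1))
            return targets[nearest]

        # toy example: deterministic ISIs driven by a chaotic (logistic-map) input
        x, isis = 0.4, []
        for _ in range(2000):
            x = 3.9 * x * (1 - x)
            isis.append(0.5 + x)
        isis = np.array(isis)
        train, test = isis[:1500], isis[1500:]
        preds = [nn_forecast(train, isis[:1500 + k], 3) for k in range(50)]
        print("mean abs forecast error:", np.mean(np.abs(np.array(preds) - test[:50])))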

  6. Target Tracking of a Linear Time Invariant System under Irregular Sampling

    Directory of Open Access Journals (Sweden)

    Jin Xue-Bo

    2012-11-01

    Full Text Available Due to event-triggered sampling, or with the aim of reducing data storage, many tracking applications encounter irregular sampling times. By calculating the matrix exponential using an inverse Laplace transform, this paper transforms the irregular-sampling tracking problem into the problem of tracking a system with time-varying parameters. Using the common Kalman filter, the developed method is applied to track a target in both a simulated trajectory and video tracking. Simulation results show that good estimation performance is obtained even when the measurement sampling times are highly irregular.
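
    The central step, recomputing the state-transition matrix for each irregular interval, can be sketched with a constant-velocity model in which the transition matrix is the matrix exponential of the system matrix times the elapsed time. The code below is a minimal illustration under assumed noise levels, and it evaluates the exponential numerically rather than via an inverse Laplace transform as in the paper.

        import numpy as np
        from scipy.linalg import expm

        A = np.array([[0.0, 1.0],
                      [0.0, 0.0]])          # constant-velocity model: d/dt [pos, vel]
        H = np.array([[1.0, 0.0]])          # only position is measured
        Q0 = np.diag([1e-4, 1e-2])          # assumed continuous-time process noise (simplified)
        R = np.array([[0.05]])              # assumed measurement noise variance

        def kalman_irregular(times, measurements, x0, P0):
            """Kalman filter where F is recomputed as expm(A*dt) for each interval."""
            x, P, estimates = x0, P0, []
            t_prev = times[0]
            for t, z in zip(times, measurements):
                dt = t - t_prev
                F = expm(A * dt)                         # time-varying transition matrix
                x = F @ x
                P = F @ P @ F.T + Q0 * max(dt, 1e-9)     # crude discretization of process noise
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + K @ (np.atleast_1d(z) - H @ x)
                P = (np.eye(2) - K @ H) @ P
                estimates.append(x.copy())
                t_prev = t
            return np.array(estimates)

        # usage: irregularly spaced position measurements of a target moving at ~1 unit/s
        rng = np.random.default_rng(0)
        times = np.cumsum(rng.uniform(0.1, 1.5, size=40))       # event-triggered-like sampling
        z = 1.0 * times + rng.normal(0, 0.2, size=times.size)
        est = kalman_irregular(times, z, x0=np.array([0.0, 0.0]), P0=np.eye(2))
        print(est[-1])   # final [position, velocity] estimate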

  7. OPTIMIZATION OF THE OLSR ROUTING PROTOCOL IN WIRELESS MESH NETWORKS WITH AN ADAPTIVE REFRESHING TIME INTERVAL AND AN ENHANCED MULTI POINT RELAY SELECTING ALGORITHM

    Directory of Open Access Journals (Sweden)

    Faosan Mapa

    2014-01-01

    Full Text Available A Wireless Mesh Network (WMN) is a self-organized, self-configured, multi-hop network. The goal of a WMN is to offer users a wireless network that can easily communicate with conventional networks at high speed, with wide coverage and minimal upfront cost. An efficient routing protocol design is needed for WMNs that can adaptively support mesh routers and mesh clients. In this paper, we propose an optimization of the OLSR protocol, a proactive routing protocol, using a heuristic that improves OLSR through an adaptive refreshing time interval and an improved MPR selecting algorithm. An analysis of the OLSR protocol improved through the adaptive refreshing time interval and the modified MPR selecting algorithm shows a significant performance gain in throughput compared with the original OLSR protocol, at the cost of an increase in delay. From the simulations it can be concluded that OLSR can be optimized by modifying the selection of MPR nodes based on cost effectiveness and by adjusting the refreshing time interval of hello messages according to the prevailing conditions.

  8. A spreadsheet template compatible with Microsoft Excel and iWork Numbers that returns the simultaneous confidence intervals for all pairwise differences between multiple sample means.

    Science.gov (United States)

    Brown, Angus M

    2010-04-01

    The objective of the method described in this paper is to develop a spreadsheet template for the purpose of comparing multiple sample means. An initial analysis of variance (ANOVA) test on the data returns F, the test statistic. If F is larger than the critical F value drawn from the F distribution at the appropriate degrees of freedom, convention dictates rejection of the null hypothesis and allows subsequent multiple comparison testing to determine where the inequalities between the sample means lie. A variety of multiple comparison methods are described that return the 95% confidence intervals for differences between means using an inclusive pairwise comparison of the sample means. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.
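
    As a rough indication of the kind of computation such a template performs, the sketch below returns simultaneous confidence intervals for all pairwise mean differences using the pooled ANOVA variance and a Bonferroni adjustment. The Bonferroni choice is an assumption made here for simplicity; the paper describes several alternative multiple-comparison procedures.

        import itertools
        import numpy as np
        from scipy import stats

        def pairwise_mean_cis(groups, alpha=0.05):
            """Bonferroni-adjusted CIs for all pairwise differences between group means,
            using the pooled (within-group) variance from a one-way ANOVA."""
            k = len(groups)
            n = [len(g) for g in groups]
            means = [np.mean(g) for g in groups]
            df_within = sum(n) - k
            ss_within = sum(np.sum((np.asarray(g) - m) ** 2) for g, m in zip(groups, means))
            ms_within = ss_within / df_within                      # pooled variance estimate
            m_comparisons = k * (k - 1) // 2
            t_crit = stats.t.ppf(1 - alpha / (2 * m_comparisons), df_within)
            out = {}
            for i, j in itertools.combinations(range(k), 2):
                diff = means[i] - means[j]
                se = np.sqrt(ms_within * (1 / n[i] + 1 / n[j]))
                out[(i, j)] = (diff - t_crit * se, diff + t_crit * se)
            return out

        # usage with three hypothetical samples
        a = [5.1, 4.9, 5.4, 5.0, 5.2]
        b = [5.9, 6.1, 5.8, 6.0, 6.2]
        c = [5.0, 5.3, 5.1, 4.8, 5.2]
        for pair, ci in pairwise_mean_cis([a, b, c]).items():
            print(pair, ci)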

  9. Resampling methods in Microsoft Excel® for estimating reference intervals.

    Science.gov (United States)

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions, which lend themselves well to this purpose including recommended interpolation procedures for estimating 2.5 and 97.5 percentiles. 
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
 Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to a Gaussian distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.
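
    The resampling recipe summarized above translates directly into a few lines of code. The sketch below mirrors it in Python rather than Excel®: it repeatedly resamples the reference results with replacement and summarizes the bootstrap distribution of the 2.5th and 97.5th percentiles; the number of resamples follows the 500-1000 guidance in the abstract, while the lognormal toy data and the summary statistics reported are assumptions.

        import numpy as np

        def bootstrap_reference_interval(values, n_boot=1000, seed=0):
            """Bootstrap estimates (with a simple spread measure) of the 2.5th and
            97.5th percentiles that bound a 95% reference interval."""
            rng = np.random.default_rng(seed)
            values = np.asarray(values)
            lows, highs = np.empty(n_boot), np.empty(n_boot)
            for b in range(n_boot):
                resample = rng.choice(values, size=values.size, replace=True)
                lows[b] = np.percentile(resample, 2.5)
                highs[b] = np.percentile(resample, 97.5)
            return {
                "lower_limit": lows.mean(), "lower_limit_sd": lows.std(ddof=1),
                "upper_limit": highs.mean(), "upper_limit_sd": highs.std(ddof=1),
            }

        # usage: ~40 reference subjects, the skewed situation where resampling is advised
        rng = np.random.default_rng(42)
        reference_results = rng.lognormal(mean=1.0, sigma=0.4, size=40)
        print(bootstrap_reference_interval(reference_results))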

  10. A novel interval type-2 fractional order fuzzy PID controller: Design, performance evaluation, and its optimal time domain tuning.

    Science.gov (United States)

    Kumar, Anupam; Kumar, Vijay

    2017-05-01

    In this paper, a novel concept of an interval type-2 fractional order fuzzy PID (IT2FO-FPID) controller, which requires fractional order integrator and fractional order differentiator, is proposed. The incorporation of Takagi-Sugeno-Kang (TSK) type interval type-2 fuzzy logic controller (IT2FLC) with fractional controller of PID-type is investigated for time response measure due to both unit step response and unit load disturbance. The resulting IT2FO-FPID controller is examined on different delayed linear and nonlinear benchmark plants followed by robustness analysis. In order to design this controller, fractional order integrator-differentiator operators are considered as design variables including input-output scaling factors. A new hybridized algorithm named as artificial bee colony-genetic algorithm (ABC-GA) is used to optimize the parameters of the controller while minimizing weighted sum of integral of time absolute error (ITAE) and integral of square of control output (ISCO). To assess the comparative performance of the IT2FO-FPID, authors compared it against existing controllers, i.e., interval type-2 fuzzy PID (IT2-FPID), type-1 fractional order fuzzy PID (T1FO-FPID), type-1 fuzzy PID (T1-FPID), and conventional PID controllers. Furthermore, to show the effectiveness of the proposed controller, the perturbed processes along with the larger dead time are tested. Moreover, the proposed controllers are also implemented on multi input multi output (MIMO), coupled, and highly complex nonlinear two-link robot manipulator system in presence of un-modeled dynamics. Finally, the simulation results explicitly indicate that the performance of the proposed IT2FO-FPID controller is superior to its conventional counterparts in most of the cases. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Right Propositional Neighborhood Logic over Natural Numbers with Integer Constraints for Interval Lengths

    DEFF Research Database (Denmark)

    Bresolin, Davide; Goranko, Valentin; Montanari, Angelo

    2009-01-01

    Interval temporal logics are based on interval structures over linearly (or partially) ordered domains, where time intervals, rather than time instants, are the primitive ontological entities. In this paper we introduce and study Right Propositional Neighborhood Logic over natural numbers...... with integer constraints for interval lengths, which is a propositional interval temporal logic featuring a modality for the 'right neighborhood' relation between intervals and explicit integer constraints for interval lengths. We prove that it has the bounded model property with respect to ultimately periodic...

  12. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, that kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when voltage baseline is increased. For the EIF in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  13. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, that kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when voltage baseline is increased. For the EIF in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational
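
    To make the sampling picture concrete, the sketch below integrates a noisy exponential integrate-and-fire neuron with the Euler-Maruyama scheme and collects its interspike intervals, i.e. the 'samples' the theory refers to. The parameter values are generic textbook-style assumptions, not those used in the paper.

        import numpy as np

        def eif_isis(i_input, t_max=5000.0, dt=0.1, seed=0,
                     c_m=1.0, g_l=0.1, e_l=-65.0, v_t=-50.0, delta_t=2.0,
                     v_peak=-30.0, v_reset=-65.0, sigma=2.0):
            """Simulate a noisy exponential integrate-and-fire neuron (Euler-Maruyama,
            times in ms, voltages in mV) and return its interspike intervals.
            i_input is a function of time (ms) giving the injected current."""
            rng = np.random.default_rng(seed)
            v, last_spike, isis = e_l, None, []
            for k in range(int(t_max / dt)):
                t = k * dt
                # leak current + exponential sodium term + injected current
                dv = (-g_l * (v - e_l) + g_l * delta_t * np.exp((v - v_t) / delta_t)
                      + i_input(t)) / c_m
                v += dv * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                if v >= v_peak:                      # spike: record the ISI and reset
                    if last_spike is not None:
                        isis.append(t - last_spike)
                    last_spike, v = t, v_reset
            return np.array(isis)

        # usage: a slowly fluctuating input, loosely mimicking a strongly driven (UP-state-like) regime
        isis = eif_isis(lambda t: 1.8 + 0.3 * np.sin(2 * np.pi * t / 500.0))
        print(len(isis), "ISIs; mean ISI =", round(float(isis.mean()), 2), "ms")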

  14. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    Directory of Open Access Journals (Sweden)

    Jean-Louis eHoneine

    2014-10-01

    Full Text Available Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs, (b) move from allocentric to egocentric reference or vice versa, and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, in the order of 1-2 s for different postural conditions, modalities and deliberate or passive shift. They are just longer for haptic than visual shift, just shorter on withdrawal than on addition of stabilizing input, and on deliberate than unexpected mode. The delays are the shortest (for haptic shift) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training

  15. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    Science.gov (United States)

    Honeine, Jean-Louis; Schieppati, Marco

    2014-01-01

    Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from allocentric to egocentric reference or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, in the order of 1–2 s for different postural conditions, modalities and deliberate or passive shift. They are just longer for haptic than visual shift, just shorter on withdrawal than on addition of stabilizing input, and on deliberate than unexpected mode. The delays are the shortest (for haptic shift) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training devices

  16. The Applicability of Confidence Intervals of Quantiles for the Generalized Logistic Distribution

    Science.gov (United States)

    Shin, H.; Heo, J.; Kim, T.; Jung, Y.

    2007-12-01

    The generalized logistic (GL) distribution has been widely used for frequency analysis. However, few studies have addressed the confidence intervals of quantiles, which indicate the prediction accuracy, for the GL distribution. In this paper, the estimation of the confidence intervals of quantiles for the GL distribution is presented based on the method of moments (MOM), maximum likelihood (ML), and probability weighted moments (PWM), and the asymptotic variances of each quantile estimator are derived as functions of the sample sizes, return periods, and parameters. Monte Carlo simulation experiments are also performed to verify the applicability of the derived confidence intervals of quantiles. The results show that the relative bias (RBIAS) and relative root mean square error (RRMSE) of the confidence intervals generally increase as the return period increases and decrease as the sample size increases. PWM performs better than the other methods in terms of RRMSE when the data are almost symmetric, while ML shows the smallest RBIAS and RRMSE when the data are more skewed and the sample size is moderately large. The GL model was applied to fit the distribution of annual maximum rainfall data. The results show little difference in the estimated quantiles between ML and PWM, while MOM gives distinctly different quantiles.

  17. Blood and Plasma Biochemistry Reference Intervals for Wild Juvenile American Alligators ( Alligator mississippiensis ).

    Science.gov (United States)

    Hamilton, Matthew T; Kupar, Caitlin A; Kelley, Meghan D; Finger, John W; Tuberville, Tracey D

    2016-07-01

    American alligators (Alligator mississippiensis) are one of the most studied crocodilian species in the world, yet blood and plasma biochemistry information is limited for juvenile alligators in their northern range, where individuals may be exposed to extreme abiotic and biotic stressors. We collected blood samples over a 2-yr period from 37 juvenile alligators in May, June, and July to establish reference intervals for 22 blood and plasma analytes. We observed no effect of either sex or blood collection time on any analyte investigated. However, our results indicate a significant correlation between a calculated body condition index and aspartate aminotransferase and creatine kinase. Glucose, total protein, and potassium varied significantly between sampling sessions. In addition, glucose and potassium were highly correlated between the two point-of-care devices used, although they were significantly lower with the i-STAT 1 CG8+ cartridge than with the Vetscan VS2 Avian/Reptile Rotor. The reference intervals presented herein should provide baseline data for evaluating wild juvenile alligators in the northern portion of their range.

  18. The Nordic Maintenance Care Program - Time intervals between treatments of patients with low back pain: how close and who decides?

    Directory of Open Access Journals (Sweden)

    Leboeuf-Yde Charlotte

    2010-03-01

    Full Text Available Abstract Background The management of chiropractic patients with acute and chronic/persistent conditions probably differs. However, little is known on this subject. There is, for example, a dearth of information on maintenance care (MC. Thus it is not known if patients on MC are coerced to partake in a program of frequent treatments over a long period of time, or if they are actively involved in designing their own individualized treatment program. Objectives It was the purpose of this study to investigate how chiropractic patients with low back pain were scheduled for treatment, with special emphasis on MC. The specific research questions were: 1. How many patients are on maintenance care? 2 Are there specific patterns of intervals between treatments for patients and, if so, do they differ between MC patients and non-MC patients? 3. Who decides on the next treatment, the patient, the chiropractor or both, and are there any differences between MC patients and non-MC patients? Methods Chiropractic students, who during their summer holidays were observers in chiropractic clinics in Norway and Denmark, recorded whether patients were classified by the treating chiropractor as a MC-patient or not, dates for last and subsequent visits, and made a judgement on whether the patient or the chiropractor decided on the next appointment. Results Observers in the study were 16 out of 30 available students. They collected data on 868 patients from 15 Danish and 13 Norwegian chiropractors. Twenty-two percent and 26%, respectively, were classified as MC patients. Non-MC patients were most frequently seen within 1 week. For MC patients, the previous visit was most often 2-4 weeks prior to the actual visit, and the next appointment between 1 and 3 months. This indicates a gradual increase in intervals. The decision of the next visit was mainly made by the chiropractor, also for MC patients. However, the study samples of chiropractors appear not to be

  19. A simple method for regional cerebral blood flow measurement by one-point arterial blood sampling and 123I-IMP microsphere model (part 2). A study of time correction of one-point blood sample count

    International Nuclear Information System (INIS)

    Masuda, Yasuhiko; Makino, Kenichi; Gotoh, Satoshi

    1999-01-01

    In our previous paper regarding determination of the regional cerebral blood flow (rCBF) using the 123I-IMP microsphere model, we reported that the accuracy of determination of the integrated value of the input function from one-point arterial blood sampling can be increased by performing correction using the 5 min:29 min ratio for the whole-brain count. However, failure to carry out the arterial blood collection at exactly 5 minutes after 123I-IMP injection causes errors with this method, and there is thus a time limitation. We have now revised our method so that the one-point arterial blood sampling can be performed at any time during the interval between 5 minutes and 20 minutes after 123I-IMP injection, with the addition of a correction step for the sampling time. This revised method permits more accurate estimation of the integral of the input function. The method was then applied to 174 experimental subjects: one-point blood samples were collected at random times between 5 and 20 minutes, and the estimated values for the continuous arterial octanol extraction count (COC) were determined. The mean error rate between the COC and the actual measured continuous arterial octanol extraction count (OC) was 3.6%, and the standard deviation was 12.7%. Accordingly, in 70% of the cases the rCBF could be estimated within an error rate of 13%, while estimation was possible in 95% of the cases within an error rate of 25%. This improved method is a simple technique for determination of the rCBF by the 123I-IMP microsphere model and one-point arterial blood sampling which no longer has a time limitation and does not require any octanol extraction step. (author)

  20. Method of high precision interval measurement in pulse laser ranging system

    Science.gov (United States)

    Wang, Zhen; Lv, Xin-yuan; Mao, Jin-jin; Liu, Wei; Yang, Dong

    2013-09-01

    Pulsed laser ranging offers high measurement precision, fast measurement speed, no need for cooperative targets, and strong resistance to electromagnetic interference; the time measurement is the key parameter affecting the performance of the whole system. The precision of a pulsed laser ranging system is determined by the precision of its time interval measurement. This paper introduces the principal structure of a laser ranging system and establishes a method of high-precision time interval measurement for pulsed laser ranging. Based on an analysis of the factors that affect range measurement precision, a pulse rising-edge discriminator is adopted to produce the timing marks for start-stop time discrimination, and a TDC-GP2 high-precision interval measurement system based on a TMS320F2812 DSP is designed to improve the measurement precision. Experimental results indicate that the time interval measurement method presented in this paper achieves higher range accuracy. Compared with traditional time interval measurement systems, the method simplifies the system design, reduces the influence of bad weather conditions, and satisfies the requirements of low cost and miniaturization.

  1. Non-Cartesian MRI scan time reduction through sparse sampling

    NARCIS (Netherlands)

    Wajer, F.T.A.W.

    2001-01-01

    Non-Cartesian MRI Scan-Time Reduction through Sparse Sampling Magnetic resonance imaging (MRI) signals are measured in the Fourier domain, also called k-space. Samples of the MRI signal can not be taken at will, but lie along k-space trajectories determined by the magnetic field gradients. MRI

  2. Brain response during the M170 time interval is sensitive to socially relevant information.

    Science.gov (United States)

    Arviv, Oshrit; Goldstein, Abraham; Weeting, Janine C; Becker, Eni S; Lange, Wolf-Gero; Gilboa-Schechtman, Eva

    2015-11-01

    Deciphering the social meaning of facial displays is a highly complex neurological process. The M170, an event related field component of MEG recording, like its EEG counterpart N170, was repeatedly shown to be associated with structural encoding of faces. However, the scope of information encoded during the M170 time window is still being debated. We investigated the neuronal origin of facial processing of integrated social rank cues (SRCs) and emotional facial expressions (EFEs) during the M170 time interval. Participants viewed integrated facial displays of emotion (happy, angry, neutral) and SRCs (indicated by upward, downward, or straight head tilts). We found that the activity during the M170 time window is sensitive to both EFEs and SRCs. Specifically, highly prominent activation was observed in response to SRC connoting dominance as compared to submissive or egalitarian head cues. Interestingly, the processing of EFEs and SRCs appeared to rely on different circuitry. Our findings suggest that vertical head tilts are processed not only for their sheer structural variance, but as social information. Exploring the temporal unfolding and brain localization of non-verbal cues processing may assist in understanding the functioning of the social rank biobehavioral system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Convex Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper, convex interval games are introduced and some characterizations are given. Some economic situations leading to convex interval games are discussed. The Weber set and the Shapley value are defined for a suitable class of interval games and their relations with the interval core for

  4. Spectral of electrocardiographic RR intervals to indicate atrial fibrillation

    Science.gov (United States)

    Nuryani, Nuryani; Satrio Nugroho, Anto

    2017-11-01

    Atrial fibrillation is a serious heart disease, which is associated with a risk of death, and thus early detection of atrial fibrillation is necessary. We have investigated the spectral pattern of the electrocardiogram in relation to atrial fibrillation. The utilized feature of the electrocardiogram is the RR interval, which is the time interval between two consecutive R peaks. A series of RR intervals in a time segment is converted into a frequency-domain signal. The frequency components are investigated to find those which are significantly associated with atrial fibrillation. A segment is defined as an atrial fibrillation or a normal segment according to the number of atrial fibrillation RR intervals it contains. Using clinical data of 23 patients with atrial fibrillation, we find that the frequency components can be used to indicate atrial fibrillation.
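
    A minimal version of such a spectral feature can be obtained by taking the power spectrum of a detrended segment of RR intervals indexed by beat number. The sketch below is a generic illustration with synthetic data, not the authors' pipeline; the segment length and band edges are assumptions.

        import numpy as np

        def rr_band_power(rr_segment, low=0.2, high=0.4):
            """Fraction of RR-series power in a given band (cycles per beat),
            computed from the FFT of a detrended RR-interval segment."""
            rr = np.asarray(rr_segment, dtype=float)
            rr = rr - rr.mean()                              # remove the DC component
            power = np.abs(np.fft.rfft(rr)) ** 2
            freqs = np.fft.rfftfreq(rr.size, d=1.0)          # frequency axis in cycles/beat
            band = (freqs >= low) & (freqs <= high)
            return power[band].sum() / power.sum()

        # synthetic segments: regular sinus-like RR series vs. an irregular (AF-like) RR series
        rng = np.random.default_rng(3)
        normal_rr = 0.8 + 0.02 * np.sin(2 * np.pi * 0.1 * np.arange(64))
        af_rr = 0.8 + 0.15 * rng.standard_normal(64)
        print("normal:", round(rr_band_power(normal_rr), 3),
              " AF-like:", round(rr_band_power(af_rr), 3))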

  5. Overconfidence in Interval Estimates

    Science.gov (United States)

    Soll, Jack B.; Klayman, Joshua

    2004-01-01

    Judges were asked to make numerical estimates (e.g., "In what year was the first flight of a hot air balloon?"). Judges provided high and low estimates such that they were X% sure that the correct answer lay between them. They exhibited substantial overconfidence: The correct answer fell inside their intervals much less than X% of the time. This…

  6. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    Science.gov (United States)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.

  7. Effect of time interval between capecitabine intake and radiotherapy on local recurrence-free survival in preoperative chemoradiation for locally advanced rectal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeon Joo; Kim, Jong Hoon; Yu, Chang Sik; Kim, Tae Won; Jang, Se Jin; Choi, Eun Kyung; Kim, Jin Cheon [Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Choi, Won Sik [University of Ulsan College of Medicine, Gangneung (Korea, Republic of)

    2017-06-15

    The concentration of capecitabine peaks at 1–2 hours after administration. We therefore assumed that proper timing of capecitabine administration and radiotherapy would maximize radiosensitization and influence survival among patients with locally advanced rectal cancer. We retrospectively reviewed 223 patients with locally advanced rectal cancer who underwent preoperative chemoradiation, followed by surgery from January 2002 to May 2006. All patients underwent pelvic radiotherapy (50 Gy/25 fractions) and received capecitabine twice daily at 12-hour intervals (1,650 mg/m2/day). Patients were divided into two groups according to the time interval between capecitabine intake and radiotherapy. Patients who took capecitabine 1 hour before radiotherapy were classified as Group A (n = 109); all others were classified as Group B (n = 114). The median follow-up period was 72 months (range, 7 to 149 months). Although Group A had a significantly higher rate of good responses (44% vs. 25%; p = 0.005), the 5-year local recurrence-free survival rates of 93% in Group A and 97% in Group B did not differ significantly (p = 0.519). The 5-year disease-free survival and overall survival rates were also comparable between the groups. Despite the better pathological response in Group A, the time interval between capecitabine and radiotherapy administration did not have a significant effect on survivals. Further evaluations are needed to clarify the interaction of these treatment modalities.

  8. Using the Initial Systolic Time Interval to assess cardiac autonomic function in Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Jan H. Meijer

    2011-12-01

    Full Text Available The Initial Systolic Time Interval (ISTI) has been defined as the time difference between the peak electrical and peak mechanical activity of the heart. ISTI is obtained from the electrocardiogram and the impedance cardiogram. The response of ISTI while breathing at rest and to a deep breathing stimulus was studied in a group of patients suffering from Parkinson's disease (PD) and a group of healthy control subjects. ISTI showed substantial variability during these manoeuvres. The tests showed that the variability of RR and ISTI was substantially different between PD patients and controls. It is hypothesized that in PD patients the sympathetic system compensates for the loss of regulatory control function of the blood-pressure by the parasympathetic system. It is concluded that ISTI is a practical, additional and independent parameter that can be used to assist other tests in evaluating autonomic control of the heart in PD patients. doi:10.5617/jeb.216 J Electr Bioimp, vol. 2, pp. 98-101, 2011

  9. In-Sample Confidence Bands and Out-of-Sample Forecast Bands for Time-Varying Parameters in Observation Driven Models

    NARCIS (Netherlands)

    Blasques, F.; Koopman, S.J.; Lasak, K.A.; Lucas, A.

    2016-01-01

    We study the performances of alternative methods for calculating in-sample confidence and out-of-sample forecast bands for time-varying parameters. The in-sample bands reflect parameter uncertainty, while the out-of-sample bands reflect not only parameter uncertainty, but also innovation

  10. The prognostic value of the QT interval and QT interval dispersion in all-cause and cardiac mortality and morbidity in a population of Danish citizens.

    Science.gov (United States)

    Elming, H; Holm, E; Jun, L; Torp-Pedersen, C; Køber, L; Kircshoff, M; Malik, M; Camm, J

    1998-09-01

    To evaluate the prognostic value of the QT interval and QT interval dispersion in total and in cardiovascular mortality, as well as in cardiac morbidity, in a general population. The QT interval was measured in all leads from a standard 12-lead ECG in a random sample of 1658 women and 1797 men aged 30-60 years. QT interval dispersion was calculated from the maximal difference between QT intervals in any two leads. All cause mortality over 13 years, and cardiovascular mortality as well as cardiac morbidity over 11 years, were the main outcome parameters. Subjects with a prolonged QT interval (430 ms or more) or prolonged QT interval dispersion (80 ms or more) were at higher risk of cardiovascular death and cardiac morbidity than subjects whose QT interval was less than 360 ms, or whose QT interval dispersion was less than 30 ms. Cardiovascular death relative risk ratios, adjusted for age, gender, myocardial infarct, angina pectoris, diabetes mellitus, arterial hypertension, smoking habits, serum cholesterol level, and heart rate were 2.9 for the QT interval (95% confidence interval 1.1-7.8) and 4.4 for QT interval dispersion (95% confidence interval 1.0-19.1). Fatal and non-fatal cardiac morbidity relative risk ratios were similar, at 2.7 (95% confidence interval 1.4-5.5) for the QT interval and 2.2 (95% confidence interval 1.1-4.0) for QT interval dispersion. Prolongation of the QT interval and QT interval dispersion independently affected the prognosis of cardiovascular mortality and cardiac fatal and non-fatal morbidity in a general population over 11 years.

  11. Restricted Interval Valued Neutrosophic Sets and Restricted Interval Valued Neutrosophic Topological Spaces

    Directory of Open Access Journals (Sweden)

    Anjan Mukherjee

    2016-08-01

    Full Text Available In this paper we introduce the concept of restricted interval valued neutrosophic sets (RIVNS in short). Some basic operations and properties of RIVNS are discussed. The concept of restricted interval valued neutrosophic topology is also introduced, together with restricted interval valued neutrosophic finer and restricted interval valued neutrosophic coarser topology. We also define the restricted interval valued neutrosophic interior and closure of a restricted interval valued neutrosophic set. Some theorems and examples are cited. Restricted interval valued neutrosophic subspace topology is also studied.

  12. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

    The objectives of this study are to develop a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods, which can supplement the current deterministic methods, and to improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in other studies is presented, and its application to an example system demonstrates its feasibility. In the second year of this study, sensitivity analyses of the component failure factors are performed on the basis of the assessment methodology of the first year, and the interaction between STI and AOT is modeled and quantified. In addition, the reliability assessment methodology for the diesel generator is reviewed and applied to the PSA code.

  13. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang

    1997-07-01

    The objectives of this study are to develop a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods, which can supplement the current deterministic methods, and to improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as a basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in other studies is presented, and its application to an example system demonstrates its feasibility. In the second year of this study, sensitivity analyses of the component failure factors are performed on the basis of the assessment methodology of the first year, and the interaction between STI and AOT is modeled and quantified. In addition, the reliability assessment methodology for the diesel generator is reviewed and applied to the PSA code

  14. Clinical and Biological Features of Interval Colorectal Cancer

    Directory of Open Access Journals (Sweden)

    Yu Mi Lee

    2017-05-01

    Full Text Available Interval colorectal cancer (I-CRC is defined as a CRC diagnosed within 60 months after a negative colonoscopy, taking into account that 5 years is the “mean sojourn time.” It is important to prevent the development of interval cancer. The development of interval colon cancer is associated with female sex, old age, family history of CRC, comorbidities, diverticulosis, and the skill of the endoscopist. During carcinogenesis, sessile serrated adenomas/polyps (SSA/Ps share many genomic and colonic site characteristics with I-CRCs. The clinical and biological features of I-CRC should be elucidated to prevent the development of interval colon cancer.

  15. Response-rate differences in variable-interval and variable-ratio schedules: An old problem revisited

    OpenAIRE

    Cole, Mark R.

    1994-01-01

    In Experiment 1, a variable-ratio 10 schedule became, successively, a variable-interval schedule with only the minimum interreinforcement intervals yoked to the variable ratio, or a variable-interval schedule with both interreinforcement intervals and reinforced interresponse times yoked to the variable ratio. Response rates in the variable-interval schedule with both interreinforcement interval and reinforced interresponse time yoking fell between the higher rates maintained by the variable-...

  16. Closeness-Centrality-Based Synchronization Criteria for Complex Dynamical Networks With Interval Time-Varying Coupling Delays.

    Science.gov (United States)

    Park, Myeongjin; Lee, Seung-Hoon; Kwon, Oh-Min; Seuret, Alexandre

    2017-09-06

    This paper investigates synchronization in complex dynamical networks (CDNs) with interval time-varying delays. CDNs are representative of systems composed of a large number of interconnected dynamical units, and for the purpose of mathematical analysis the first step is to model them as graphs whose nodes represent the dynamical units. Here, we take note of the importance of each node in the network; in this paper, the closeness centrality known from the social sciences is grafted onto the CDNs. By constructing a suitable Lyapunov-Krasovskii functional and utilizing some mathematical techniques, sufficient, closeness-centrality-based conditions for synchronization stability of the networks are established in terms of linear matrix inequalities. In this way, the coupling can be weighted not only by the interconnection relations among the nodes, as in existing works, but also by additional information about the nodes, namely their centrality. Moreover, to avoid the computational burden caused by the nonconvex term involving the square of the time-varying delay, that term is bounded by a convex term involving the time-varying delay. Finally, two illustrative examples are given to show the advantage of the closeness centrality in terms of robustness to time delay.
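
    The idea of grafting closeness centrality onto the network model can be illustrated in a few lines with networkx. The sketch below only computes node centralities and folds them into a weighted outer-coupling matrix under an assumed weighting rule; it says nothing about the Lyapunov-Krasovskii analysis itself.

        import networkx as nx
        import numpy as np

        # a small example coupling topology for the dynamical network
        G = nx.erdos_renyi_graph(n=10, p=0.4, seed=1)

        # closeness centrality of each node (higher = "closer" to the rest of the network)
        closeness = nx.closeness_centrality(G)

        # fold the centralities into the outer coupling matrix: edge (i, j) is weighted by
        # the centralities of its endpoints (assumed rule); diagonal set so rows sum to zero
        n = G.number_of_nodes()
        L = np.zeros((n, n))
        for i, j in G.edges():
            w = 0.5 * (closeness[i] + closeness[j])
            L[i, j] = L[j, i] = w
        np.fill_diagonal(L, -L.sum(axis=1))

        print(np.round(L, 2))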

  17. Readout electronics for fine dE/dx sampling with a longitudinal drift chamber

    International Nuclear Information System (INIS)

    Imanishi, A.; Shiino, K.; Ishii, T.; Ohshima, T.; Okuno, H.

    1982-04-01

    A fast, low noise preamplifier, a signal shaping filter and a fast sampling ADC circuit have been developed for fine sampling dE/dx measurement with a longitudinal drift chamber. dE/dx has been sampled with a time interval of 40 ns, which corresponds to a gas thickness of 1.4 mm in Ar(90%)/CH4(10%). The parameters of each circuit have been adjusted to match this sampling interval. It is found that signal tail cancellation is crucial to obtain better dE/dx resolution when a wider drift space is used, and this can be realized with a pole-zero shortening filter and a semi-Gaussian shaping integrator. (author)
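
    In digital form, the tail cancellation mentioned here amounts to a pole-zero filter: a zero placed at the long decay constant of the chamber signal and a pole at the desired shorter one. The sketch below shows that generic idea with assumed time constants; it is not the analog circuitry described in the paper.

        import numpy as np
        from scipy.signal import lfilter

        def pole_zero_shorten(x, dt_ns, tau_long_ns, tau_short_ns):
            """Replace an exponential tail of time constant tau_long with tau_short,
            using a first-order digital pole-zero cancellation filter."""
            z = np.exp(-dt_ns / tau_long_ns)       # zero cancels the long tail
            p = np.exp(-dt_ns / tau_short_ns)      # pole re-introduces a short tail
            return lfilter([1.0, -z], [1.0, -p], x)

        # usage: a chamber-like pulse with a long ion tail, sampled every 40 ns
        dt = 40.0
        t = np.arange(0, 4000.0, dt)
        pulse = np.exp(-t / 800.0) - np.exp(-t / 40.0)     # assumed fast rise, slow tail
        shortened = pole_zero_shorten(pulse, dt, tau_long_ns=800.0, tau_short_ns=80.0)
        print(round(float(pulse[25]), 4), "->", round(float(shortened[25]), 4))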

  18. RR-Interval variance of electrocardiogram for atrial fibrillation detection

    Science.gov (United States)

    Nuryani, N.; Solikhah, M.; Nugoho, A. S.; Afdala, A.; Anzihory, E.

    2016-11-01

    Atrial fibrillation is a serious heart problem originating from the upper chambers of the heart. A common indication of atrial fibrillation is irregularity of the R-peak-to-R-peak time interval, known as the RR interval. This irregularity can be represented by the variance or spread of the RR intervals. This article presents a system to detect atrial fibrillation using such variances. Using clinical data of patients with atrial fibrillation attacks, it is shown that the variance of the electrocardiographic RR intervals is higher during atrial fibrillation than during normal rhythm. Utilizing a simple detection technique and the variances of RR intervals, we obtain good atrial fibrillation detection performance.
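
    A bare-bones version of a variance-based detector can be written in a few lines: slide a window over the RR series, compute its variance, and flag windows whose variance exceeds a threshold. The sketch below is a generic illustration with synthetic data; the window length and threshold are assumptions, not values from the paper.

        import numpy as np

        def flag_af_windows(rr, window=32, threshold=0.005):
            """Return a boolean flag per sliding window: True where the RR variance
            exceeds the threshold (a crude atrial-fibrillation indicator)."""
            rr = np.asarray(rr, dtype=float)
            flags = []
            for start in range(0, rr.size - window + 1):
                flags.append(np.var(rr[start:start + window]) > threshold)
            return np.array(flags)

        # synthetic RR series (seconds): sinus rhythm followed by an irregular AF-like episode
        rng = np.random.default_rng(7)
        sinus = 0.80 + 0.02 * rng.standard_normal(200)
        af = 0.70 + 0.15 * rng.standard_normal(200)
        rr = np.concatenate([sinus, af])
        flags = flag_af_windows(rr)
        print("flagged fraction, first half:", flags[:150].mean(),
              " second half:", flags[-150:].mean())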

  19. MK-801 and memantine act differently on short-term memory tested with different time-intervals in the Morris water maze test.

    Science.gov (United States)

    Duda, Weronika; Wesierska, Malgorzata; Ostaszewski, Pawel; Vales, Karel; Nekovarova, Tereza; Stuchlik, Ales

    2016-09-15

    N-methyl-d-aspartate receptors (NMDARs) play a crucial role in spatial memory formation. In neuropharmacological studies their functioning strongly depends on testing conditions and the dosage of NMDAR antagonists. The aim of this study was to assess the immediate effects of NMDAR block by (+)MK-801 or memantine on short-term allothetic memory. Memory was tested in a working memory version of the Morris water maze test. In our version of the test, rats underwent one day of training with 8 trials, and then three experimental days when rats were injected intraperitoneally with low- 5 (MeL), high - 20 (MeH) mg/kg memantine, 0.1mg/kg MK-801 or 1ml/kg saline (SAL) 30min before testing, for three consecutive days. On each experimental day there was just one acquisition and one test trial, with an inter-trial interval of 5 or 15min. During training the hidden platform was relocated after each trial and during the experiment after each day. The follow-up effect was assessed on day 9. Intact rats improved their spatial memory across the one training day. With a 5min interval MeH rats had longer latency then all rats during retrieval. With a 15min interval the MeH rats presented worse working memory measured as retrieval minus acquisition trial for path than SAL and MeL and for latency than MeL rats. MK-801 rats had longer latency than SAL during retrieval. Thus, the high dose of memantine, contrary to low dose of MK-801 disrupts short-term memory independent on the time interval between acquisition and retrieval. This shows that short-term memory tested in a working memory version of water maze is sensitive to several parameters: i.e., NMDA receptor antagonist type, dosage and the time interval between learning and testing. Copyright © 2016. Published by Elsevier B.V.

  20. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    Science.gov (United States)

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of roadkills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted for from 6.8% (bats) to 29.7% (small birds) of road segments, concentrating up to 60% of roadkills (lizards, lagomorphs, carnivores). Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. The decay of spatial accuracy with increasing time interval between surveys was faster for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. Copyright © 2015 Elsevier Ltd. All rights reserved.
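
    A small Python sketch of the hotspot rule described above (segment counts compared against an upper 95% limit under a Poisson assumption) is given below; the synthetic counts and the exact form of the limit are illustrative assumptions rather than the paper's implementation.

    ```python
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(1)
    counts = rng.poisson(lam=3.0, size=74)   # synthetic roadkills per 500-m segment
    counts[[5, 20, 40]] += 12                # inject a few obvious hotspots

    # Segments whose count exceeds the upper 95% limit of a Poisson distribution
    # parameterised by the observed mean count per segment.
    upper_limit = poisson.ppf(0.95, mu=counts.mean())
    hotspots = np.flatnonzero(counts > upper_limit)
    print("threshold:", upper_limit, "hotspot segments:", hotspots)
    ```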

  1. Estimating reliable paediatric reference intervals in clinical chemistry and haematology.

    Science.gov (United States)

    Ridefelt, Peter; Hellberg, Dan; Aldrimer, Mattias; Gustafsson, Jan

    2014-01-01

    Very few high-quality studies on paediatric reference intervals for general clinical chemistry and haematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The present review summarises current reference interval studies for common clinical chemistry and haematology analyses. ©2013 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  2. Eigenvalue sensitivity of sampled time systems operating in closed loop

    Science.gov (United States)

    Bernal, Dionisio

    2018-05-01

    The use of feedback to create closed-loop eigenstructures with high sensitivity has received some attention in the Structural Health Monitoring field. Although practical implementation is necessarily digital, and thus in sampled time, work thus far has centered on the continuous time framework, both in design and in checking performance. It is shown in this paper that the performance in discrete time, at typical sampling rates, can differ notably from that anticipated in the continuous time formulation, and that discrepancies can be particularly large in the real part of the eigenvalue sensitivities; a consequence is a significant error in the (linear) estimate of the level of damage at which closed-loop stability is lost. As one anticipates, explicit consideration of the sampling rate poses no special difficulties in the closed-loop eigenstructure design, and the relevant expressions are developed in the paper, including a formula for the efficient evaluation of the derivative of the matrix exponential based on the theory of complex perturbations. The paper presents an easily reproduced numerical example showing the level of error that can result when the discrete time implementation of the controller is not considered.
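
    The abstract mentions evaluating the derivative of the matrix exponential via complex perturbations; below is a generic complex-step sketch of that idea in Python for a state matrix depending on one scalar parameter. The example system and step size are assumptions, and the formula is the textbook complex-step estimate rather than the paper's specific expression.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def dexpm_dp(A_of_p, p, dt, h=1e-20):
        """Complex-step estimate of d/dp expm(A(p) * dt) for a scalar parameter p.

        Valid when A(p) is analytic in p; h is the tiny imaginary perturbation."""
        return np.imag(expm(A_of_p(p + 1j * h) * dt)) / h

    # Illustrative single-degree-of-freedom system whose stiffness is the parameter p.
    def A_of_p(p):
        return np.array([[0.0, 1.0],
                         [-p, -0.1]], dtype=complex)

    print(dexpm_dp(A_of_p, p=4.0, dt=0.01))
    ```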

  3. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    Science.gov (United States)

    Abu-Zinadah, Hanaa H.

    2017-08-01

    In several industries the product comes from more than one production line, and comparative life tests are required. This calls for sampling from the different production lines, which naturally leads to a joint censoring scheme. In this article we consider the Pareto lifetime distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals of the model parameters, are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes. Monte Carlo results from simulation studies are presented to assess the performance of our proposed method.
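
    The joint type-II censoring machinery is beyond a short example, but the uncensored building block is easy to show: below is a hedged Python sketch of the maximum likelihood estimator of the Pareto shape parameter (with a known scale) together with a Wald-type approximate confidence interval. It ignores censoring and the Bayesian analysis entirely.

    ```python
    import numpy as np

    def pareto_shape_mle_ci(x, x_m, z=1.96):
        """MLE of the Pareto shape (known scale x_m) with an approximate 95% CI.

        Uncensored data only; the jointly type-II censored likelihood of the
        paper would modify both the estimator and its standard error."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        alpha_hat = n / np.sum(np.log(x / x_m))
        se = alpha_hat / np.sqrt(n)          # asymptotic standard error
        return alpha_hat, (alpha_hat - z * se, alpha_hat + z * se)

    rng = np.random.default_rng(2)
    sample = 1.0 * (1.0 + rng.pareto(a=2.5, size=200))   # Pareto, scale 1, shape 2.5
    print(pareto_shape_mle_ci(sample, x_m=1.0))
    ```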

  4. Weighted statistical parameters for irregularly sampled time series

    Science.gov (United States)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
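
    The paper's weighting scheme adapts to both sampling density and noise; the Python sketch below implements only the density part, giving each sample a weight proportional to half the time gap to its neighbours (trapezoidal weights). It is a simplified stand-in for, not a reproduction of, the suggested scheme.

    ```python
    import numpy as np

    def gap_weights(t):
        """Trapezoidal weights: each sample represents half the gap to each neighbour."""
        t = np.asarray(t, dtype=float)
        gaps = np.diff(t)
        w = np.empty_like(t)
        w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
        w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
        return w / w.sum()

    def weighted_mean_std(t, y):
        w = gap_weights(t)
        mean = np.sum(w * y)
        return mean, np.sqrt(np.sum(w * (y - mean) ** 2))

    # Clumped sampling of a slow sine wave: unweighted statistics are pulled
    # toward the densely sampled clump, the weighted ones much less so.
    rng = np.random.default_rng(3)
    t = np.sort(np.concatenate([rng.uniform(0, 1, 80), rng.uniform(0, 10, 40)]))
    y = np.sin(2 * np.pi * t / 10)
    print("unweighted:", y.mean(), y.std())
    print("weighted  :", weighted_mean_std(t, y))
    ```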

  5. Optimization of Spacecraft Rendezvous and Docking using Interval Analysis

    NARCIS (Netherlands)

    Van Kampen, E.; Chu, Q.P.; Mulder, J.A.

    2010-01-01

    This paper applies interval optimization to the fixed-time multiple impulse rendezvous and docking problem. Current methods for solving this type of optimization problem include for example genetic algorithms and gradient based optimization. Unlike these methods, interval methods can guarantee that

  6. Indirect methods for reference interval determination - review and recommendations.

    Science.gov (United States)

    Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim

    2018-04-19

    Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as are produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to perform analysis of results generated as part of routine pathology testing and using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used and also to support the development of improved statistical techniques for these studies.

  7. Degradation of hydrocarbons in soil samples analyzed within accepted analytical holding times

    International Nuclear Information System (INIS)

    Jackson, J.; Thomey, N.; Dietlein, L.F.

    1992-01-01

    Samples which are collected in conjunction with subsurface investigations at leaking petroleum storage tank sites and petroleum refineries are routinely analyzed for benzene, toluene, ethylbenzene, xylenes (BTEX), and total petroleum hydrocarbons (TPH). Water samples are preserved by the addition of hydrochloric acid and maintained at four degrees centigrade prior to analysis. This is done to prevent bacterial degradation of hydrocarbons. Chemical preservation is not presently performed on soil samples. Instead, the samples are cooled and maintained at four degrees centigrade. This study was done to measure the degree of degradation of hydrocarbons in soil samples which are analyzed within accepted holding times. Soil samples were collected and representative subsamples were prepared from the initial sample. Subsamples were analyzed in triplicate for BTEX and TPH throughout the length of the approved holding times to measure the extent of sample constituent degradation prior to analysis. Findings imply that for sandy soils, BTEX and TPH concentrations can be highly dependent upon the length of time which elapses between sample collection and analysis

  8. Sampling and Control Circuit Board for an Inertial Measurement Unit

    Science.gov (United States)

    Chelmins, David T (Inventor); Powis, Richard T., Jr. (Inventor); Sands, Obed (Inventor)

    2016-01-01

    A circuit board that serves as a control and sampling interface to an inertial measurement unit ("IMU") is provided. The circuit board is also configured to interface with a local oscillator and an external trigger pulse. The circuit board is further configured to receive the external trigger pulse from an external source that time aligns the local oscillator and initiates sampling of the inertial measurement device for data at precise time intervals based on pulses from the local oscillator. The sampled data may be synchronized by the circuit board with other sensors of a navigation system via the trigger pulse.

  9. Optimal time interval between capecitabine intake and radiotherapy in preoperative chemoradiation for locally advanced rectal cancer

    International Nuclear Information System (INIS)

    Yu, Chang Sik; Kim, Tae Won; Kim, Jong Hoon; Choi, Won Sik; Kim, Hee Cheol; Chang, Heung Moon; Ryu, Min Hee; Jang, Se Jin; Ahn, Seung Do; Lee, Sang-wook; Shin, Seong Soo; Choi, Eun Kyung; Kim, Jin Cheon

    2007-01-01

    Purpose: Capecitabine and its metabolites reach peak plasma concentrations 1 to 2 hours after a single oral administration, and concentrations rapidly decrease thereafter. We performed a retrospective analysis to find the optimal time interval between capecitabine administration and radiotherapy for rectal cancer. Methods and Materials: The time interval between capecitabine intake and radiotherapy was measured in patients who were treated with preoperative radiotherapy and concurrent capecitabine for rectal cancer. Patients were classified into the following groups. Group A1 included patients who took capecitabine 1 hour before radiotherapy, and Group B1 included all other patients. Group B1 was then subdivided into Group A2 (patients who took capecitabine 2 hours before radiotherapy) and Group B2. Group B2 was further divided into Group A3 and Group B3 using the same method. Total mesorectal excision was performed 6 weeks after completion of chemoradiation and the pathologic response was evaluated. Results: A total of 200 patients were enrolled in this study. Pathologic examination showed that Group A1 had higher rates of complete regression of primary tumors in the rectum (23.5% vs. 9.6%, p = 0.01), good response (44.7% vs. 25.2%, p = 0.006), and lower T stages (p = 0.021) compared with Group B1; however, Groups A2 and A3 did not show any improvement compared with Groups B2 and B3. Multivariate analysis showed that the increases in complete regression of primary rectal tumors and in good response were significant only when capecitabine was administered 1 hour before radiotherapy. Conclusion: In preoperative chemoradiotherapy for rectal cancer, the pathologic response could be improved by administering capecitabine 1 hour before radiotherapy

  10. A confirmatory holding time study for purgeable VOCs in water samples

    International Nuclear Information System (INIS)

    West, O.R.; Bayne, C.K.; Siegrist, R.L.; Holden, W.H.; Bottrell, D.W.

    1996-01-01

    Analyte stability during pre-analytical storage is essential to the accurate quantification of contaminants in environmental samples. This is particularly true for volatile organic compounds (VOCs), which can easily volatilize and/or degrade during sample storage. Recognizing this, regulatory agencies require that water samples be collected in vials without headspace and stored at 4 degrees C, and that analyses be conducted within 14 days, even if samples are acid-preserved. Since the selection of a 14-day holding time was largely arbitrary, the appropriateness of this requirement must be re-evaluated. The goal of the study described here was to provide regulatory agencies with the necessary data to extend the maximum holding time for properly preserved VOC water samples to 28 days

  11. The incidence and clinical associated factors of interval colorectal cancers in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Cheng-En Tsai

    2018-03-01

    Conclusion: The prevalence of interval CRC in the present study is 3.28%. Comorbidity with ESRD and a shorter ascending colon withdrawal time could be factors associated with interval CRC. Good colon preparation for patients with ESRD and a longer ascending colon withdrawal time could reduce interval CRC.

  12. Using an R Shiny to Enhance the Learning Experience of Confidence Intervals

    Science.gov (United States)

    Williams, Immanuel James; Williams, Kelley Kim

    2018-01-01

    Many students find understanding confidence intervals difficult, especially because of the amalgamation of concepts such as confidence levels, standard error, point estimates and sample sizes. An R Shiny application was created to assist the learning process of confidence intervals using graphics and data from the US National Basketball…

  13. Distortion of time interval reproduction in an epileptic patient with a focal lesion in the right anterior insular/inferior frontal cortices.

    Science.gov (United States)

    Monfort, Vincent; Pfeuty, Micha; Klein, Madelyne; Collé, Steffie; Brissart, Hélène; Jonas, Jacques; Maillard, Louis

    2014-11-01

    This case report on an epileptic patient suffering from a focal lesion at the junction of the right anterior insular cortex (AIC) and the adjacent inferior frontal cortex (IFC) provides the first evidence that damage to this brain region impairs temporal performance in a visual time reproduction task in which participants had to reproduce the presentation duration (3, 5 and 7s) of emotionally-neutral and -negative pictures. Strikingly, as compared to a group of healthy subjects, the AIC/IFC case considerably overestimated reproduction times despite normal variability. The effect was obtained in all duration and emotion conditions. Such a distortion in time reproduction was not observed in four other epileptic patients without insular or inferior frontal damage. Importantly, the absolute extent of temporal over-reproduction increased in proportion to the magnitude of the target durations, which concurs with the scalar property of interval timing, and points to an impairment of time-specific rather than of non temporal (such as motor) mechanisms. Our data suggest that the disability in temporal reproduction of the AIC/IFC case would result from a distorted memory representation of the encoded duration, occurring during the process of storage and/or of recovery from memory and leading to a deviation of the temporal judgment during the reproduction task. These findings support the recent proposal that the anterior insular/inferior frontal cortices would be involved in time interval representation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Study on risk insight for additional ILRT interval extension

    International Nuclear Information System (INIS)

    Seo, M. R.; Hong, S. Y.; Kim, M. K.; Chung, B. S.; Oh, H. C.

    2005-01-01

    In the U.S., the containment Integrated Leakage Rate Test (ILRT) interval was extended from 3 times per 10 years to once per 10 years based on NUREG-1493, 'Performance-Based Containment Leak-Test Program', in 1995. In September 2001, the ILRT interval was extended up to once per 15 years based on the Nuclear Energy Institute (NEI) provisional guidance 'Interim Guidance for Performing Risk Impact Assessments In Support of One-Time Extensions for Containment Integrated Leakage Rate Test Surveillance Intervals'. In Korea, the containment ILRT was performed at a 5-year interval. However, under MOST (Ministry of Science and Technology) Notice 2004-15, 'Standard for the Leak-Rate Test of the Nuclear Reactor Containment', extension of the ILRT interval to once per 10 years can be allowed if certain conditions are met. The safety analysis for extending the Yonggwang Nuclear (YGN) Units 1 and 2 ILRT interval to once per 10 years was therefore completed based on the methodology in NUREG-1493. During the review by the regulatory body, KINS, however, it was required that additional risk insights or indices for the risk analysis be developed, so we began to study the NEI interim report supporting the 15-year ILRT interval extension. As in the previous analysis based on NUREG-1493, the MACCS II (MELCOR Accident Consequence Code System) computer code was used for the population risk analysis, and the population dose was selected as the reference index for the risk evaluation

  15. The significance of sampling time in therapeutic drug monitoring of clozapine

    DEFF Research Database (Denmark)

    Jakobsen, M I; Larsen, J R; Svensson, C K

    2017-01-01

    OBJECTIVE: Therapeutic drug monitoring (TDM) of clozapine is standardized to 12-h postdose samplings. In clinical settings, sampling time often deviates from this time point, although the importance of the deviation is unknown. To this end, serum concentrations (s-) of clozapine and its metabolite...... N-desmethyl-clozapine (norclozapine) were measured at 12 ± 1 and 2 h postdose. METHOD: Forty-six patients with a diagnosis of schizophrenia, and on stable clozapine treatment, were enrolled for hourly, venous blood sampling at 10-14 h postdose. RESULTS: Minor changes in median percentage values were...

  16. Programming with Intervals

    Science.gov (United States)

    Matsakis, Nicholas D.; Gross, Thomas R.

    Intervals are a new, higher-level primitive for parallel programming with which programmers directly construct the program schedule. Programs using intervals can be statically analyzed to ensure that they do not deadlock or contain data races. In this paper, we demonstrate the flexibility of intervals by showing how to use them to emulate common parallel control-flow constructs like barriers and signals, as well as higher-level patterns such as bounded-buffer producer-consumer. We have implemented intervals as a publicly available library for Java and Scala.

  17. The Interval Slope Method for Long-Term Forecasting of Stock Price Trends

    Directory of Open Access Journals (Sweden)

    Chun-xue Nie

    2016-01-01

    Full Text Available A stock price is a typical but complex type of time series data. Effective prediction of long-term time series data can be used to schedule an investment strategy and obtain higher profit. Due to economic, environmental, and other factors, it is very difficult to obtain a precise long-term stock price prediction. The exponentially segmented pattern (ESP) is introduced here and used to predict the fluctuation of different stock data over five future prediction intervals. A new feature of the stock price during each subinterval, named the interval slope, can characterize fluctuations in stock price over specific periods. The cumulative distribution function (CDF) of the MSE was compared to those of MMSE-BC and SVR. We concluded that the interval slope developed here can capture more complex dynamics of stock price trends. The mean stock price can then be predicted over specific time intervals relatively accurately, with multiple mean values over time intervals used to express the time series in the long term. In this way, the prediction of long-term stock prices can be more precise and the accumulation of errors can be prevented.

  18. Measurement of subcritical multiplication by the interval distribution method

    International Nuclear Information System (INIS)

    Nelson, G.W.

    1985-01-01

    The prompt decay constant or the subcritical neutron multiplication may be determined by measuring the distribution of the time intervals between successive neutron counts. The distribution data are analyzed by least-squares fitting to a theoretical distribution function derived from a point reactor probability model. Published results of measurements with one- and two-detector systems are discussed. The nearer the system is to delayed critical, the shorter the data collection times and the smaller the statistical errors. Several of the measurements indicate that a shorter data collection time and higher accuracy are possible with the interval distribution method than with the Feynman variance method
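
    As a toy illustration of the interval-distribution idea, the Python sketch below histograms the intervals between successive counts and least-squares fits an interval-density model. The single-exponential model and synthetic Poisson data are deliberate simplifications; the point-reactor expression fitted in the paper contains an additional correlated term involving the prompt decay constant.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic counts from a purely random (Poisson) source with rate 1000 /s.
    rng = np.random.default_rng(4)
    intervals = rng.exponential(scale=1e-3, size=20000)   # seconds between counts

    # Histogram of interval lengths, normalised to a probability density.
    hist, edges = np.histogram(intervals, bins=100, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])

    def interval_density(t, rate):
        # Pure-Poisson interval density; the real method adds a correlated term
        # containing the prompt decay constant alpha.
        return rate * np.exp(-rate * t)

    popt, _ = curve_fit(interval_density, centers, hist, p0=[500.0])
    print("fitted count rate:", popt[0], "per second")
    ```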

  19. First Passage Time Intervals of Gaussian Processes

    Science.gov (United States)

    Perez, Hector; Kawabata, Tsutomu; Mimaki, Tadashi

    1987-08-01

    The first passage time problem of a stationary Gaussian process is theoretically and experimentally studied. Renewal functions are derived for a time-dependent boundary and numerically calculated for a Gaussian process having a seventh-order Butterworth spectrum. The results show a multipeak property not only for the constant boundary but also for a linearly increasing boundary. The first passage time distribution densities were experimentally determined for a constant boundary. The renewal functions were shown to be a fairly good approximation to the distribution density over a limited range.

  20. Effect of palady and cup feeding on premature neonates' weight gain and reaching full oral feeding time interval

    Directory of Open Access Journals (Sweden)

    Maryam Marofi

    2016-01-01

    Full Text Available Background: Feeding of premature neonates is of great importance due to its effective role in their growth. These neonates should reach an independent oral nutrition stage before being discharged from the Neonatal Intensive Care Unit. Therefore, the researcher decided to conduct a study on the effect of palady and cup feeding on premature neonates' weight gain and the time interval to reach full oral feeding. Materials and Methods: This is a clinical trial with a quantitative design conducted on 69 premature infants (gestational age between 29 and 32 weeks) who were assigned to cup (n = 34) and palady (n = 35) feeding groups through random allocation. The first feeding was administered either by the cup or the palady method in each shift within seven sequential days (a total of 21 cup and palady feedings). The rest of the feedings were administered by gavage. Results: Mean hospitalization time (cup = 39.01 and palady = 30.4; P < 0.001) and mean time interval to reach full oral feeding (cup = 33.7 and palady = 24.1; P < 0.001) were significantly lower in the palady group compared to the cup group. Mean weight changes of neonates 7 weeks after the intervention, compared to those at the beginning of the intervention, were significantly greater in the palady group than in the cup group (cup = 146.7 and palady = 198.8; P < 0.001). Conclusions: The neonates in the palady group reached full oral feeding earlier than those of the cup group. Subjects' weight gain was also higher in the palady group compared to the cup group. Premature neonates with over 30 weeks of gestational age and physiological stability can be fed by palady.

  1. Sampling methods for rumen microbial counts by Real-Time PCR techniques

    Directory of Open Access Journals (Sweden)

    S. Puppo

    2010-02-01

    Full Text Available Fresh rumen samples were withdrawn from 4 cannulated buffalo females fed a fibrous diet in order to quantify bacterial concentrations in the rumen by Real-Time PCR techniques. To obtain DNA of good quality from whole rumen fluid, eight different pre-filtration methods (M1-M8; cheese cloths, glass-fibre and nylon filters in combination with centrifugation speeds of 1000, 5000 and 14,000 rpm) were tested. Genomic DNA extraction was performed either on fresh or frozen samples (-20°C). The quantitative bacterial analysis was carried out according to a Real-Time PCR procedure for Butyrivibrio fibrisolvens reported in the literature. M5 proved to be the best sampling procedure, allowing suitable genomic DNA to be obtained. No differences were revealed between fresh and frozen samples.

  2. Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    Roberto S. Flowers-Cano

    2018-02-01

    Full Text Available Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared the coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA) and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distribution function was fit to the random samples that were generated. In other cases, a distribution function different from the mother distribution was fit to the samples. When the fitted distribution had three parameters and was the same as the mother distribution, the intervals constructed with the four techniques had acceptable coverage. However, the bootstrap techniques failed in several of the cases in which the fitted distribution had two parameters.
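
    For concreteness, here is a small Python sketch of the simplest of the four techniques compared, the percentile bootstrap (BP); the Gumbel sample and the 0.9 quantile used as the statistic are arbitrary illustrative choices, not the scenarios of the study.

    ```python
    import numpy as np

    def percentile_bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
        """Percentile-bootstrap (BP) confidence interval for an arbitrary statistic."""
        rng = np.random.default_rng(seed)
        data = np.asarray(data)
        boot = np.array([stat(rng.choice(data, size=len(data), replace=True))
                         for _ in range(n_boot)])
        return np.quantile(boot, [alpha / 2, 1 - alpha / 2])

    # Example: 95% interval for the 0.9 quantile of a Gumbel sample (a common
    # model for hydrological extremes; parameters here are arbitrary).
    rng = np.random.default_rng(5)
    sample = rng.gumbel(loc=100.0, scale=25.0, size=60)
    print(percentile_bootstrap_ci(sample, lambda x: np.quantile(x, 0.9)))
    ```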

  3. Sampling and chemical analysis of groundwaters from the exploratory boreholes

    International Nuclear Information System (INIS)

    Wittwer, C.

    1986-10-01

    As a part of the Nagra geological investigation programme in northern Switzerland, numerous water samples were taken in the Boettstein, Weiach, Riniken, Schafisheim, Kaisten and Leuggern boreholes to obtain information on the chemistry and residence times of deep groundwaters. This report contains a compilation of hydrochemical data, comments on the individual water sampling actions and an evaluation of sample quality with respect to admixing of drilling fluids. The samples were taken from separate test intervals in the sediments and the crystalline rock. After removal of various types of drilling fluids such as mud as well as fresh water or deionised water during a cleaning phase, the samples were taken at the surface or at depth using pressure vessels. The tracers added to the drilling fluids (uranine, m-TFMBA) as well as the tritium content were used for a quantitative estimation of the content of drilling fluid in the samples (contamination). With a view to further geochemical modelling, the samples were assessed with reference to the effect of contamination on the results of the chemical analyses. A total of 68 water samples were taken from 53 different intervals: - 27 samples had problem-free cleaning phases and were taken with negligible contamination. - 23 samples were taken under difficult conditions. Problems with hydraulic communication around packers, uncertain origin, inaccuracy as to the extent of contamination, presence of cement, possible traces of salt from drilling fluid etc. meant that the analyses could only be used with extreme caution or after additional data processing. - The analysis results from 18 samples are disregarded due to significant drilling fluid content or because more reliable data are available for the same test interval. (author)

  4. An Intervention to Reduce the Time Interval Between Hospital Entry and Emergency Coronary Angiography in Patients with ST-Elevation Myocardial Infarction.

    Science.gov (United States)

    Karkabi, Basheer; Jaffe, Ronen; Halon, David A; Merdler, Amnon; Khader, Nader; Rubinshtein, Ronen; Goldstein, Jacob; Zafrir, Barak; Zissman, Keren; Ben-Dov, Nissan; Gabrielly, Michael; Fuks, Alex; Shiran, Avinoam; Adawi, Salim; Hellman, Yaron; Shahla, Johny; Halabi, Salim; Flugelman, Moshe Y; Cohen, Shai; Bergman, Irina; Kassem, Sameer; Shapira, Chen

    2017-09-01

    Outcomes of patients with acute ST-elevation myocardial infarction (STEMI) are strongly correlated to the time interval from hospital entry to primary percutaneous coronary intervention (PPCI). Current guidelines recommend a door-to-balloon time of < 90 minutes. The aim was to reduce the time from hospital admission to PPCI and to increase the proportion of patients treated within 90 minutes. In March 2013 the authors launched a seven-component intervention program: (1) direct patient evacuation by out-of-hospital emergency medical services to the coronary intensive care unit or catheterization laboratory; (2) an education program for the emergency department staff; (3) dissemination of information regarding the urgency of the PPCI decision; (4) activation of the catheterization team by a single phone call; (5) reimbursement of transportation costs to on-call staff who use their own cars; (6) improvement in the quality of medical records; and (7) investigation of failed cases and feedback. During the 14 months prior to the intervention, initiation of catheterization occurred within 90 minutes of hospital arrival in 88/133 patients (65%); during the 18 months following the start of the intervention, the rate was 181/200 (90%) (P < 0.01). The respective mean/median times to treatment were 126/67 minutes and 52/47 minutes (P < 0.01). The intervention also resulted in shortening of the time interval from hospital entry to PPCI on nights and weekends. Following implementation of a comprehensive intervention, the time from hospital admission to PPCI of STEMI patients shortened significantly, as did the proportion of patients treated within 90 minutes of hospital arrival.

  5. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier Transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The obtained results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time
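
    The core operation, a Fourier sum evaluated directly on randomly placed evolution-time points for pairs of frequencies, can be sketched in a few lines of Python; the synthetic one-peak signal, sampling ranges and normalisation below are assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Randomly sampled (t1, t2) evolution times instead of a regular grid.
    n = 400
    t1 = rng.uniform(0.0, 50e-3, n)          # s
    t2 = rng.uniform(0.0, 50e-3, n)

    # Synthetic signal with a single peak at (f1, f2) = (120, 80) Hz.
    f1_true, f2_true, decay = 120.0, 80.0, 30.0
    s = np.exp(2j * np.pi * (f1_true * t1 + f2_true * t2) - decay * (t1 + t2))

    # Direct 2D Fourier sum evaluated for pairs of frequencies.
    f1_axis = np.linspace(0.0, 200.0, 101)
    f2_axis = np.linspace(0.0, 200.0, 101)
    phase1 = np.exp(-2j * np.pi * np.outer(f1_axis, t1))   # (101, n)
    phase2 = np.exp(-2j * np.pi * np.outer(f2_axis, t2))   # (101, n)
    spectrum = (phase1 * s) @ phase2.T                      # (101, 101)

    i, j = np.unravel_index(np.argmax(np.abs(spectrum)), spectrum.shape)
    print("peak found at:", f1_axis[i], "Hz,", f2_axis[j], "Hz")
    ```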

  6. Population-Based Pediatric Reference Intervals in General Clinical Chemistry: A Swedish Survey.

    Science.gov (United States)

    Ridefelt, Peter

    2015-01-01

    Very few high quality studies on pediatric reference intervals for general clinical chemistry and hematology analytes have been performed. Three recent prospective community-based projects utilising blood samples from healthy children in Sweden, Denmark and Canada have substantially improved the situation. The Swedish survey included 701 healthy children. Reference intervals for general clinical chemistry and hematology were defined.

  7. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions. II. A simplified implementation.

    Science.gov (United States)

    Tao, Guohua; Miller, William H

    2012-09-28

    An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points as well as their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor-which is computationally expensive, especially for large systems-is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H(2) system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.

  8. Interpregnancy intervals: impact of postpartum contraceptive effectiveness and coverage.

    Science.gov (United States)

    Thiel de Bocanegra, Heike; Chang, Richard; Howell, Mike; Darney, Philip

    2014-04-01

    The purpose of this study was to determine the use of contraceptive methods, which was defined by effectiveness, length of coverage, and their association with short interpregnancy intervals, when controlling for provider type and client demographics. We identified a cohort of 117,644 women from the 2008 California Birth Statistical Master file with second or higher order birth and at least 1 Medicaid (Family Planning, Access, Care, and Treatment [Family PACT] program or Medi-Cal) claim within 18 months after index birth. We explored the effect of contraceptive method provision on the odds of having an optimal interpregnancy interval and controlled for covariates. The average length of contraceptive coverage was 3.81 months (SD = 4.84). Most women received user-dependent hormonal contraceptives as their most effective contraceptive method (55%; n = 65,103 women) and one-third (33%; n = 39,090 women) had no contraceptive claim. Women who used long-acting reversible contraceptive methods had 3.89 times the odds and women who used user-dependent hormonal methods had 1.89 times the odds of achieving an optimal birth interval compared with women who used barrier methods only; women with no method had 0.66 times the odds. When user-dependent methods are considered, the odds of having an optimal birth interval increased for each additional month of contraceptive coverage by 8% (odds ratio, 1.08; 95% confidence interval, 1.08-1.09). Women who were seen by Family PACT or by both Family PACT and Medi-Cal providers had significantly higher odds of optimal birth intervals compared with women who were served by Medi-Cal only. To achieve optimal birth spacing and ultimately to improve birth outcomes, attention should be given to contraceptive counseling and access to contraceptive methods in the postpartum period. Copyright © 2014 Mosby, Inc. All rights reserved.

  9. Prognostic Value of Cardiac Time Intervals by Tissue Doppler Imaging M-Mode in Patients With Acute ST-Segment-Elevation Myocardial Infarction Treated With Primary Percutaneous Coronary Intervention

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Mogelvang, Rasmus; Søgaard, Peter

    2013-01-01

    Background- Color tissue Doppler imaging M-mode through the mitral leaflet is an easy and precise method to estimate all cardiac time intervals from 1 cardiac cycle and thereby obtain the myocardial performance index (MPI). However, the prognostic value of the cardiac time intervals and the MPI...... assessed by color tissue Doppler imaging M-mode through the mitral leaflet in patients with ST-segment-elevation myocardial infarction (MI) is unknown. Methods and Results- In total, 391 patients were admitted with an ST-segment-elevation MI, treated with primary percutaneous coronary intervention...

  10. Chaos on the interval

    CERN Document Server

    Ruette, Sylvie

    2017-01-01

    The aim of this book is to survey the relations between the various kinds of chaos and related notions for continuous interval maps from a topological point of view. The papers on this topic are numerous and widely scattered in the literature; some of them are little known, difficult to find, or originally published in Russian, Ukrainian, or Chinese. Dynamical systems given by the iteration of a continuous map on an interval have been broadly studied because they are simple but nevertheless exhibit complex behaviors. They also allow numerical simulations, which enabled the discovery of some chaotic phenomena. Moreover, the "most interesting" part of some higher-dimensional systems can be of lower dimension, which allows, in some cases, boiling it down to systems in dimension one. Some of the more recent developments such as distributional chaos, the relation between entropy and Li-Yorke chaos, sequence entropy, and maps with infinitely many branches are presented in book form for the first time. The author gi...

  11. On sampling social networking services

    OpenAIRE

    Wang, Baiyang

    2012-01-01

    This article aims at summarizing the existing methods for sampling social networking services and proposing a faster confidence interval for related sampling methods. It also includes comparisons of common network sampling techniques.

  12. A Time Interval of More Than 18 Months Between a Pregnancy and a Roux-en-Y Gastric Bypass Increases the Risk of Iron Deficiency and Anaemia in Pregnancy.

    Science.gov (United States)

    Crusell, Mie; Nilas, Lisbeth; Svare, Jens; Lauenborg, Jeannet

    2016-10-01

    The aim of the study is to explore the impact of the time between Roux-en-Y gastric bypass (RYGB) and pregnancy on obstetrical outcome and nutritional derangements. In a retrospective cross-sectional study of pregnant women admitted for antenatal care at two tertiary hospitals, we examined 153 women with RYGB and a singleton pregnancy of at least 24 weeks. The women were stratified according to the surgery-to-pregnancy interval (shorter or longer than 18 months); outcome measures included gestational hypertension, length of pregnancy, mode of delivery and foetal birth weight. The two groups were comparable regarding age, parity and prepregnancy body mass index. The frequency of iron deficiency anaemia was higher in women with the longer surgery-to-pregnancy interval, whereas there was no difference in pregnancy outcome or birth weight between the two groups. A long surgery-to-pregnancy time interval after a RYGB increases the risk of iron deficiency anaemia but not of other nutritional deficits. The time interval does not seem to have an adverse effect on the obstetrical outcome, including intrauterine growth restriction. Specific attention to iron deficits is needed with increasing surgery-to-pregnancy time interval.

  13. CLSI-based transference of CALIPER pediatric reference intervals to Beckman Coulter AU biochemical assays.

    Science.gov (United States)

    Abou El Hassan, Mohamed; Stoianov, Alexandra; Araújo, Petra A T; Sadeghieh, Tara; Chan, Man Khun; Chen, Yunqi; Randell, Edward; Nieuwesteeg, Michelle; Adeli, Khosrow

    2015-11-01

    The CALIPER program has established a comprehensive database of pediatric reference intervals using largely the Abbott ARCHITECT biochemical assays. To expand clinical application of CALIPER reference standards, the present study is aimed at transferring CALIPER reference intervals from the Abbott ARCHITECT to Beckman Coulter AU assays. Transference of CALIPER reference intervals was performed based on the CLSI guidelines C28-A3 and EP9-A2. The new reference intervals were directly verified using up to 100 reference samples from the healthy CALIPER cohort. We found a strong correlation between Abbott ARCHITECT and Beckman Coulter AU biochemical assays, allowing the transference of the vast majority (94%; 30 out of 32 assays) of CALIPER reference intervals previously established using Abbott assays. Transferred reference intervals were, in general, similar to previously published CALIPER reference intervals, with some exceptions. Most of the transferred reference intervals were sex-specific and were verified using healthy reference samples from the CALIPER biobank based on CLSI criteria. It is important to note that the comparisons performed between the Abbott and Beckman Coulter assays make no assumptions as to assay accuracy or which system is more correct/accurate. The majority of CALIPER reference intervals were transferrable to Beckman Coulter AU assays, allowing the establishment of a new database of pediatric reference intervals. This further expands the utility of the CALIPER database to clinical laboratories using the AU assays; however, each laboratory should validate these intervals for their analytical platform and local population as recommended by the CLSI. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  14. A novel recursive Fourier transform for nonuniform sampled signals: application to heart rate variability spectrum estimation.

    Science.gov (United States)

    Holland, Alexander; Aboy, Mateo

    2009-07-01

    We present a novel method to iteratively calculate discrete Fourier transforms for discrete time signals with sample time intervals that may be widely nonuniform. The proposed recursive Fourier transform (RFT) does not require interpolation of the samples to uniform time intervals, and each iterative transform update of N frequencies has computational order N. Because of the inherent non-uniformity in the time between successive heart beats, an application particularly well suited for this transform is power spectral density (PSD) estimation for heart rate variability. We compare RFT-based spectrum estimation with Lomb-Scargle Transform (LST) based estimation. PSD estimation based on the LST also does not require uniform time samples, but the LST has a computational order greater than N log(N). We conducted an assessment study involving the analysis of quasi-stationary signals with various levels of randomly missing heart beats. Our results indicate that the RFT leads to comparable estimation performance to the LST with significantly less computational overhead and complexity for applications requiring iterative spectrum estimations.
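
    In the spirit of the RFT described above, the Python sketch below keeps N Fourier coefficients and updates all of them with O(N) work per new, nonuniformly timed sample; the class name, frequency grid, normalisation and synthetic RR series are illustrative assumptions rather than the paper's algorithm.

    ```python
    import numpy as np

    class RunningFourier:
        """N Fourier coefficients updated one nonuniformly spaced sample at a time."""

        def __init__(self, freqs_hz):
            self.freqs = np.asarray(freqs_hz, dtype=float)
            self.coeffs = np.zeros(len(self.freqs), dtype=complex)
            self.n = 0

        def update(self, t, x):
            # O(N) work per new sample (t in seconds, x the sample value).
            self.coeffs += x * np.exp(-2j * np.pi * self.freqs * t)
            self.n += 1

        def power(self):
            # Crude periodogram-style power estimate; normalisation is illustrative.
            return np.abs(self.coeffs / max(self.n, 1)) ** 2

    # RR-interval-like series observed at irregular beat times with a 0.1 Hz modulation.
    rng = np.random.default_rng(7)
    beat_times = np.cumsum(rng.uniform(0.7, 0.9, 300))           # seconds
    rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * beat_times)

    rft = RunningFourier(np.linspace(0.01, 0.5, 50))
    for t, x in zip(beat_times, rr - rr.mean()):
        rft.update(t, x)
    print("peak frequency:", rft.freqs[np.argmax(rft.power())], "Hz")
    ```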

  15. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
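
    A minimal Python sketch of the moment-bounding step is shown below: bounds on the mean of interval data follow directly from the endpoints, while the variance bounds here are obtained with a generic box-constrained optimiser. The small data set is made up, and a local optimiser only approximates the bounds that the paper's dedicated algorithms are designed to compute.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Each row is an interval observation [lower, upper] (made-up data).
    intervals = np.array([[2.0, 3.5], [1.0, 4.0], [2.5, 2.8], [0.5, 3.0]])
    lo, hi = intervals[:, 0], intervals[:, 1]

    # Bounds on the mean are attained at the interval endpoints.
    mean_bounds = (lo.mean(), hi.mean())

    def sample_var(x):
        return np.var(x, ddof=1)

    box = list(zip(lo, hi))
    x0 = (lo + hi) / 2
    var_min = minimize(sample_var, x0, bounds=box).fun
    var_max = -minimize(lambda x: -sample_var(x), x0, bounds=box).fun

    print("mean bounds:", mean_bounds)
    print("variance bounds (approximate, local optima):", (var_min, var_max))
    ```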

  16. Synchronization of Switched Interval Networks and Applications to Chaotic Neural Networks

    OpenAIRE

    Cao, Jinde; Alofi, Abdulaziz; Al-Mazrooei, Abdullah; Elaiw, Ahmed

    2013-01-01

    This paper investigates the synchronization problem of switched delay networks with interval parameter uncertainty. Based on the theory of switched systems and the drive-response technique, a mathematical model of the switched interval drive-response error system is established. Without constructing Lyapunov-Krasovskii functionals, the matrix measure method is introduced for the first time to switched time-varying delay networks and combined with the Halanay inequality technique; synchroniza...

  17. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
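
    A compact Python sketch of a Gaussian-kernel cross-correlation estimator for irregularly sampled series is given below; the bandwidth, normalisation and synthetic data follow common practice and are not taken from the paper.

    ```python
    import numpy as np

    def kernel_cross_correlation(tx, x, ty, y, lag, bandwidth):
        """Gaussian-kernel cross-correlation of two irregular series at one lag:
        pairs (i, j) are weighted by how close ty[j] - tx[i] is to the lag."""
        xs = (x - x.mean()) / x.std()
        ys = (y - y.mean()) / y.std()
        dt = ty[None, :] - tx[:, None] - lag
        w = np.exp(-0.5 * (dt / bandwidth) ** 2)
        return np.sum(w * np.outer(xs, ys)) / np.sum(w)

    # Two irregularly sampled versions of the same signal, the second delayed by 2.0.
    rng = np.random.default_rng(8)
    tx = np.sort(rng.uniform(0, 100, 300))
    ty = np.sort(rng.uniform(0, 100, 300))
    signal = lambda t: np.sin(2 * np.pi * t / 20)
    x, y = signal(tx), signal(ty - 2.0)

    lags = np.arange(-5, 6)
    ccf = [kernel_cross_correlation(tx, x, ty, y, l, bandwidth=1.0) for l in lags]
    print("lag with maximum correlation:", lags[int(np.argmax(ccf))])
    ```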

  18. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    Science.gov (United States)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.

  19. An Interval Bound Algorithm of optimizing reactor core loading pattern by using reactivity interval schema

    International Nuclear Information System (INIS)

    Gong Zhaohu; Wang Kan; Yao Dong

    2011-01-01

    Highlights: → We present a new Loading Pattern Optimization method - Interval Bound Algorithm (IBA). → IBA directly uses the reactivity of fuel assemblies and burnable poison. → IBA can optimize fuel assembly orientation in a coupled way. → Numerical experiment shows that IBA outperforms genetic algorithm and engineers. → We devise DDWF technique to deal with multiple objectives and constraints. - Abstract: In order to optimize the core loading pattern in Nuclear Power Plants, the paper presents a new optimization method - Interval Bound Algorithm (IBA). Similar to the typical population based algorithms, e.g. genetic algorithm, IBA maintains a population of solutions and evolves them during the optimization process. IBA acquires the solution by statistical learning and sampling the control variable intervals of the population in each iteration. The control variables are the transforms of the reactivity of fuel assemblies or the worth of burnable poisons, which are the crucial heuristic information for loading pattern optimization problems. IBA can deal with the relationship between the dependent variables by defining the control variables. Based on the IBA algorithm, a parallel Loading Pattern Optimization code, named IBALPO, has been developed. To deal with multiple objectives and constraints, the Dynamic Discontinuous Weight Factors (DDWF) for the fitness function have been used in IBALPO. Finally, the code system has been used to solve a realistic reloading problem and a better pattern has been obtained compared with the ones searched by engineers and genetic algorithm, thus the performance of the code is proved.

  20. Transmission line sag calculations using interval mathematics

    Energy Technology Data Exchange (ETDEWEB)

    Shaalan, H. [Institute of Electrical and Electronics Engineers, Washington, DC (United States)]|[US Merchant Marine Academy, Kings Point, NY (United States)

    2007-07-01

    Electric utilities are facing the need for additional generating capacity, new transmission systems and more efficient use of existing resources. As such, there are several uncertainties associated with utility decisions. These uncertainties include future load growth, construction times and costs, and performance of new resources. Regulatory and economic environments also present uncertainties. Uncertainty can be modeled based on a probabilistic approach where probability distributions for all of the uncertainties are assumed. Another approach to modeling uncertainty is referred to as unknown but bounded. In this approach, the upper and lower bounds on the uncertainties are assumed without probability distributions. Interval mathematics is a tool for the practical use and extension of the unknown but bounded concept. In this study, the calculation of transmission line sag was used as an example to demonstrate the use of interval mathematics. The objective was to determine the change in cable length, based on a fixed span and an interval of cable sag values for a range of temperatures. The resulting change in cable length was an interval corresponding to the interval of cable sag values. It was shown that there is a small change in conductor length due to variation in sag based on the temperature ranges used in this study. 8 refs.
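
    The unknown-but-bounded sag interval maps directly onto an interval for conductor length; the short Python sketch below uses the standard parabolic approximation L = S + 8D^2/(3S) for a level span, with a span and sag interval that are illustrative rather than the values of the study.

    ```python
    # Interval evaluation of conductor length from an interval of sag values.
    # Parabolic approximation for a level span: L = S + 8*D**2 / (3*S).

    def length_interval(span_m, sag_lo_m, sag_hi_m):
        # L is monotonically increasing in the sag D, so the interval endpoints
        # of D map directly onto the interval endpoints of L.
        length = lambda d: span_m + 8.0 * d ** 2 / (3.0 * span_m)
        return length(sag_lo_m), length(sag_hi_m)

    span = 300.0                                   # m, fixed span (illustrative)
    L_lo, L_hi = length_interval(span, sag_lo_m=7.5, sag_hi_m=9.0)
    print("cable length interval: [%.3f, %.3f] m" % (L_lo, L_hi))
    print("change in length across the sag interval: %.3f m" % (L_hi - L_lo))
    ```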

  1. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    OpenAIRE

    Yan, Ying; Suo, Bin

    2017-01-01

    Due to the complexity of the system and a lack of expertise, epistemic uncertainties may be present in the experts’ judgment on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the idea of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. Similarity matrix b...

  2. Method for Hot Real-Time Sampling of Pyrolysis Vapors

    Energy Technology Data Exchange (ETDEWEB)

    Pomeroy, Marc D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-29

    Biomass pyrolysis has been a growing topic of research, in particular as a route to replacing crude oil. This process utilizes moderate temperatures to thermally deconstruct the biomass, and the resulting vapors are condensed into a mixture of liquid oxygenates to be used as fuel precursors. Pyrolysis oils contain more than 400 compounds, up to 60 percent of which do not re-volatilize for subsequent chemical analysis. The vapor chemical composition is further complicated by additional condensation reactions that occur during the condensation and collection of the product. Due to the complexity of the pyrolysis oil, and a desire to catalytically upgrade the vapor composition before condensation, online real-time analytical techniques such as Molecular Beam Mass Spectrometry (MBMS) are of great use. However, in order to properly sample hot pyrolysis vapors, many challenges must be overcome. Sampling must occur within a narrow range of temperatures to reduce product composition changes from overheating, or partial condensation and plugging of lines from condensed products. Residence times must be kept at a minimum to reduce further reaction chemistries. Pyrolysis vapors also form aerosols that are carried far downstream and can pass through filters, resulting in build-up in downstream locations. The co-produced bio-char and ash from the pyrolysis process can lead to plugging of the sample lines and must be filtered out at temperature, even with the use of cyclonic separators. Practical considerations for sampling system design, as well as lessons learned, are integrated into the hot analytical sampling system of the National Renewable Energy Laboratory's (NREL) Thermochemical Process Development Unit (TCPDU) to provide industrially relevant demonstrations of thermochemical transformations of biomass feedstocks at the pilot scale.

  3. Chosen interval methods for solving linear interval systems with special type of matrix

    Science.gov (United States)

    Szyszka, Barbara

    2013-10-01

    The paper is devoted to chosen direct interval methods for solving linear interval systems with a special type of matrix. This kind of matrix, a band matrix with a parameter, is obtained from a finite difference problem. Such linear systems occur while solving the one-dimensional wave equation (a partial differential equation of hyperbolic type) using the second-order central difference interval method. Interval methods are constructed so that the errors of the method are enclosed in the obtained results; the presented linear interval systems therefore contain elements that determine the errors of the difference method. The chosen direct algorithms have been applied for solving the linear systems because they introduce no method error. All calculations were performed in floating-point interval arithmetic.

  4. Identification of clinical biomarkers for pre-analytical quality control of blood samples.

    Science.gov (United States)

    Kang, Hyun Ju; Jeon, Soon Young; Park, Jae-Sun; Yun, Ji Young; Kil, Han Na; Hong, Won Kyung; Lee, Mee-Hee; Kim, Jun-Woo; Jeon, Jae-Pil; Han, Bok Ghee

    2013-04-01

    Pre-analytical conditions are key factors in maintaining the high quality of biospecimens. They are necessary for accurate reproducibility of experiments in the field of biomarker discovery as well as achieving optimal specificity of laboratory tests for clinical diagnosis. In research at the National Biobank of Korea, we evaluated the impact of pre-analytical conditions on the stability of biobanked blood samples by measuring biochemical analytes commonly used in clinical laboratory tests. We measured 10 routine laboratory analytes in serum and plasma samples from healthy donors (n = 50) with a chemistry autoanalyzer (Hitachi 7600-110). The analyte measurements were made at different time courses based on delay of blood fractionation, freezing delay of fractionated serum and plasma samples, and at different cycles (0, 1, 3, 6, 9) of freeze-thawing. Statistically significant changes from the reference sample mean were determined using the repeated-measures ANOVA and the significant change limit (SCL). The serum levels of GGT and LDH were changed significantly depending on both the time interval between blood collection and fractionation and the time interval between fractionation and freezing of serum and plasma samples. The glucose level was most sensitive only to the elapsed time between blood collection and centrifugation for blood fractionation. Based on these findings, a simple formula (glucose decrease by 1.387 mg/dL per hour) was derived to estimate the length of time delay after blood collection. In addition, AST, BUN, GGT, and LDH showed sensitive responses to repeated freeze-thaw cycles of serum and plasma samples. These results suggest that GGT and LDH measurements can be used as quality control markers for certain pre-analytical conditions (eg, delayed processing or repeated freeze-thawing) of blood samples which are either directly used in the laboratory tests or stored for future research in the biobank.
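
    If the reported glucose loss rate is taken at face value, the corresponding delay estimate is a one-line calculation; the helper below only illustrates that arithmetic, and the reference glucose value in the example is an assumption, not a figure from the study.

      GLUCOSE_LOSS_PER_HOUR = 1.387  # mg/dL per hour, as reported in the abstract

      def estimated_delay_hours(expected_glucose_mg_dl, measured_glucose_mg_dl):
          """Rough estimate of the hours elapsed between blood collection and centrifugation."""
          drop = expected_glucose_mg_dl - measured_glucose_mg_dl
          return max(drop, 0.0) / GLUCOSE_LOSS_PER_HOUR

      # Example: a sample expected at ~90 mg/dL but measuring 85 mg/dL suggests roughly 3.6 h of delay.
      print(round(estimated_delay_hours(90.0, 85.0), 1))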

  5. Construction of prediction intervals for Palmer Drought Severity Index using bootstrap

    Science.gov (United States)

    Beyaztas, Ufuk; Bickici Arikan, Bugrayhan; Beyaztas, Beste Hamiye; Kahya, Ercan

    2018-04-01

    In this study, we propose an approach based on the residual-based bootstrap method to obtain valid prediction intervals using monthly, short-term (three-months) and mid-term (six-months) drought observations. The effects of North Atlantic and Arctic Oscillation indexes on the constructed prediction intervals are also examined. Performance of the proposed approach is evaluated for the Palmer Drought Severity Index (PDSI) obtained from Konya closed basin located in Central Anatolia, Turkey. The finite sample properties of the proposed method are further illustrated by an extensive simulation study. Our results revealed that the proposed approach is capable of producing valid prediction intervals for future PDSI values.
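
    The abstract does not spell out the model, so the sketch below only illustrates the general recipe of a residual-based bootstrap prediction interval, here for an AR(1) fit to a drought-index-like series; the lag order, horizon, coverage level and synthetic data are assumptions, not the authors' specification.

      import numpy as np

      def ar1_bootstrap_pi(series, horizon=3, n_boot=2000, alpha=0.05, seed=0):
          """Residual-bootstrap prediction intervals for an AR(1) model fitted by least squares."""
          rng = np.random.default_rng(seed)
          y = np.asarray(series, dtype=float)
          x, z = y[:-1], y[1:]
          X = np.column_stack([np.ones_like(x), x])
          (c, phi), *_ = np.linalg.lstsq(X, z, rcond=None)
          resid = z - (c + phi * x)
          resid = resid - resid.mean()                     # centre the residuals
          sims = np.empty((n_boot, horizon))
          for b in range(n_boot):
              last = y[-1]
              for h in range(horizon):                     # resample residuals to build future paths
                  last = c + phi * last + rng.choice(resid)
                  sims[b, h] = last
          lower = np.percentile(sims, 100 * alpha / 2, axis=0)
          upper = np.percentile(sims, 100 * (1 - alpha / 2), axis=0)
          return lower, upper

      # Synthetic "PDSI-like" monthly series, 20 years long.
      rng = np.random.default_rng(1)
      pdsi = 0.1 * np.cumsum(rng.normal(0.0, 0.3, 240))
      print(ar1_bootstrap_pi(pdsi))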

  6. Rational Arithmetic Mathematica Functions to Evaluate the Two-Sided One Sample K-S Cumulative Sampling Distribution

    Directory of Open Access Journals (Sweden)

    J. Randall Brown

    2007-06-01

    Full Text Available One of the most widely used goodness-of-fit tests is the two-sided one sample Kolmogorov-Smirnov (K-S) test, which has been implemented by many computer statistical software packages. To calculate a two-sided p value (that is, to evaluate the cumulative sampling distribution), these packages use various methods including recursion formulae, limiting distributions, and approximations of unknown accuracy developed over thirty years ago. Based on an extensive literature search for the two-sided one sample K-S test, this paper identifies an exact formula for sample sizes up to 31, six recursion formulae, and one matrix formula that can be used to calculate a p value. To ensure accurate calculation by avoiding catastrophic cancellation and eliminating rounding error, each of these formulae is implemented in rational arithmetic. For the six recursion formulae and the matrix formula, computational experience for sample sizes up to 500 shows that computational times are increasing functions of both the sample size and the number of digits in the numerator and denominator integers of the rational number test statistic. The computational times of the seven formulae vary immensely, but the Durbin recursion formula is almost always the fastest. A linear search is used to calculate the inverse of the cumulative sampling distribution (that is, to find the confidence interval half-width), and tables of calculated half-widths are presented for sample sizes up to 500. Using calculated half-widths as input, computational times for the fastest formula, the Durbin recursion formula, are given for sample sizes up to two thousand.
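
    As a hedged illustration of the rational-arithmetic idea, the sketch below evaluates P(D_n < d) exactly with Python fractions using one published matrix form of Durbin's approach (the Marsaglia-Tsang-Wang formulation); it is a plain restatement for small n, not necessarily one of the seven formulae benchmarked in the paper and without their performance refinements.

      from fractions import Fraction
      from math import factorial

      def ks_cdf_exact(n, d):
          """Exact P(D_n < d) for the two-sided one-sample K-S statistic in rational arithmetic.

          Pass d as a Fraction or a decimal string such as "0.274" so that no binary
          floating-point rounding enters the calculation."""
          d = Fraction(d)
          t = n * d
          k = int(t) + 1                      # k = floor(n*d) + 1, h = k - n*d
          h = k - t
          m = 2 * k - 1
          H = [[Fraction(1 if i - j + 1 >= 0 else 0) for j in range(m)] for i in range(m)]
          for i in range(m):
              H[i][0] -= h ** (i + 1)
              H[m - 1][i] -= h ** (m - i)
          if 2 * h > 1:
              H[m - 1][0] += (2 * h - 1) ** m
          for i in range(m):
              for j in range(m):
                  if i - j + 1 > 0:
                      H[i][j] /= factorial(i - j + 1)
          P = H                               # H**n by repeated multiplication (fine for small n)
          for _ in range(n - 1):
              P = [[sum(P[i][l] * H[l][j] for l in range(m)) for j in range(m)] for i in range(m)]
          return P[k - 1][k - 1] * Fraction(factorial(n), n ** n)

      # For n = 1 the statistic satisfies P(D_1 < d) = 2d - 1, so this prints 2/5.
      print(ks_cdf_exact(1, Fraction(7, 10)))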

  7. Time interval measurement between two emissions: Kr + Au; Mesure de l'intervalle de temps entre deux emissions: Kr + Au

    Energy Technology Data Exchange (ETDEWEB)

    Aboufirassi, M; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lefebvres, F.; Lopez, O.; Louvel, M.; Mahi, M.; Steckmeyer, J.C.; Tamain, B. [Lab. de Physique Corpusculaire, Caen Univ., 14 (France); LPC (Caen) - CRN (Strasbourg) - GANIL Collaboration

    1998-04-01

    To illustrate the method allowing the determination of the emission intervals, the results obtained with the Kr + Au system at 43 and 60 A.MeV are presented. The experiments were performed with the NAUTILUS exclusive detectors. Central collisions were selected by means of a relative velocity criterion to reject the events containing a forward emitted fragment. For the two bombardment energies the data analysis shows the formation of a compound system of mass around A = 200. By comparing the fragment dynamical variables with simulations, one can draw conclusions about the simultaneity of the compound deexcitation processes. It was found that 5 MeV/A is able to reproduce the characteristics of the detected fragments. It was also found that, to reproduce the dynamical characteristics of the fragments issued from central collisions, it was not necessary to superimpose a radial collective energy upon the Coulomb and thermal motion. The distribution of the relative angles between detected fragments is used here as a chronometer. For simultaneous ruptures the small relative angles are forbidden by the Coulomb repulsion, while for sequential processes this restriction is lifted more and more as the interval between the two emissions lengthens. For the system discussed here the comparison between simulation and data has been carried out for the extreme cases, i.e. for a vanishing and an infinite time interval between the two emissions, respectively. More sophisticated simulations to describe the angular distributions between the emitted fragments were also developed.

  8. Assessment of Natural Radioactivity in TENORM Samples Using Different Techniques

    International Nuclear Information System (INIS)

    Salman, Kh.A.; Shahein, A.Y.

    2009-01-01

    In petroleum oil industries, technologically-enhanced naturally occurring radioactive materials (TENORM) are produced. The presence of TENORM constitutes a significant radiological human health hazard. In the present work, the liquid scintillation counting (LSC) technique was used to determine both 222Rn and 226Ra concentrations in TENORM samples, by measuring 222Rn concentrations in the sample at different intervals of time after preparation. The radiation doses from the TENORM samples were estimated using a thermoluminescent detector (TLD-4000). The estimated radiation doses were found to be proportional to both the radiation doses measured on site and the natural activity concentrations in the samples measured with LSC.

  9. A critical evaluation of the Beckman Coulter Access hsTnI: Analytical performance, reference interval and concordance.

    Science.gov (United States)

    Pretorius, Carel J; Tate, Jillian R; Wilgen, Urs; Cullen, Louise; Ungerer, Jacobus P J

    2018-05-01

    We investigated the analytical performance, outlier rate, carryover and reference interval of the Beckman Coulter Access hsTnI in detail and compared it with historical and other commercial assays. We compared the imprecision, detection capability, analytical sensitivity, outlier rate and carryover against two previous Access AccuTnI assay versions. We established the reference interval with stored samples from a previous study and compared the concordances and variances with the Access AccuTnI+3 as well as with two commercial assays. The Access hsTnI had excellent analytical sensitivity with the calibration slope 5.6 times steeper than that of the Access AccuTnI+3. The detection capability was markedly improved with the SD of the blank 0.18-0.20 ng/L, LoB 0.29-0.33 ng/L and LoD 0.58-0.69 ng/L. All the reference interval samples had a result above the LoB value. At a mean concentration of 2.83 ng/L the SD was 0.28 ng/L (CV 9.8%). Carryover (0.005%) and outlier (0.046%) rates were similar to those of the Access AccuTnI+3. The combined male and female 99th percentile reference interval was 18.2 ng/L (90% CI 13.2-21.1 ng/L). Concordance amongst the assays was poor, with only 16.7%, 19.6% and 15.2% of samples identified by all 4 assays as above the 99th, 97.5th and 95th percentiles. Analytical imprecision was a minor contributor to the observed variances between assays. The Beckman Coulter Access hsTnI assay has excellent analytical sensitivity and precision at concentrations close to zero. This allows cTnI measurement in all healthy individuals and the capability to identify numerically small differences between serial samples as statistically significant. Concordance in healthy individuals remains poor amongst assays. Crown Copyright © 2018. Published by Elsevier Inc. All rights reserved.

  10. Acute high-intensity interval running increases markers of gastrointestinal damage and permeability but not gastrointestinal symptoms.

    Science.gov (United States)

    Pugh, Jamie N; Impey, Samuel G; Doran, Dominic A; Fleming, Simon C; Morton, James P; Close, Graeme L

    2017-09-01

    The purpose of this study was to investigate the effects of high-intensity interval running on markers of gastrointestinal (GI) damage and permeability alongside subjective symptoms of GI discomfort. Eleven male runners completed an acute bout of high-intensity interval training (HIIT) (eighteen 400-m runs at 120% maximal oxygen uptake) where markers of GI permeability, intestinal damage, and GI discomfort symptoms were assessed and compared with resting conditions. Compared with rest, HIIT significantly increased the serum lactulose/rhamnose ratio (0.051 ± 0.016 vs. 0.031 ± 0.021, p = 0.0047; 95% confidence interval (CI) = 0.006 to 0.036) and sucrose concentrations (0.388 ± 0.217 vs. 0.137 ± 0.148 mg/L) relative to resting conditions. Plasma intestinal fatty acid-binding protein (I-FABP) was significantly increased following HIIT, whereas no changes were observed during rest. Mild symptoms of GI discomfort were reported immediately and at 24 h post-HIIT, although these symptoms did not correlate with GI permeability or I-FABP. In conclusion, acute HIIT increased GI permeability and intestinal I-FABP release, although these do not correlate with symptoms of GI discomfort. Furthermore, by using serum sampling, we provide data showing that it is possible to detect changes in intestinal permeability that are not observed using urinary sampling over a shorter time period.

  11. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  12. The Alteration of Emotion Regulation Precedes the Deficits in Interval Timing in the BACHD Rat Model for Huntington Disease

    Directory of Open Access Journals (Sweden)

    Daniel Garces

    2018-05-01

    Full Text Available Huntington disease (HD) is an autosomal dominantly inherited, progressive neurodegenerative disorder which is accompanied by executive dysfunctions and emotional alteration. The aim of the present study was to assess the impact of emotion/stress on ongoing, highly demanding cognitive tasks, i.e., temporal processing, as a function of age in BACHD rats (a “full length” model of HD). Middle-aged (4–6 months) and old (10–12 months) rats were first trained on a 2 vs. 8-s temporal discrimination task, and then exposed to a series of bisection tests under normal and stressful (10 mild unpredictable foot-shocks) conditions. The animals were then trained on a peak interval task, in which reinforced fixed-interval (FI) 30-s trials were randomly intermixed with non-reinforced probe trials. After training, the effect of stress upon time perception was again assessed. Sensitivity to foot-shocks was also assessed independently. The results show effects of both age and genotype, with largely greater effects in old BACHD animals. The older BACHD animals had impaired learning in both tasks, but reached levels of performance equivalent to WT animals at the end of training in the temporal discrimination task, while remaining impaired in the peak interval task. Whereas sensitivity to foot-shock did not differ between BACHD and WT rats, delivery of foot-shocks during the test sessions had a disruptive impact on temporal behavior in WT animals, an effect which increased with age. In contrast, BACHD rats, independent of age, did not show any significant disruption under stress. In conclusion, BACHD rats showed a disruption in temporal learning in late symptomatic animals. Age-related modification in stress-induced impairment of temporal control of behavior was also observed, an effect which was greatly reduced in BACHD animals, thus confirming previous results suggesting reduced emotional reactivity in HD animals. The results suggest a staggered onset in cognitive

  13. The real-time fitting of radioactive decay curves. Pt. 3. Counting during sampling

    International Nuclear Information System (INIS)

    Hartley, B.M.

    1994-01-01

    An analysis of a least-squares method for the real-time fitting of the theoretical total count function to the actual total count from radioactive decays has been given previously for the case where counting takes place after a sample is taken. The counting may be done in a number of different counting systems which distinguish between different types or energies of radiation emitted from the sample. The method would allow real-time determination of the numbers of atoms and hence activities of the individual isotopes present and has been designated the Time Evolved Least-Squares method (TELS). If the radioactivity which is to be measured exists as an aerosol or in a form where a sample is taken at a constant rate it may be possible to count during sampling and by so doing reduce the total time required to determine the activity of the individual isotopes present. The TELS method is extended here to the case where counting and the evaluation of the activity takes place concurrently with the sampling. The functions which need to be evaluated are derived and the calculations required to implement the method are discussed. As with the TELS method of counting after sampling the technique of counting during sampling and the simultaneous evaluation of activity could be achieved in real-time. Results of testing the method by computer simulation for two counting schemes for the descendants of radon are presented. ((orig.))
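
    The record gives the idea rather than the equations, so the sketch below is only a generic illustration of counting-after-sampling: for independent decays the expected cumulative count is linear in the unknown atom numbers, C(t) = sum_i eff_i * N_i * (1 - exp(-lambda_i * t)), so each new count total can be folded into an ordinary linear least-squares estimate in real time. The two-isotope setup, decay constants and efficiencies are assumptions for the example, and radon-progeny ingrowth chains (which the paper simulates) are ignored.

      import numpy as np

      def estimate_atoms(times_s, cum_counts, lambdas, efficiencies):
          """Least-squares estimate of the initial atom numbers N_i from cumulative counts.

          Model: C(t) = sum_i eff_i * N_i * (1 - exp(-lambda_i * t)), valid for independent
          decays only (no parent-daughter ingrowth)."""
          t = np.asarray(times_s, dtype=float)[:, None]
          lam = np.asarray(lambdas, dtype=float)[None, :]
          eff = np.asarray(efficiencies, dtype=float)[None, :]
          design = eff * (1.0 - np.exp(-lam * t))          # shape (n_times, n_isotopes)
          n_hat, *_ = np.linalg.lstsq(design, np.asarray(cum_counts, dtype=float), rcond=None)
          return n_hat

      # Synthetic two-isotope example: half-lives ~3 min and ~27 min, 30% counting efficiency.
      lams = np.log(2.0) / np.array([180.0, 1620.0])
      effs = np.array([0.3, 0.3])
      true_n = np.array([5.0e4, 2.0e5])
      ts = np.arange(10.0, 1210.0, 10.0)
      ideal = (effs * true_n * (1.0 - np.exp(-lams * ts[:, None]))).sum(axis=1)
      counts = np.random.default_rng(2).poisson(ideal)
      print(estimate_atoms(ts, counts, lams, effs))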

  14. Reference Interval and Subject Variation in Excretion of Urinary Metabolites of Nicotine from Non-Smoking Healthy Subjects in Denmark

    DEFF Research Database (Denmark)

    Hansen, Å. M.; Garde, A. H.; Christensen, J. M.

    2001-01-01

    BACKGROUND: Passive smoking has been found to be a respiratory health hazard in humans. The present study describes the calculation of a reference interval for urinary nicotine metabolites calculated as cotinine equivalents on the basis of 72 non-smokers exposed to tobacco smoke less than 25 ... A comparison of methods for determination of cotinine was carried out on 27 samples from non-smokers and smokers. Results obtained from the RIA method showed 2.84 [confidence interval (CI): 2.50; 3.18] times higher results compared to the GC-MS method. A linear correlation between the two methods was demonstrated (rho = 0.96). CONCLUSION: A parametric reference interval for excretion of nicotine metabolites in urine from non-smokers was established according to International Union of Pure and Applied Chemistry (IUPAC) and International Federation of Clinical Chemistry (IFCC) recommendations, for use in risk assessment of exposure to tobacco smoke ...

  15. Optimal parallel algorithms for problems modeled by a family of intervals

    Science.gov (United States)

    Olariu, Stephan; Schwing, James L.; Zhang, Jingyuan

    1992-01-01

    A family of intervals on the real line provides a natural model for a vast number of scheduling and VLSI problems. Recently, a number of parallel algorithms to solve a variety of practical problems on such a family of intervals have been proposed in the literature. Computational tools are developed, and it is shown how they can be used for the purpose of devising cost-optimal parallel algorithms for a number of interval-related problems including finding a largest subset of pairwise nonoverlapping intervals, a minimum dominating subset of intervals, along with algorithms to compute the shortest path between a pair of intervals and, based on the shortest path, a parallel algorithm to find the center of the family of intervals. More precisely, with an arbitrary family of n intervals as input, all algorithms run in O(log n) time using O(n) processors in the EREW-PRAM model of computation.
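
    As a point of reference for one of the problems listed (a largest subset of pairwise non-overlapping intervals), the classical sequential solution is the earliest-finish-time greedy scan sketched below; it is included only to make the problem concrete and is not the EREW-PRAM algorithm developed in the paper.

      def max_nonoverlapping(intervals):
          """Return a largest subset of pairwise non-overlapping (start, end) intervals.

          Greedy by earliest finishing time; O(n log n) sequentially. Intervals that merely
          touch at an endpoint are treated as non-overlapping here."""
          chosen = []
          last_end = float("-inf")
          for start, end in sorted(intervals, key=lambda iv: iv[1]):
              if start >= last_end:
                  chosen.append((start, end))
                  last_end = end
          return chosen

      print(max_nonoverlapping([(1, 4), (2, 3), (3, 5), (6, 8), (5, 7)]))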

  16. Discrete-Time Mixing Receiver Architecture for RF-Sampling Software-Defined Radio

    NARCIS (Netherlands)

    Ru, Z.; Klumperink, Eric A.M.; Nauta, Bram

    2010-01-01

    A discrete-time (DT) mixing architecture for RF-sampling receivers is presented. This architecture makes RF sampling more suitable for software-defined radio (SDR) as it achieves wideband quadrature demodulation and wideband harmonic rejection. The paper consists of two parts. In the first

  17. Synchronization of Switched Interval Networks and Applications to Chaotic Neural Networks

    Directory of Open Access Journals (Sweden)

    Jinde Cao

    2013-01-01

    Full Text Available This paper investigates the synchronization problem of switched delay networks with interval parameter uncertainty. Based on the theory of switched systems and the drive-response technique, a mathematical model of the switched interval drive-response error system is established. Without constructing Lyapunov-Krasovskii functionals, by introducing the matrix measure method for the first time to switched time-varying delay networks and combining it with the Halanay inequality technique, synchronization criteria are derived for switched interval networks under an arbitrary switching rule; these criteria are easy to verify in practice. Moreover, as an application, the proposed scheme is then applied to chaotic neural networks. Finally, numerical simulations are provided to illustrate the effectiveness of the theoretical results.

  18. Effect of Remote Back-Up Protection System Failure on the Optimum Routine Test Time Interval of Power System Protection

    Directory of Open Access Journals (Sweden)

    Y Damchi

    2013-12-01

    Full Text Available Appropriate operation of the protection system is one of the key factors in achieving desirable reliability in power systems, and it vitally depends on routine testing of the protection system. Precise determination of the optimum routine test time interval (ORTTI) plays a vital role in predicting the maintenance costs of the protection system. In most previous studies, ORTTI has been determined while the remote back-up protection system was considered fully reliable. This assumption is not exactly correct, since the remote back-up protection system may operate incorrectly or fail to operate, the same as the primary protection system. Therefore, in order to determine the ORTTI, an extended Markov model is proposed in this paper that considers a failure probability for the remote back-up protection system. In the proposed Markov model of the protection systems, the monitoring facility is taken into account. Moreover, it is assumed that the primary and back-up protection systems are maintained simultaneously. Results show that the effect of remote back-up protection system failures on the reliability indices and optimum routine test intervals of the protection system is considerable.

  19. A Comparative Test of the Interval-Scale Properties of Magnitude Estimation and Case III Scaling and Recommendations for Equal-Interval Frequency Response Anchors.

    Science.gov (United States)

    Schriesheim, Chester A.; Novelli, Luke, Jr.

    1989-01-01

    Differences between recommended sets of equal-interval response anchors derived from scaling techniques using magnitude estimations and Thurstone Case III pair-comparison treatment of complete ranks were compared. Differences in results for 205 undergraduates reflected differences in the samples as well as in the tasks and computational…

  20. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    Full Text Available This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in improving network bandwidth usage. However, it cannot detect packet dropout situations because data transmission and reception do not use a periodical time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme called modified SOD in which sensor data are sent when either the change in the sensor output exceeds a given threshold or the elapsed time exceeds a given interval. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.
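
    The transmit rule described here, send when the value has moved by more than a threshold or when too much time has passed since the last transmission, translates directly into code. The sketch below is a generic restatement of that rule only; the threshold, the maximum interval and the driving loop are placeholders, and the networked estimator itself is not modelled.

      import time

      class ModifiedSOD:
          """Send-on-delta with a time-out: transmit when |x - last_sent| > delta or when
          more than max_interval seconds have elapsed since the last transmission."""

          def __init__(self, delta, max_interval):
              self.delta = delta
              self.max_interval = max_interval
              self.last_value = None
              self.last_time = None

          def should_send(self, value, now=None):
              now = time.monotonic() if now is None else now
              if (self.last_value is None
                      or abs(value - self.last_value) > self.delta
                      or now - self.last_time > self.max_interval):
                  self.last_value, self.last_time = value, now
                  return True
              return False

      # Example: the first sample, the time-out (k = 3) and the delta rule (k = 4) all trigger sends.
      sod = ModifiedSOD(delta=0.5, max_interval=2.0)
      for k, x in enumerate([0.0, 0.1, 0.15, 0.2, 1.0, 1.05]):
          print(k, x, sod.should_send(x, now=float(k)))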

  1. Effects of High Intensity Interval Training on Increasing Explosive Power, Speed, and Agility

    Science.gov (United States)

    Fajrin, F.; Kusnanik, N. W.; Wijono

    2018-01-01

    High Intensity Interval Training (HIIT) is a type of exercise that combines high-intensity exercise and low-intensity exercise in a certain time interval. This type of training is very effective and efficient for improving physical components. Improving athletes' achievement depends on improving these physical components, so the selection of a good training method is very helpful. This study aims to analyze the effects of HIIT on increasing explosive power, speed, and agility. This type of research is quantitative with quasi-experimental methods. The design of this study used the Matching-Only Design, with data analysis using the t-test (paired sample t-test). After six weeks of treatment, the results showed significant increases in explosive power, speed, and agility. HIIT in this study used plyometric exercises as the high-intensity component and jogging as the mild-to-moderate-intensity component. The increase was due to improvements in neuromuscular characteristics that affect muscle strength and performance. From the data analysis, the researchers concluded that High Intensity Interval Training has a significant effect on increasing leg explosive power, speed, and agility.

  2. Assessing accuracy of point fire intervals across landscapes with simulation modelling

    Science.gov (United States)

    Russell A. Parsons; Emily K. Heyerdahl; Robert E. Keane; Brigitte Dorner; Joseph Fall

    2007-01-01

    We assessed accuracy in point fire intervals using a simulation model that sampled four spatially explicit simulated fire histories. These histories varied in fire frequency and size and were simulated on a flat landscape with two forest types (dry versus mesic). We used three sampling designs (random, systematic grids, and stratified). We assessed the sensitivity of...

  3. INTERVALS OF ACTIVE PLAY AND BREAK IN BASKETBALL GAMES

    Directory of Open Access Journals (Sweden)

    Pavle Rubin

    2010-09-01

    Full Text Available The problem addressed by this research arises from the need to decompose a basketball game. The aim was to determine the intervals of active play (“live ball”, a term defined by the rules) and breaks (“dead ball”, a term defined by the rules) by analyzing basketball games. In order to obtain the relevant information, basketball games from five different competitions (at the top level of quality) were analyzed. The sample consists of seven games played in the 2006/2007 season: the NCAA Play-Off final game, the Adriatic League finals, the ULEB Cup final game, the Euroleague (2 games) and the NBA league (2 games). The most important information gained by this research is that the average interval of active play lasts approximately 47 seconds, while the average break interval lasts approximately 57 seconds. This information is significant for coaches and should be used in planning the training process.

  4. Birth interval and its predictors among married women in Dabat ...

    African Journals Online (AJOL)

    2008-12-30

    Short birth intervals (the time between two successive live births) are associated with diverse complications. We assessed birth interval and its predictors among 613 married women who gave birth from January 1 to December 30, 2008. Data were collected in April 2012. Life table and Kaplan-Meier curve were used to ...

  5. Tuning for temporal interval in human apparent motion detection.

    Science.gov (United States)

    Bours, Roger J E; Stuur, Sanne; Lankheet, Martin J M

    2007-01-08

    Detection of apparent motion in random dot patterns requires correlation across time and space. It has been difficult to study the temporal requirements for the correlation step because motion detection also depends on temporal filtering preceding correlation and on integration at the next levels. To specifically study tuning for temporal interval in the correlation step, we performed an experiment in which prefiltering and postintegration were held constant and in which we used a motion stimulus containing coherent motion for a single interval value only. The stimulus consisted of a sparse random dot pattern in which each dot was presented in two frames only, separated by a specified interval. On each frame, half of the dots were refreshed and the other half was a displaced reincarnation of the pattern generated one or several frames earlier. Motion energy statistics in such a stimulus do not vary from frame to frame, and the directional bias in spatiotemporal correlations is similar for different interval settings. We measured coherence thresholds for left-right direction discrimination by varying motion coherence levels in a Quest staircase procedure, as a function of both step size and interval. Results show that highest sensitivity was found for an interval of 17-42 ms, irrespective of viewing distance. The falloff at longer intervals was much sharper than previously described. Tuning for temporal interval was largely, but not completely, independent of step size. The optimal temporal interval slightly decreased with increasing step size. Similarly, the optimal step size decreased with increasing temporal interval.

  6. On interval and cyclic interval edge colorings of (3,5)-biregular graphs

    DEFF Research Database (Denmark)

    Casselgren, Carl Johan; Petrosyan, Petros; Toft, Bjarne

    2017-01-01

    A proper edge coloring f of a graph G with colors 1,2,3,…,t is called an interval coloring if the colors on the edges incident to every vertex of G form an interval of integers. The coloring f is cyclic interval if for every vertex v of G, the colors on the edges incident to v either form an inte...
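
    The definition in this record translates directly into a check: a proper edge colouring is an interval colouring if the colours at every vertex have no gaps. The helper below is a straightforward verifier over an edge list, included only to illustrate the definition; it is not part of the paper's constructions for (3,5)-biregular graphs.

      from collections import defaultdict

      def is_interval_coloring(colored_edges):
          """colored_edges: iterable of (u, v, color) with integer colors.

          Returns True when the coloring is proper and the colors incident to each
          vertex form a set of consecutive integers."""
          at_vertex = defaultdict(list)
          for u, v, c in colored_edges:
              at_vertex[u].append(c)
              at_vertex[v].append(c)
          for colors in at_vertex.values():
              if len(colors) != len(set(colors)):                 # repeated color: not proper
                  return False
              if max(colors) - min(colors) + 1 != len(colors):    # gap: not an interval
                  return False
          return True

      # A path a-b-c-d colored 1, 2, 3 is an interval coloring; recoloring it 1, 3, 2 is not.
      print(is_interval_coloring([("a", "b", 1), ("b", "c", 2), ("c", "d", 3)]))
      print(is_interval_coloring([("a", "b", 1), ("b", "c", 3), ("c", "d", 2)]))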

  7. Electro-optic sampling for time resolving relativistic ultrafast electron diffraction

    International Nuclear Information System (INIS)

    Scoby, C. M.; Musumeci, P.; Moody, J.; Gutierrez, M.; Tran, T.

    2009-01-01

    The Pegasus laboratory at UCLA features a state-of-the-art electron photoinjector capable of producing ultrashort (<100 fs) high-brightness electron bunches at energies of 3.75 MeV. These beams recently have been used to produce static diffraction patterns from scattering off thin metal foils, and it is foreseen to take advantage of the ultrashort nature of these bunches in future pump-probe time-resolved diffraction studies. In this paper, single shot 2-d electro-optic sampling is presented as a potential technique for time of arrival stamping of electron bunches used for diffraction. Effects of relatively low bunch charge (a few 10's of pC) and modestly relativistic beams are discussed and background compensation techniques to obtain high signal-to-noise ratio are explored. From these preliminary tests, electro-optic sampling is suitable to be a reliable nondestructive time stamping method for relativistic ultrafast electron diffraction at the Pegasus lab.

  8. Heat Generation on Implant Surface During Abutment Preparation at Different Elapsed Time Intervals.

    Science.gov (United States)

    Al-Keraidis, Abdullah; Aleisa, Khalil; Al-Dwairi, Ziad Nawaf; Al-Tahawi, Hamdi; Hsu, Ming-Lun; Lynch, Edward; Özcan, Mutlu

    2017-10-01

    The purpose of this study was to evaluate heat generation at the implant surface caused by abutment preparation using a diamond bur in a high-speed dental turbine in vitro at 2 different water-coolant temperatures. Thirty-two titanium-alloy abutments were connected to a titanium-alloy implant embedded in an acrylic resin placed within a water bath at a controlled temperature of 37°C. The specimens were equally distributed into 2 groups (16 each). Group 1: the temperature was maintained at 20 ± 1°C; and group 2: the temperature was maintained at 32 ± 1°C. Each abutment was prepared in the axial plane for 1 minute and in the occlusal plane for 1 minute. The temperature of the heat generated from abutment preparation was recorded and measured at 3 distinct time intervals. Water-coolant temperature (20°C vs 32°C) had a statistically significant effect on the implant's temperature change during preparation of the abutment. A water-coolant temperature of 20 ± 1°C during preparation of the implant abutment decreased the temperature recorded at the implant surface to 34.46°C, whereas a coolant temperature of 32 ± 1°C increased the implant surface temperature to 40.94°C.

  9. Emigration Rates From Sample Surveys: An Application to Senegal.

    Science.gov (United States)

    Willekens, Frans; Zinn, Sabine; Leuchter, Matthias

    2017-12-01

    What is the emigration rate of a country, and how reliable is that figure? Answering these questions is not at all straightforward. Most data on international migration are census data on foreign-born population. These migrant stock data describe the immigrant population in destination countries but offer limited information on the rate at which people leave their country of origin. The emigration rate depends on the number leaving in a given period and the population at risk of leaving, weighted by the duration at risk. Emigration surveys provide a useful data source for estimating emigration rates, provided that the estimation method accounts for sample design. In this study, emigration rates and confidence intervals are estimated from a sample survey of households in the Dakar region in Senegal, which was part of the Migration between Africa and Europe survey. The sample was a stratified two-stage sample with oversampling of households with members abroad or return migrants. A combination of methods of survival analysis (time-to-event data) and replication variance estimation (bootstrapping) yields emigration rates and design-consistent confidence intervals that are representative for the study population.

  10. INTERVAL STATE ESTIMATION FOR SINGULAR DIFFERENTIAL EQUATION SYSTEMS WITH DELAYS

    Directory of Open Access Journals (Sweden)

    T. A. Kharkovskaia

    2016-07-01

    Full Text Available The paper deals with linear differential equation systems with algebraic restrictions (singular systems) and a method of interval observer design for this kind of system. The systems contain constant time delay, measurement noise and disturbances. Interval observer synthesis is based on the technique of monotone and cooperative systems, linear matrix inequalities, Lyapunov function theory and interval arithmetic. A set of conditions that makes interval observer synthesis possible is proposed. Results of the synthesized observer's operation are shown on the example of a dynamical inter-industry balance model. The advantage of the proposed method is that it is adapted to observer design for uncertain systems when the intervals of admissible values for the uncertain parameters are given. The designed observer is capable of providing asymptotically definite limits on the estimation accuracy, since the interval of admissible values for the object state is defined at every instant. The obtained result provides an opportunity to develop the interval estimation theory for complex systems that contain parametric uncertainty, varying delay and nonlinear elements. Interval observers increasingly find applications in economics, electrical engineering, and mechanical systems with constraints and optimal flow control.

  11. Hybrid neuro-heuristic methodology for simulation and control of dynamic systems over time interval.

    Science.gov (United States)

    Woźniak, Marcin; Połap, Dawid

    2017-09-01

    Simulation and positioning are very important aspects of computer-aided engineering. To carry them out, we can apply traditional methods or intelligent techniques; the difference between them lies in the way they process information. In the first case, to simulate an object in a particular state of action, we need to run the entire process to read the values of its parameters. This is not very convenient for objects whose simulation takes a long time, i.e. when the mathematical calculations are complicated. In the second case, an intelligent solution can efficiently support a dedicated way of simulating, which enables us to simulate the object only in the situations that are necessary for the development process. We present research results on a developed intelligent simulation and control model of an electric-drive vehicle. For a dedicated simulation method based on intelligent computation, in which an evolutionary strategy simulates the states of the dynamic model, an intelligent system based on a devoted neural network is introduced to control the co-working modules while the motion evolves over a time interval. The presented experimental results show the implemented solution in a situation where a vehicle transports goods over an area with many obstacles, which provokes sudden changes in stability that may lead to destruction of the load. The applied neural network controller therefore prevents the load from being destroyed by adjusting characteristics such as pressure, acceleration, and stiffness voltage to absorb the adverse changes of the ground. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Constraint-based Attribute and Interval Planning

    Science.gov (United States)

    Jonsson, Ari; Frank, Jeremy

    2013-01-01

    In this paper we describe Constraint-based Attribute and Interval Planning (CAIP), a paradigm for representing and reasoning about plans. The paradigm enables the description of planning domains with time, resources, concurrent activities, mutual exclusions among sets of activities, disjunctive preconditions and conditional effects. We provide a theoretical foundation for the paradigm, based on temporal intervals and attributes. We then show how the plans are naturally expressed by networks of constraints, and show that the process of planning maps directly to dynamic constraint reasoning. In addition, we define compatibilities, a compact mechanism for describing planning domains. We describe how this framework can incorporate the use of constraint reasoning technology to improve planning. Finally, we describe EUROPA, an implementation of the CAIP framework.

  13. Pediatric Reference Intervals for Free Thyroxine and Free Triiodothyronine

    Science.gov (United States)

    Jang, Megan; Guo, Tiedong; Soldin, Steven J.

    2009-01-01

    Background The clinical value of free thyroxine (FT4) and free triiodothyronine (FT3) analysis depends on the reference intervals with which they are compared. We determined age- and sex-specific reference intervals for neonates, infants, and children 0–18 years of age for FT4 and FT3 using tandem mass spectrometry. Methods Reference intervals were calculated for serum FT4 (n = 1426) and FT3 (n = 1107) obtained from healthy children between January 1, 2008, and June 30, 2008, from Children's National Medical Center and Georgetown University Medical Center Bioanalytical Core Laboratory, Washington, DC. Serum samples were analyzed using isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS) with deuterium-labeled internal standards. Results FT4 reference intervals were very similar for males and females of all ages and ranged between 1.3 and 2.4 ng/dL for children 1 to 18 years old. FT4 reference intervals for 1- to 12-month-old infants were 1.3–2.8 ng/dL. These 2.5 to 97.5 percentile intervals were much tighter than reference intervals obtained using immunoassay platforms (0.48–2.78 ng/dL for males and 0.85–2.09 ng/dL for females). Similarly, FT3 intervals were consistent and similar for males and females and for all ages, ranging between 1.5 pg/mL and approximately 6.0 pg/mL for children 1 month of age to 18 years old. Conclusions This is the first study to provide pediatric reference intervals of FT4 and FT3 for children from birth to 18 years of age using LC/MS/MS. Analysis using LC/MS/MS provides more specific quantification of thyroid hormones. A comparison of the ultrafiltration tandem mass spectrometric method with equilibrium dialysis showed very good correlation. PMID:19583487

  14. The Geomagnetic Field Recorded in Sediments of the Tuzla Section (the Krasnodar Territory, Russia) over the Time Interval 120-70 ka

    DEFF Research Database (Denmark)

    Pilipenko, Olga; Abrahamsen, N.; Trubikhin, V. M.

    2007-01-01

    Petro- and paleomagnetic methods are applied to the study of the lower part of the Early Pleistocene Tuzla section on the Black Sea coast of the Taman Peninsula. This part of the section is composed of marine and lagoonal sediments deposited over the time interval 120-70 ka. The measured curves...... of the variation in the geomagnetic field inclination reveal an anomalous direction dated at ~110 ka which coincides with a similar anomalous direction in the Eltigen section (Ukraine) correlating with the Blake paleomagnetic event. The significant correlation between the time series NRM0.015/SIRM0.015 (Tuzla...

  15. Comparison of sampling techniques for use in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.J.

    1984-01-01

    The Stephen Howe review (reference TR-STH-1) recommended the use of a deterministic generator (DG) sampling technique for sampling the input values to the SYVAC (SYstems Variability Analysis Code) program. This technique was compared with Monte Carlo simple random sampling (MC) by taking a 1000 run case of SYVAC using MC as the reference case. The results show that DG appears relatively inaccurate for most values of consequence when used with 11 sample intervals. If 22 sample intervals are used then DG generates cumulative distribution functions that are statistically similar to the reference distribution. 400 runs of DG or MC are adequate to generate a representative cumulative distribution function. The MC technique appears to perform better than DG for the same number of runs. However, the DG predicts higher doses and in view of the importance of generating data in the high dose region this sampling technique with 22 sample intervals is recommended for use in SYVAC. (author)
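
    The comparison in this record can be mimicked generically: draw inputs either by simple random sampling or by a scheme that places one value in each of a fixed number of equal-probability intervals, push them through a toy consequence model, and compare the resulting empirical cumulative distribution functions. Everything below (the toy model, the stratified stand-in for the deterministic generator, and the Kolmogorov-Smirnov distance used for the comparison) is an illustrative assumption, not the SYVAC code or the DG algorithm itself.

      import numpy as np

      def toy_consequence(u):
          """Toy 'dose' model: a heavy-tailed transform of a uniform input."""
          return np.exp(3.0 * u) - 1.0

      def ks_distance(a, b):
          """Two-sample Kolmogorov-Smirnov distance between empirical CDFs."""
          grid = np.sort(np.concatenate([a, b]))
          cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
          cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
          return np.abs(cdf_a - cdf_b).max()

      def stratified_sample(n_intervals, reps, rng):
          """One random value in each equal-probability interval, repeated reps times."""
          left_edges = np.arange(n_intervals) / n_intervals
          return np.concatenate([left_edges + rng.random(n_intervals) / n_intervals
                                 for _ in range(reps)])

      rng = np.random.default_rng(4)
      reference = toy_consequence(rng.random(1000))            # stand-in for the 1000-run reference case
      mc400 = toy_consequence(rng.random(400))
      dg11 = toy_consequence(stratified_sample(11, 36, rng))   # ~400 runs over 11 sample intervals
      dg22 = toy_consequence(stratified_sample(22, 18, rng))   # ~400 runs over 22 sample intervals
      for name, sample in [("MC-400", mc400), ("DG-11", dg11), ("DG-22", dg22)]:
          print(name, round(ks_distance(sample, reference), 4))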

  16. Late recovery of damage in rat spinal cord and bone marrow observed in split dose irradiation with long time intervals for 300 kV X-rays and 15 MeV neutrons

    International Nuclear Information System (INIS)

    Kogel, A.J. van der; Sissingh, H.A.

    The authors have performed an extended study on the capacity of the spinal cord for recovery of damage over long time intervals in split-dose experiments with 300 kV X-rays and 15 MeV neutrons, with time intervals of up to 30 weeks. The dose-response relationships for long term bone marrow depletion have been analysed and compared with those obtained for acute and late spinal cord damage. (Auth.)

  17. Reference intervals and longitudinal changes in copeptin and MR-proADM concentrations during pregnancy.

    Science.gov (United States)

    Joosen, Annemiek M C P; van der Linden, Ivon J M; Schrauwen, Lianne; Theeuwes, Alisia; de Groot, Monique J M; Ermens, Antonius A M

    2017-11-27

    Vasopressin and adrenomedullin and their stable by-products copeptin and the midregional part of proadrenomedullin (MR-proADM) are promising biomarkers for the development of preeclampsia. However, clinical use is hampered by the lack of trimester-specific reference intervals. We therefore estimated reference intervals for copeptin and MR-proADM in disease-free Dutch women throughout pregnancy. Apparently healthy, low-risk pregnant women were recruited. Exclusion criteria included current or past history of endocrine disease, multiple pregnancy, use of medication known to influence thyroid function and current pregnancy as a result of hormonal stimulation. Women who miscarried, developed hyperemesis gravidarum, hypertension, pre-eclampsia, hemolysis, elevated liver enzymes and low platelets (HELLP), diabetes or other disease, delivered prematurely or had a small-for-gestational-age neonate were excluded from analyses. Blood samples were collected at 9-13 weeks (n=98), 27-29 weeks (n=94) and 36-39 weeks (n=91) of gestation and at 4-13 weeks post-partum (PP) (n=89). Sixty-two women had complete data during pregnancy and PP. All analyses were performed on a Kryptor compact plus. Copeptin increases during pregnancy, but 97.5th percentiles remain below the non-pregnant upper reference limit (URL) provided by the manufacturer. MR-proADM concentrations increase as well during pregnancy. In trimesters 2 and 3 the 97.5th percentiles are over three times the non-pregnant URL provided by the manufacturer. Trimester- and assay-specific reference intervals for copeptin and MR-proADM should be used. In addition, consecutive measurements and the time frame between measurements should be considered, as the differences seen with or in advance of preeclampsia can be expected to be relatively small compared to the reference intervals.

  18. Circadian profile of QT interval and QT interval variability in 172 healthy volunteers

    DEFF Research Database (Denmark)

    Bonnemeier, Hendrik; Wiegand, Uwe K H; Braasch, Wiebke

    2003-01-01

    ... of sleep. QT and R-R intervals revealed a characteristic day-night pattern. Diurnal profiles of QT interval variability exhibited a significant increase in the morning hours (6-9 AM; P ...) ... lower at day- and nighttime. Aging was associated with an increase of the QT interval mainly at daytime and a significant shift of the T wave apex towards the end of the T wave. The circadian profile of ventricular repolarization is strongly related to the mean R-R interval; however, there are significant...

  19. Adaptive sampling algorithm for detection of superpoints

    Institute of Scientific and Technical Information of China (English)

    CHENG Guang; GONG Jian; DING Wei; WU Hua; QIANG ShiQiang

    2008-01-01

    Superpoints are the sources (or destinations) that connect with a great number of destinations (or sources) during a measurement time interval, so detecting superpoints in real time is very important for network security and management. Previous algorithms are not able to control memory usage while delivering the desired accuracy, so it is hard to detect superpoints on a high-speed link in real time. In this paper, we propose an adaptive sampling algorithm to detect superpoints in real time, which uses a flow sample-and-hold module to reduce the detection of non-superpoints and to improve the measurement accuracy for superpoints. We also design a data stream structure to maintain the flow records, which compensates statistically for flow hash collisions. An adaptive process based on different sampling probabilities is used to maintain the recorded IP addresses in the limited memory. The algorithm is compared with other algorithms by analyzing real network trace data. Experimental results and mathematical analysis show that this algorithm has the advantages of both a limited memory requirement and high measurement accuracy.
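
    The flow sample-and-hold idea can be sketched independently of the paper's data-stream structure: packets are sampled with a small probability until their flow is "held", after which every packet of that flow updates a per-source set of destinations, and sources whose set grows past a threshold are reported as superpoint candidates. The probabilities and threshold below are arbitrary illustrations, and plain Python sets stand in for the compact, collision-compensated structure the authors actually design.

      import random
      from collections import defaultdict

      class SampleAndHoldSuperpoints:
          """Toy sample-and-hold detector: report sources contacting many distinct destinations."""

          def __init__(self, sample_prob=0.01, threshold=100, seed=0):
              self.sample_prob = sample_prob
              self.threshold = threshold
              self.rng = random.Random(seed)
              self.held = set()                       # flows (src, dst) currently held
              self.dests = defaultdict(set)           # src -> distinct destinations of held flows

          def on_packet(self, src, dst):
              flow = (src, dst)
              if flow not in self.held:
                  if self.rng.random() >= self.sample_prob:
                      return                          # packet not sampled; flow stays unheld
                  self.held.add(flow)
              self.dests[src].add(dst)

          def superpoints(self):
              return [s for s, d in self.dests.items() if len(d) >= self.threshold]

      # A scanner touching 20000 distinct destinations is flagged; an ordinary source is not.
      det = SampleAndHoldSuperpoints(sample_prob=0.01, threshold=100)
      for i in range(20000):
          det.on_packet("10.0.0.1", f"192.168.{i % 256}.{i // 256}")
      for i in range(50):
          det.on_packet("10.0.0.2", "192.168.0.1")
      print(det.superpoints())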

  20. A Novel Finite-Sum Inequality-Based Method for Robust H∞ Control of Uncertain Discrete-Time Takagi-Sugeno Fuzzy Systems With Interval-Like Time-Varying Delays.

    Science.gov (United States)

    Zhang, Xian-Ming; Han, Qing-Long; Ge, Xiaohua

    2017-09-22

    This paper is concerned with the problem of robust H∞ control of an uncertain discrete-time Takagi-Sugeno fuzzy system with an interval-like time-varying delay. A novel finite-sum inequality-based method is proposed to provide a tighter estimation on the forward difference of certain Lyapunov functional, leading to a less conservative result. First, an auxiliary vector function is used to establish two finite-sum inequalities, which can produce tighter bounds for the finite-sum terms appearing in the forward difference of the Lyapunov functional. Second, a matrix-based quadratic convex approach is employed to equivalently convert the original matrix inequality including a quadratic polynomial on the time-varying delay into two boundary matrix inequalities, which delivers a less conservative bounded real lemma (BRL) for the resultant closed-loop system. Third, based on the BRL, a novel sufficient condition on the existence of suitable robust H∞ fuzzy controllers is derived. Finally, two numerical examples and a computer-simulated truck-trailer system are provided to show the effectiveness of the obtained results.

  1. CLSI-based transference of the CALIPER database of pediatric reference intervals from Abbott to Beckman, Ortho, Roche and Siemens Clinical Chemistry Assays: direct validation using reference samples from the CALIPER cohort.

    Science.gov (United States)

    Estey, Mathew P; Cohen, Ashley H; Colantonio, David A; Chan, Man Khun; Marvasti, Tina Binesh; Randell, Edward; Delvin, Edgard; Cousineau, Jocelyne; Grey, Vijaylaxmi; Greenway, Donald; Meng, Qing H; Jung, Benjamin; Bhuiyan, Jalaluddin; Seccombe, David; Adeli, Khosrow

    2013-09-01

    The CALIPER program recently established a comprehensive database of age- and sex-stratified pediatric reference intervals for 40 biochemical markers. However, this database was only directly applicable for Abbott ARCHITECT assays. We therefore sought to expand the scope of this database to biochemical assays from other major manufacturers, allowing for a much wider application of the CALIPER database. Based on CLSI C28-A3 and EP9-A2 guidelines, CALIPER reference intervals were transferred (using specific statistical criteria) to assays performed on four other commonly used clinical chemistry platforms including Beckman Coulter DxC800, Ortho Vitros 5600, Roche Cobas 6000, and Siemens Vista 1500. The resulting reference intervals were subjected to a thorough validation using 100 reference specimens (healthy community children and adolescents) from the CALIPER bio-bank, and all testing centers participated in an external quality assessment (EQA) evaluation. In general, the transferred pediatric reference intervals were similar to those established in our previous study. However, assay-specific differences in reference limits were observed for many analytes, and in some instances were considerable. The results of the EQA evaluation generally mimicked the similarities and differences in reference limits among the five manufacturers' assays. In addition, the majority of transferred reference intervals were validated through the analysis of CALIPER reference samples. This study greatly extends the utility of the CALIPER reference interval database which is now directly applicable for assays performed on five major analytical platforms in clinical use, and should permit the worldwide application of CALIPER pediatric reference intervals. Copyright © 2013 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. Learning Bounds of ERM Principle for Sequences of Time-Dependent Samples

    Directory of Open Access Journals (Sweden)

    Mingchen Yao

    2015-01-01

    Full Text Available Many generalization results in learning theory are established under the assumption that samples are independent and identically distributed (i.i.d.). However, numerous learning tasks in practical applications involve time-dependent data. In this paper, we propose a theoretical framework to analyze the generalization performance of the empirical risk minimization (ERM) principle for sequences of time-dependent samples (TDS). In particular, we first present the generalization bound of the ERM principle for TDS. By introducing some auxiliary quantities, we also give a further analysis of the generalization properties and the asymptotic behaviors of the ERM principle for TDS.

  3. Multivariate survivorship analysis using two cross-sectional samples.

    Science.gov (United States)

    Hill, M E

    1999-11-01

    As an alternative to survival analysis with longitudinal data, I introduce a method that can be applied when one observes the same cohort in two cross-sectional samples collected at different points in time. The method allows for the estimation of log-probability survivorship models that estimate the influence of multiple time-invariant factors on survival over a time interval separating two samples. This approach can be used whenever the survival process can be adequately conceptualized as an irreversible single-decrement process (e.g., mortality, the transition to first marriage among a cohort of never-married individuals). Using data from the Integrated Public Use Microdata Series (Ruggles and Sobek 1997), I illustrate the multivariate method through an investigation of the effects of race, parity, and educational attainment on the survival of older women in the United States.

  4. Modified stochastic fragmentation of an interval as an ageing process

    Science.gov (United States)

    Fortin, Jean-Yves

    2018-02-01

    We study a stochastic model based on modified fragmentation of a finite interval. The mechanism consists of cutting the interval at a random location and substituting a unique fragment on the right of the cut to regenerate and preserve the interval length. This leads to a set of segments of random sizes, with the accumulation of small fragments near the origin. This model is an example of record dynamics, with the presence of ‘quakes’ and slow dynamics. The fragment size distribution is a universal inverse power law with logarithmic corrections. The exact distribution for the fragment number as a function of time is simply related to the unsigned Stirling numbers of the first kind. Two-time correlation functions are defined, and computed exactly. They satisfy scaling relations, and exhibit aging phenomena. In particular, the probability that the same number of fragments is found at two different times t > s is asymptotically equal to [4π log(s)]^(-1/2) when s ≫ 1 and the ratio t/s is fixed, in agreement with the numerical simulations. The same process with a reset impedes the aging phenomenon beyond a typical time scale defined by the reset parameter.
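
    A small simulation makes the mechanism concrete. The sketch below follows one reading of the description: a cut point is drawn uniformly on [0, 1], the fragment containing it is split at the cut, and everything to the right of the cut is merged back into a single regenerated fragment, so the total length is preserved and small fragments pile up near the origin. It is an illustration of that reading, not code from the paper.

      import random

      def simulate_fragmentation(steps, seed=0):
          """Return the number of fragments after each step of the modified fragmentation."""
          rng = random.Random(seed)
          cuts = []                                   # interior cut positions in (0, 1)
          history = []
          for _ in range(steps):
              x = rng.random()
              # The fragment containing x is split at x; every cut to the right of x
              # disappears because the right-hand part is regenerated as one fragment.
              cuts = [c for c in cuts if c < x]
              cuts.append(x)
              history.append(len(cuts) + 1)           # k interior cuts -> k + 1 fragments
          return history

      counts = simulate_fragmentation(10000, seed=1)
      print(counts[-1], max(counts))                  # the fragment number grows only slowly (log-like)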

  5. The analysis and attribution of the time-dependent neutron background resultant from sample irradiation in a SLOWPOKE-2 reactor

    International Nuclear Information System (INIS)

    Sellers, M.T.; Corcoran, E.C.; Kelly, D.G.

    2013-01-01

    The Royal Military College of Canada (RMCC) has commissioned a Delayed Neutron Counting (DNC) system for the analysis of special nuclear materials. A significant, time-dependent neutron background, with an initial maximum count rate more than 50 times that of the time-independent background, was characterised during the validation of this system. This time-dependent background was found to be dependent on the presence of the polyethylene (PE) vials used to transport the fissile samples, yet was not an activation product of vial impurities. The magnitude of the time-dependent background was found to be irradiation-site specific and independent of the mass of PE. The capability of RMCC's DNC system to analyze neutron count rates in short time intervals suggested that 235U contamination was present on each irradiated vial. However, Inductively Coupled Plasma-Mass Spectrometry measurements of material leached from the outer vial surfaces after their irradiations found only trace amounts of uranium, 0.118 ± 0.048 ng of 235U derived from natural uranium. These quantities are insufficient to account for the time-dependent background, and in fact could not be discriminated from the noise associated with the time-independent background. It is suggested that delayed neutron emitters are deposited in the vial surface following fission recoil, leaving the main body of uranium within the irradiation site. This hypothesis is supported by the physical cleaning of the site with materials soaked in distilled water and HNO3, which lowered the background from a nominal 235U mass equivalent of 120 to 50 ng per vial. (author)

  6. Dissociation of the role of the prelimbic cortex in interval timing and resource allocation: beneficial effect of norepinephrine and dopamine reuptake inhibitor nomifensine on anxiety-inducing distraction

    Directory of Open Access Journals (Sweden)

    Alexander R Matthews

    2012-12-01

    Full Text Available Emotional distracters impair cognitive function. Emotional processing is dysregulated in affective disorders such as depression, phobias, schizophrenia, and PTSD. Among the processes impaired by emotional distracters, and whose dysregulation is documented in affective disorders, is the ability to time in the seconds-to-minutes range, i.e. interval timing. Presentation of task-irrelevant distracters during a timing task results in a delay in responding suggesting a failure to maintain subjective time in working memory, possibly due to attentional and working memory resources being diverted away from timing, as proposed by the Relative Time-Sharing model. We investigated the role of the prelimbic cortex in the detrimental effect of anxiety-inducing task-irrelevant distracters on the cognitive ability to keep track of time, using local infusions of norepinephrine and dopamine reuptake inhibitor nomifensine in a modified peak-interval procedure with neutral and anxiety-inducing distracters. Given that some antidepressants have beneficial effects on attention and working memory, e.g., decreasing emotional response to negative events, we hypothesized that nomifensine would improve maintenance of information in working memory in trials with distracters, resulting in a decrease of the disruptive effect of emotional events on the timekeeping abilities. Our results revealed a dissociation of the effects of nomifensine infusion in prelimbic cortex between interval timing and resource allocation, and between neutral and anxiety-inducing distraction. Nomifensine was effective only during trials with distracters, but not during trials without distracters. Nomifensine reduced the detrimental effect of the distracters only when the distracters were anxiety-inducing, but not when they were neutral. Results are discussed in relation to the brain circuits involved in Relative Time-Sharing of resources, and the pharmacological management of affective disorders.

  7. Study of the Effect of Temporal Sampling Frequency on DSCOVR Observations Using the GEOS-5 Nature Run Results. Part II: Cloud Coverage

    Science.gov (United States)

    Holdaway, Daniel; Yang, Yuekui

    2016-01-01

    This is the second part of a study on how temporal sampling frequency affects satellite retrievals in support of the Deep Space Climate Observatory (DSCOVR) mission. Continuing from Part 1, which looked at Earth's radiation budget, this paper presents the effect of sampling frequency on DSCOVR-derived cloud fraction. The output from NASA's Goddard Earth Observing System version 5 (GEOS-5) Nature Run is used as the "truth". The effect of temporal resolution on potential DSCOVR observations is assessed by subsampling the full Nature Run data. A set of metrics, including uncertainty and absolute error in the subsampled time series, correlation between the original and the subsamples, and Fourier analysis have been used for this study. Results show that, for a given sampling frequency, the uncertainties in the annual mean cloud fraction of the sunlit half of the Earth are larger over land than over ocean. Analysis of correlation coefficients between the subsamples and the original time series demonstrates that even though sampling at certain longer time intervals may not increase the uncertainty in the mean, the subsampled time series is further and further away from the "truth" as the sampling interval becomes larger and larger. Fourier analysis shows that the simulated DSCOVR cloud fraction has underlying periodical features at certain time intervals, such as 8, 12, and 24 h. If the data is subsampled at these frequencies, the uncertainties in the mean cloud fraction are higher. These results provide helpful insights for the DSCOVR temporal sampling strategy.
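
    As a rough illustration of the kind of subsampling analysis described above, the sketch below builds a synthetic hourly "cloud fraction" series with 24 h and 12 h periodic components, subsamples it at several intervals, and reports the error in the mean, the mean absolute error, and the correlation with the original series. The series and all numeric values are invented for the example; nothing here uses GEOS-5 or DSCOVR data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly "cloud fraction" for one year, with 24 h and 12 h
# periodic components plus noise (all values invented for the example).
t = np.arange(365 * 24)
truth = (0.55
         + 0.05 * np.sin(2 * np.pi * t / 24)
         + 0.02 * np.sin(2 * np.pi * t / 12)
         + 0.02 * rng.standard_normal(t.size))

def subsample_metrics(series, step):
    """Subsample every `step` hours and compare against the full series."""
    sub = series[::step]
    recon = np.repeat(sub, step)[:series.size]   # nearest-sample reconstruction
    return (sub.mean() - series.mean(),          # error in the annual mean
            np.mean(np.abs(recon - series)),     # mean absolute error
            np.corrcoef(recon, series)[0, 1])    # correlation with the original

for step in (1, 2, 4, 8, 12, 24):
    mean_err, mae, corr = subsample_metrics(truth, step)
    print(f"every {step:2d} h: mean error {mean_err:+.4f}, "
          f"MAE {mae:.4f}, correlation {corr:.3f}")
```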

  8. Diagnostic interval and mortality in colorectal cancer

    DEFF Research Database (Denmark)

    Tørring, Marie Louise; Frydenberg, Morten; Hamilton, William

    2012-01-01

    Objective: To test the theory of a U-shaped association between time from the first presentation of symptoms in primary care to the diagnosis (the diagnostic interval) and mortality after diagnosis of colorectal cancer (CRC). Study Design and Setting: Three population-based studies in Denmark...

  9. Reference Intervals for Non-Fasting CVD Lipids and Inflammation Markers in Pregnant Indigenous Australian Women.

    Science.gov (United States)

    Schumacher, Tracy L; Oldmeadow, Christopher; Clausen, Don; Weatherall, Loretta; Keogh, Lyniece; Pringle, Kirsty G; Rae, Kym M

    2017-10-14

    Indigenous Australians experience high rates of cardiovascular disease (CVD). The origins of CVD may commence during pregnancy, yet few serum reference values for CVD biomarkers exist specific to the pregnancy period. The Gomeroi gaaynggal research project is a program that undertakes research and provides some health services to pregnant Indigenous women. Three hundred and ninety-nine non-fasting samples provided by the study participants (206 pregnancies and 175 women) have been used to construct reference intervals for CVD biomarkers during this critical time. A pragmatic design was used, in that women were not excluded for the presence of chronic or acute health states. Percentile bands for non-linear relationships were constructed according to the methods of Wright and Royston (2008), using the xriml package in StataIC 13.1. Serum cholesterol, triglycerides, cystatin-C and alkaline phosphatase increased as gestational age progressed, with little change seen in high-sensitivity C-Reactive Protein and γ glutamyl transferase. Values provided in the reference intervals are consistent with findings from other research projects. These reference intervals will form a basis with which future CVD biomarkers for pregnant Indigenous Australian women can be compared.
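
    The published intervals were constructed with the Wright and Royston (2008) percentile-band method via the xriml package in Stata; as a much cruder stand-in, the sketch below computes gestational-age-binned percentiles (2.5th, 50th, 97.5th) from synthetic cholesterol data. The data, the trend, and the bin widths are all assumptions for illustration only, not study values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative synthetic data: serum cholesterol (mmol/L) rising with
# gestational age (weeks); values and trend are invented for the sketch.
ga = rng.uniform(10, 40, 400)
chol = 4.5 + 0.05 * ga + rng.normal(0, 0.6, ga.size)

# Crude percentile bands by gestational-age bin (2.5th, 50th, 97.5th).
edges = np.linspace(10, 40, 7)
for lo, hi in zip(edges[:-1], edges[1:]):
    in_bin = (ga >= lo) & (ga < hi)
    p_low, p_med, p_high = np.percentile(chol[in_bin], [2.5, 50, 97.5])
    print(f"{lo:4.1f}-{hi:4.1f} weeks (n={in_bin.sum():3d}): "
          f"2.5th {p_low:.2f}, median {p_med:.2f}, 97.5th {p_high:.2f}")
```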

  10. High-intensity interval training improves VO2peak, maximal lactate accumulation, time trial and competition performance in 9–11-year-old swimmers

    OpenAIRE

    Sperlich, Billy; Zinner, Christoph; Heilemann, Ilka; Kjendlie, Per-Ludvik; Holmberg, Hans-Christer; Mester, Joachim

    2010-01-01

    Training volume in swimming is usually very high when compared to the relatively short competition time. High-intensity interval training (HIIT) has been demonstrated to improve performance in a relatively short training period. The main purpose of the present study was to examine the effects of a 5-week HIIT versus high-volume training (HVT) in 9–11-year-old swimmers on competition performance, 100 and 2,000 m time (T100 m and T2,000 m), VO2peak and rate of maximal lactate accumulation (La...

  11. Modular 125 ps resolution time interval digitizer for 10 MHz stop burst rates and 33 ms range

    International Nuclear Information System (INIS)

    Turko, B.

    1978-01-01

    A high resolution multiple stop time interval digitizer is described. It is capable of resolving stop burst rates of up to 10 MHz with an incremental resolution of 125 ps within a range of 33 ms. The digitizer consists of five CAMAC modules and uses a standard CAMAC crate and controller. All the functions and ranges are completely computer controlled. Any two subsequent stop pulses in a burst can be resolved within 100 ns due to a new dual interpolation technique employed. The accuracy is maintained by a high stability 125 MHz reference clock. Up to 131 stop events can be stored in a 48-bit, 10 MHz derandomizing storage register before the digitizer overflows. The experimental data are also given

  12. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference materials to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.

  13. Multifactor analysis of multiscaling in volatility return intervals.

    Science.gov (United States)

    Wang, Fengzhong; Yamasaki, Kazuko; Havlin, Shlomo; Stanley, H Eugene

    2009-01-01

    We study the volatility time series of the 1137 most traded stocks in the U.S. stock markets for the two-year period 2001-2002 and analyze their return intervals tau, which are time intervals between volatilities above a given threshold q. We explore the probability density function of tau, P_q(tau), assuming a stretched exponential function, P_q(tau) ~ exp(-tau^gamma). We find that the exponent gamma depends on the threshold in the range between q=1 and 6 standard deviations of the volatility. This finding supports the multiscaling nature of the return interval distribution. To better understand the multiscaling origin, we study how gamma depends on four essential factors: capitalization, risk, number of trades, and return. We show that gamma depends on the capitalization, risk, and return but almost does not depend on the number of trades. This suggests that gamma relates to the portfolio selection but not to the market activity. To further characterize the multiscaling of individual stocks, we fit the moments of tau, mu_m ≡ ⟨(tau/⟨tau⟩)^m⟩^(1/m), in the range of 10portfolio optimization.

  14. Multifactor analysis of multiscaling in volatility return intervals

    Science.gov (United States)

    Wang, Fengzhong; Yamasaki, Kazuko; Havlin, Shlomo; Stanley, H. Eugene

    2009-01-01

    We study the volatility time series of the 1137 most traded stocks in the U.S. stock markets for the two-year period 2001-2002 and analyze their return intervals τ, which are time intervals between volatilities above a given threshold q. We explore the probability density function of τ, P_q(τ), assuming a stretched exponential function, P_q(τ) ~ exp(-τ^γ). We find that the exponent γ depends on the threshold in the range between q=1 and 6 standard deviations of the volatility. This finding supports the multiscaling nature of the return interval distribution. To better understand the multiscaling origin, we study how γ depends on four essential factors: capitalization, risk, number of trades, and return. We show that γ depends on the capitalization, risk, and return but almost does not depend on the number of trades. This suggests that γ relates to the portfolio selection but not to the market activity. To further characterize the multiscaling of individual stocks, we fit the moments of τ, μ_m ≡ ⟨(τ/⟨τ⟩)^m⟩^(1/m), in the range of 10portfolio optimization.
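
    To make the quantities concrete, the sketch below computes return intervals above a threshold q for a synthetic volatility proxy and fits a stretched exponential to the empirical interval distribution. The i.i.d. Student-t "returns" lack the volatility clustering of real stocks, so the fitted γ should come out close to 1 here; the code only illustrates the fitting machinery, not the paper's data or results.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Illustrative volatility proxy: absolute values of fat-tailed (Student-t)
# i.i.d. "returns", normalised to unit standard deviation.
returns = rng.standard_t(df=3, size=200_000)
vol = np.abs(returns) / np.abs(returns).std()

def return_intervals(series, q):
    """Intervals between successive exceedances of threshold q."""
    return np.diff(np.flatnonzero(series > q))

def stretched_exp(tau, a, tau0, gamma):
    return a * np.exp(-(tau / tau0) ** gamma)

for q in (1.0, 2.0, 3.0):
    tau = return_intervals(vol, q)
    # Empirical PDF of the intervals on a coarse logarithmic grid
    edges = np.unique(np.logspace(0, np.log10(tau.max()), 30).astype(int))
    pdf, edges = np.histogram(tau, bins=edges, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    keep = pdf > 0
    popt, _ = curve_fit(stretched_exp, centers[keep], pdf[keep],
                        p0=(pdf[keep][0], tau.mean(), 0.8), maxfev=20_000)
    print(f"q = {q:.0f} sd: {tau.size} intervals, mean {tau.mean():6.1f}, "
          f"fitted gamma = {popt[2]:.2f}")
```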

  15. Reference intervals for putative biomarkers of drug-induced liver injury and liver regeneration in healthy human volunteers.

    Science.gov (United States)

    Francis, Ben; Clarke, Joanna I; Walker, Lauren E; Brillant, Nathalie; Jorgensen, Andrea L; Park, B Kevin; Pirmohamed, Munir; Antoine, Daniel J

    2018-05-02

    The potential of mechanistic biomarkers to improve the prediction of drug-induced liver injury (DILI) and hepatic regeneration is widely acknowledged. We sought to determine reference intervals for new biomarkers of DILI and regeneration as well as to characterize their natural variability and the impact of diurnal variation. Serum samples were collected from 200 healthy volunteers recruited as part of a cross-sectional study; of these, 50 subjects had weekly serial sampling over 3 weeks, while 24 had intensive blood sampling over a 24 h period. Alanine aminotransferase (ALT), MicroRNA-122 (miR-122), high mobility group box-1 (HMGB1), total keratin-18 (FL-K18), caspase cleaved keratin-18 (cc-K18), glutamate dehydrogenase (GLDH) and colony stimulating factor-1 (CSF-1) were assessed by validated assays. Reference intervals were established for each biomarker based on the 97.5% quantile (90% CI) following the assessment of fixed effects in univariate and multivariable models (ALT 50 (41-50) U/l, miR-122 3548 (2912-4321) copies/µl, HMGB1 2.3 (2.2-2.4) ng/ml, FL-K18 475 (456-488) U/l, cc-K18 272 (256-291) U/l, GLDH 27 (26-30) U/l and CSF-1 2.4 (2.3-2.9) ng/ml). There was a small but significant intra-individual time random effect detected but no significant impact of diurnal variation was observed, with the exception of GLDH. Reference intervals for novel DILI biomarkers have been described for the first time. An upper limit of a reference range might represent the most appropriate method to utilize these data. Regulatory authorities have published letters of support encouraging further qualification of leading candidate biomarkers. These data can now be used to interpret data from exploratory clinical DILI studies and to assist their further qualification. Drug-induced liver injury (DILI) has a big impact on patient health and the development of new medicines. Unfortunately, currently used blood-based tests to assess liver injury and recovery suffer from insufficiencies. Newer blood
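
    A minimal sketch of the basic calculation behind such limits: the upper reference limit as the 97.5th percentile of a healthy-volunteer sample, with a 90% percentile-bootstrap confidence interval. The simulated ALT values and their distribution are assumptions; the study's own estimates came from univariate and multivariable models, which this does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative ALT values (U/L) for "healthy volunteers"; the log-normal
# shape and its parameters are assumptions, not study data.
alt = rng.lognormal(mean=np.log(22.0), sigma=0.4, size=200)

def upper_reference_limit(x, q=97.5, ci=0.90, n_boot=5000):
    """q-th percentile as the upper reference limit, with a bootstrap CI."""
    point = np.percentile(x, q)
    boot = np.array([np.percentile(rng.choice(x, x.size, replace=True), q)
                     for _ in range(n_boot)])
    lo, hi = np.percentile(boot, [(1 - ci) / 2 * 100, (1 + ci) / 2 * 100])
    return point, lo, hi

point, lo, hi = upper_reference_limit(alt)
print(f"upper reference limit: {point:.1f} U/L (90% CI {lo:.1f}-{hi:.1f})")
```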

  16. The Optimal Confidence Intervals for Agricultural Products’ Price Forecasts Based on Hierarchical Historical Errors

    Directory of Open Access Journals (Sweden)

    Yi Wang

    2016-12-01

    Full Text Available With the levels of confidence and system complexity, interval forecasts and entropy analysis can deliver more information than point forecasts. In this paper, we take receivers’ demands as our starting point, use the trade-off model between accuracy and informativeness as the criterion to construct the optimal confidence interval, derive the theoretical formula of the optimal confidence interval and propose a practical and efficient algorithm based on entropy theory and complexity theory. In order to improve the estimation precision of the error distribution, the point prediction errors are stratified according to prices and the complexity of the system; the corresponding prediction error samples are obtained from this price stratification; and the error distributions are estimated by the kernel function method and the stability of the system. In a stable and orderly environment for price forecasting, we obtain point prediction error samples by the weighted local region and RBF (Radial Basis Function) neural network methods, forecast the intervals of the soybean meal and non-GMO (Genetically Modified Organism) soybean continuous futures closing prices and implement unconditional coverage, independence and conditional coverage tests for the simulation results. The empirical results are compared across various interval evaluation indicators, different levels of noise, several target confidence levels and different point prediction methods. The analysis shows that the optimal interval construction method is better than the equal probability method and the shortest interval method and has good anti-noise ability with the reduction of system entropy; the hierarchical estimation error method can obtain higher accuracy and better interval estimation than the non-hierarchical method in a stable system.
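
    The sketch below is a simplified reading of the stratified-error idea: historical forecast errors are grouped into price strata, each stratum's error distribution is estimated with a Gaussian kernel density, and an interval is attached to a new point forecast from that distribution's quantiles. It does not implement the paper's accuracy-informativeness trade-off or its entropy-based optimisation; all data and stratum boundaries are invented.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Invented history of point forecasts: the error spread grows with the
# price level, a rough stand-in for the price-dependent behaviour that
# motivates stratifying the errors.
prices = rng.uniform(2500, 3500, 2000)
errors = rng.normal(0, 0.01 * prices)            # realised forecast errors
strata = np.digitize(prices, [2800, 3200])       # three price strata

def interval_forecast(point_forecast, stratum, level=0.90):
    """Attach an interval to a point forecast from a KDE of stratum errors."""
    kde = gaussian_kde(errors[strata == stratum])
    sample = kde.resample(20_000).ravel()
    lo, hi = np.percentile(sample, [(1 - level) / 2 * 100,
                                    (1 + level) / 2 * 100])
    return point_forecast + lo, point_forecast + hi

for stratum, forecast in zip(range(3), (2700.0, 3000.0, 3400.0)):
    lo, hi = interval_forecast(forecast, stratum)
    print(f"stratum {stratum}: point {forecast:.0f} -> 90% interval "
          f"[{lo:.0f}, {hi:.0f}] (width {hi - lo:.0f})")
```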

  17. Method for evaluation of radiative properties of glass samples

    Energy Technology Data Exchange (ETDEWEB)

    Mohelnikova, Jitka [Faculty of Civil Engineering, Brno University of Technology, Veveri 95, 602 00 Brno (Czech Republic)], E-mail: mohelnikova.j@fce.vutbr.cz

    2008-04-15

    The paper presents a simple calculation method for evaluating the radiative properties of window glasses. The method is based on a computer simulation model of the energy balance of a thermally insulated box containing selected glass samples. A temperature profile of the air inside the box, with a glass sample exposed to the incident radiation, was determined for defined boundary conditions. The spectral range of the radiation was considered in the interval between 280 and 2500 nm, which corresponds to the spectral range of solar radiation affecting windows in building facades. The air temperature rise within the box was determined in response to the incident radiation, from the beginning of the radiation exposure until steady-state thermal conditions were reached. The steady-state temperature inside the insulated box serves for the evaluation of the box energy balance and the determination of the glass sample's radiative properties. These properties are represented by glass characteristics as mean values of transmittance, reflectance and absorptance calculated for the defined spectral range. The computer simulations were compared to experimental measurements on a real model of the insulated box, and the results of the calculations and measurements are in good agreement. The method is recommended for preliminary evaluation of window glass radiative properties, which serve as input data for the energy evaluation of buildings.
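
    A toy version of the underlying energy balance, assuming a single lumped heat capacity and a single loss coefficient for the box: the air temperature is integrated to steady state for a given mean transmittance, and the transmittance is then recovered from the steady-state temperature. All parameter values are assumptions, not those of the paper's model or experiment.

```python
# Toy lumped energy balance for a thermally insulated box behind a glass
# sample, illustrating how a mean transmittance can be inferred from the
# steady-state air temperature. All parameter values are assumptions.
G = 800.0        # irradiance incident on the glass, W/m^2
A_GLASS = 0.25   # glass aperture area, m^2
A_BOX = 1.5      # box envelope area, m^2
U = 4.0          # overall loss coefficient of the box, W/(m^2 K)
C = 5_000.0      # effective heat capacity of the box interior, J/K
T_AMB = 20.0     # ambient temperature, deg C
TAU_TRUE = 0.62  # "true" mean solar transmittance used to generate data

def simulate(tau, t_end=6 * 3600.0, dt=1.0):
    """Integrate dT/dt = (tau*G*A_GLASS - U*A_BOX*(T - T_AMB)) / C."""
    T = T_AMB
    for _ in range(int(t_end / dt)):
        T += dt * (tau * G * A_GLASS - U * A_BOX * (T - T_AMB)) / C
    return T

T_steady = simulate(TAU_TRUE)
# At steady state: tau = U * A_BOX * (T_steady - T_AMB) / (G * A_GLASS)
tau_est = U * A_BOX * (T_steady - T_AMB) / (G * A_GLASS)
print(f"steady-state box air temperature: {T_steady:.1f} deg C")
print(f"recovered mean transmittance: {tau_est:.3f} (true {TAU_TRUE})")
```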

  18. Interference Cancellation Using Replica Signal for HTRCI-MIMO/OFDM in Time-Variant Large Delay Spread Longer Than Guard Interval

    Directory of Open Access Journals (Sweden)

    Yuta Ida

    2012-01-01

    Full Text Available Orthogonal frequency division multiplexing (OFDM) and multiple-input multiple-output (MIMO) are generally known as effective techniques for high data rate services. In MIMO/OFDM systems, channel estimation (CE) is very important for obtaining accurate channel state information (CSI). However, since orthogonal pilot-based CE requires a large number of pilot symbols, the total transmission rate is degraded. To mitigate this problem, a high time resolution carrier interferometry (HTRCI) scheme for MIMO/OFDM has been proposed. In wireless communication systems, if the maximum delay spread is longer than the guard interval (GI), the system performance is significantly degraded due to intersymbol interference (ISI) and intercarrier interference (ICI). However, the conventional HTRCI-MIMO/OFDM does not consider the case of a time-variant large delay spread longer than the GI. In this paper, we propose ISI and ICI compensation methods for HTRCI-MIMO/OFDM in the presence of a time-variant large delay spread longer than the GI.

  19. Wind Information Uplink to Aircraft Performing Interval Management Operations

    Science.gov (United States)

    Ahmad, Nashat N.; Barmore, Bryan E.; Swieringa, Kurt A.

    2016-01-01

    Interval Management (IM) is an ADS-B-enabled suite of applications that use ground and flight deck capabilities and procedures designed to support the relative spacing of aircraft (Barmore et al., 2004, Murdoch et al. 2009, Barmore 2009, Swieringa et al. 2011; Weitz et al. 2012). Relative spacing refers to managing the position of one aircraft to a time or distance relative to another aircraft, as opposed to a static reference point such as a point over the ground or clock time. This results in improved inter-aircraft spacing precision and is expected to allow aircraft to be spaced closer to the applicable separation standard than current operations. Consequently, if the reduced spacing is used in scheduling, IM can reduce the time interval between the first and last aircraft in an overall arrival flow, resulting in increased throughput. Because IM relies on speed changes to achieve precise spacing, it can reduce costly, low-altitude, vectoring, which increases both efficiency and throughput in capacity-constrained airspace without negatively impacting controller workload and task complexity. This is expected to increase overall system efficiency. The Flight Deck Interval Management (FIM) equipment provides speeds to the flight crew that will deliver them to the achieve-by point at the controller-specified time, i.e., assigned spacing goal, after the target aircraft crosses the achieve-by point (Figure 1.1). Since the IM and target aircraft may not be on the same arrival procedure, the FIM equipment predicts the estimated times of arrival (ETA) for both the IM and target aircraft to the achieve-by point. This involves generating an approximate four-dimensional trajectory for each aircraft. The accuracy of the wind data used to generate those trajectories is critical to the success of the IM operation. There are two main forms of uncertainty in the wind information used by the FIM equipment. The first is the accuracy of the forecast modeling done by the weather

  20. Evaluation of test intervals strategies with a risk monitor

    International Nuclear Information System (INIS)

    Soerman, J.

    2005-01-01

    The Swedish nuclear power utility Oskarshamn Power Group (OKG), is investigating how the use of a risk monitor can facilitate and improve risk-informed decision-making at their nuclear power plants. The intent is to evaluate if risk-informed decision-making can be accepted. A pilot project was initiated and carried out in 2004. The project included investigating if a risk monitor can be used for optimising test intervals for diesel- and gas turbine generators with regard to risk level. The Oskarhamn 2 (O2), PSA Level 1 model was converted into a risk monitor using RiskSpectrum RiskWatcher (RSRW) software. The converted PSA model included the complete PSA model for the power operation mode. RSRW then performs a complete requantification for every analysis. Time dependent reliability data are taken into account, i.e. a shorter test interval will increases the components availability (possibility to e.g. start on demand). The converted O2 model was then used to investigate whether it would be possible to balance longer test intervals for diesel generators, gas turbine generators and high pressure injection system with shorter test intervals for the low pressure injection system, while maintaining a low risk level at the plant. The results show that a new mixture of test intervals can be implemented with only marginally changes in the risk calculated with the risk monitor model. The results indicate that the total number of test activities for the systems included in the pilot study could be reduced by 20% with a maintained level of risk. A risk monitor taking into account the impact from test intervals in availability calculations for components is well suited for evaluation of test interval strategies. It also enables the analyst to evaluate the risk level over a period of time including the impact the actual status of the plant may have on the risk level. (author)

  1. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series.

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f)=1/f(β). The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis as most prevailing method in the literature exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. A comparative analysis of spectral exponent estimation techniques for 1/fβ processes with applications to the analysis of stride interval time series

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f) = 1/fβ. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis as most prevailing method in the literature exhibited large estimation variances. Conclusions The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
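
    The estimator recommended by these comparisons is the averaged wavelet coefficient method; as a simpler point of orientation, the sketch below synthesises 1/f^β series and estimates β from the slope of the log-log periodogram, one of the elementary estimators such studies typically include. Signal length and repetition count are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(7)

def synth_one_over_f(beta, n):
    """Generate a 1/f^beta time series by shaping complex white noise."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    spectrum = np.zeros(freqs.size, dtype=complex)
    spectrum[1:] = (freqs[1:] ** (-beta / 2.0)
                    * (rng.standard_normal(freqs.size - 1)
                       + 1j * rng.standard_normal(freqs.size - 1)))
    x = np.fft.irfft(spectrum, n)
    return (x - x.mean()) / x.std()

def periodogram_beta(x):
    """Estimate beta as minus the slope of the log-log periodogram."""
    freqs = np.fft.rfftfreq(x.size, d=1.0)[1:]
    power = np.abs(np.fft.rfft(x))[1:] ** 2 / x.size
    slope, _ = np.polyfit(np.log10(freqs), np.log10(power), 1)
    return -slope

for beta_true in (0.5, 1.0, 1.5):
    est = [periodogram_beta(synth_one_over_f(beta_true, 1024)) for _ in range(50)]
    print(f"true beta {beta_true:.1f}: periodogram estimate "
          f"{np.mean(est):.2f} +/- {np.std(est):.2f}")
```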

  3. REAL-TIME PCR DETECTION OF LISTERIA MONOCYTOGENES IN FOOD SAMPLES OF ANIMAL ORIGIN

    Directory of Open Access Journals (Sweden)

    Jaroslav Pochop

    2013-02-01

    Full Text Available The aim of this study was to follow the contamination of food with Listeria monocytogenes by using Step One real-time polymerase chain reaction (PCR). We used the PrepSEQ Rapid Spin Sample Preparation Kit for isolation of DNA and the SensiFAST SYBR Hi-ROX Kit for the real-time PCR. Of the 24 samples of food of animal origin analysed without incubation, strains of Listeria monocytogenes were detected in 15 samples (swabs); nine samples were negative. Our results indicated that the real-time PCR assay developed in this study could sensitively detect Listeria monocytogenes in food of animal origin without incubation. This could prevent infection caused by Listeria monocytogenes, and could also benefit food manufacturing companies by extending their products’ shelf-life as well as saving the cost of warehousing their food products while awaiting pathogen testing results. The rapid real-time PCR-based method performed very well compared to the conventional method. It is a fast, simple, specific and sensitive way to detect nucleic acids, which could be used in clinical diagnostic tests in the future.

  4. Count-to-count time interval distribution analysis in a fast reactor; Estudio de la distribucion de intervalos de tiempo entre detecciones consecutivas de neutrones en un reactor rapido

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Navarro Gomez, A

    1973-07-01

    The most important kinetic parameters have been measured at the zero power fast reactor CORAL-I by means of the reactor noise analysis in the time domain, using measurements of the count-to-count time intervals. (Author) 69 refs.

  5. Effect of time intervals between irradiation and chemotherapeutic agents on the normal tissue damage. Comparison between in vivo and in vitro experiments

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Hisao; Nakayama, Toshitake; Hashimoto, Shozo (Keio Univ., Tokyo (Japan). School of Medicine)

    1989-05-01

    Experiments were carried out to determine the effect on cell survival of different time intervals between irradiation and chemotherapeutic agents (BLM, cisDDP, ADM and ACNU) in both in vivo and in vitro systems. The intestinal epithelial assay was applied in the in vivo system. The clonogenic cell survival of V79 cells, in both the proliferative and the plateau phases, was determined in the in vitro system. The V79 cells in the plateau phase were more sensitive to BLM, cisDDP and ACNU than those in the proliferative phase; however, the result was reversed with ADM. When BLM, cisDDP or ACNU was combined with irradiation at different time intervals, the responses of the plateau-phase V79 cells to the combination therapies were very similar to those of the intestinal epithelial cells. On the other hand, V79 cells in the proliferative phase treated with ADM and irradiation showed a response similar to that of the intestinal cells. These results suggest that chemo-radiotherapy studies with cultured cells that are sensitive to chemotherapeutic agents might be suitable for predicting the in vivo damage to normal tissue. (author).

  6. Most probable dimension value and most flat interval methods for automatic estimation of dimension from time series

    International Nuclear Information System (INIS)

    Corana, A.; Bortolan, G.; Casaleggio, A.

    2004-01-01

    We present and compare two automatic methods for dimension estimation from time series. Both methods, based on conceptually different approaches, work on the derivative of the bi-logarithmic plot of the correlation integral versus the correlation length (log-log plot). The first method searches for the most probable dimension values (MPDV) and associates to each of them a possible scaling region. The second one searches for the most flat intervals (MFI) in the derivative of the log-log plot. The automatic procedures include the evaluation of the candidate scaling regions using two reliability indices. The data set used to test the methods consists of time series from known model attractors with and without the addition of noise, structured time series, and electrocardiographic signals from the MIT-BIH ECG database. Statistical analysis of results was carried out by means of paired t-test, and no statistically significant differences were found in the large majority of the trials. Consistent results are also obtained dealing with 'difficult' time series. In general for a more robust and reliable estimate, the use of both methods may represent a good solution when time series from complex systems are analyzed. Although we present results for the correlation dimension only, the procedures can also be used for the automatic estimation of generalized q-order dimensions and pointwise dimension. We think that the proposed methods, eliminating the need of operator intervention, allow a faster and more objective analysis, thus improving the usefulness of dimension analysis for the characterization of time series obtained from complex dynamical systems
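
    A schematic version of the "most flat interval" idea (without the reliability indices described in the paper): compute the correlation integral C(r) for a data set of known dimension, take the local slope of the log-log plot, and report the mean slope over the window where it varies least. The uniform-square test data, the scale grid, and the window length are assumptions for illustration.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(8)

# Test set with a known answer: 2000 points uniform in the unit square,
# whose correlation dimension is 2.
points = rng.random((2000, 2))
dists = pdist(points)                        # all pairwise distances

# Correlation integral C(r) on a logarithmic grid of scales
r = np.logspace(-2.0, -0.5, 40)
C = np.array([np.mean(dists < ri) for ri in r])

# Local slope of the log-log plot, d log C / d log r
log_r, log_C = np.log10(r), np.log10(C)
slope = np.gradient(log_C, log_r)

# "Most flat interval": the window of consecutive scales where the slope
# varies least; its mean slope is taken as the dimension estimate.
win = 8
flatness = [slope[i:i + win].std() for i in range(slope.size - win)]
best = int(np.argmin(flatness))
print(f"estimated correlation dimension: {slope[best:best + win].mean():.2f} "
      f"(expected 2 for the uniform square)")
```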

  7. Hypotensive response magnitude and duration in hypertensives: continuous and interval exercise.

    Science.gov (United States)

    Carvalho, Raphael Santos Teodoro de; Pires, Cássio Mascarenhas Robert; Junqueira, Gustavo Cardoso; Freitas, Dayana; Marchi-Alves, Leila Maria

    2015-03-01

    Although exercise training is known to promote post-exercise hypotension, there is currently no consistent argument about the effects of manipulating its various components (intensity, duration, rest periods, types of exercise, training methods) on the magnitude and duration of the hypotensive response. To compare the effect of continuous and interval exercise on the magnitude and duration of the hypotensive response in hypertensive patients by using ambulatory blood pressure monitoring (ABPM). The sample consisted of 20 elderly hypertensives. Each participant underwent three ABPM sessions: one control ABPM, without exercise; one ABPM after continuous exercise; and one ABPM after interval exercise. Systolic blood pressure (SBP), diastolic blood pressure (DBP), mean arterial pressure (MAP), heart rate (HR) and double product (DP) were monitored to check post-exercise hypotension and for comparison between each ABPM. ABPM after continuous exercise and ABPM after interval exercise both showed post-exercise hypotension, with a significant reduction (p < 0.05) relative to the control ABPM. Comparing ABPM after continuous and ABPM after interval exercise, a significant reduction (p < 0.05) in SBP, DBP, MAP and DP was observed in the latter. Continuous and interval exercise training promote post-exercise hypotension with reductions in SBP, DBP, MAP and DP in the 20 hours following exercise. Interval exercise training causes greater post-exercise hypotension and lower cardiovascular overload as compared with continuous exercise.

  8. A test of alternative estimators for volume at time 1 from remeasured point samples

    Science.gov (United States)

    Francis A. Roesch; Edwin J. Green; Charles T. Scott

    1993-01-01

    Two estimators for volume at time 1 for use with permanent horizontal point samples are evaluated. One estimator, used traditionally, uses only the trees sampled at time 1, while the second estimator, originally presented by Roesch and coauthors (F.A. Roesch, Jr., E.J. Green, and C.T. Scott. 1989. For. Sci. 35(2):281-293). takes advantage of additional sample...

  9. Estimating time to pregnancy from current durations in a cross-sectional sample

    DEFF Research Database (Denmark)

    Keiding, Niels; Kvist, Kajsa; Hartvig, Helle

    2002-01-01

    A new design for estimating the distribution of time to pregnancy is proposed and investigated. The design is based on recording current durations in a cross-sectional sample of women, leading to statistical problems similar to estimating renewal time distributions from backward recurrence times....

  10. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    Science.gov (United States)

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

    The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real life examples.
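
    For orientation, the sketch below shows one simple non-parametric pairing that such comparisons include: the AUC computed as the Mann-Whitney statistic with a percentile-bootstrap CI, on a small binormal example. It is only one of many possible approaches and is not claimed to be the best-performing method in the study.

```python
import numpy as np

rng = np.random.default_rng(9)

def auc_mann_whitney(neg, pos):
    """AUC as the Mann-Whitney probability P(pos > neg), ties counted 1/2."""
    diff = pos[:, None] - neg[None, :]
    return np.mean(diff > 0) + 0.5 * np.mean(diff == 0)

def bootstrap_ci(neg, pos, level=0.95, n_boot=5000):
    """Percentile-bootstrap CI for the AUC, resampling within each group."""
    stats = [auc_mann_whitney(rng.choice(neg, neg.size, replace=True),
                              rng.choice(pos, pos.size, replace=True))
             for _ in range(n_boot)]
    return np.percentile(stats, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])

# Small-sample example from a binormal model (15 non-diseased, 15 diseased)
neg = rng.normal(0.0, 1.0, 15)
pos = rng.normal(1.5, 1.0, 15)
lo, hi = bootstrap_ci(neg, pos)
print(f"AUC = {auc_mann_whitney(neg, pos):.3f}, "
      f"95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```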

  11. Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas fault.

    Science.gov (United States)

    Shelly, David R

    2010-06-11

    Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between approximately 3 and approximately 6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.

  12. Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas Fault

    Science.gov (United States)

    Shelly, David R.

    2010-01-01

    Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between ~3 and ~6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.

  13. Effects of holding time and measurement error on culturing Legionella in environmental water samples.

    Science.gov (United States)

    Flanders, W Dana; Kirkland, Kimberly H; Shelton, Brian G

    2014-10-01

    Outbreaks of Legionnaires' disease require environmental testing of water samples from potentially implicated building water systems to identify the source of exposure. A previous study reports a large impact on Legionella sample results due to shipping and delays in sample processing. Specifically, this same study, without accounting for measurement error, reports more than half of shipped samples tested had Legionella levels that arbitrarily changed up or down by one or more logs, and the authors attribute this result to shipping time. Accordingly, we conducted a study to determine the effects of sample holding/shipping time on Legionella sample results while taking into account measurement error, which has previously not been addressed. We analyzed 159 samples, each split into 16 aliquots, of which one-half (8) were processed promptly after collection. The remaining half (8) were processed the following day to assess the impact of holding/shipping time. A total of 2544 samples were analyzed including replicates. After accounting for inherent measurement error, we found that the effect of holding time on observed Legionella counts was small and should have no practical impact on interpretation of results. Holding samples increased the root mean squared error by only about 3-8%. Notably, for only one of the 159 samples did the average of the 8 replicate counts change by 1 log. Thus, our findings do not support the hypothesis of frequent, significant (≥1 log10 unit) Legionella colony count changes due to holding. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Design and Analysis of Schemes for Adapting Migration Intervals in Parallel Evolutionary Algorithms.

    Science.gov (United States)

    Mambrini, Andrea; Sudholt, Dirk

    2015-01-01

    The migration interval is one of the fundamental parameters governing the dynamic behaviour of island models. Yet, there is little understanding on how this parameter affects performance, and how to optimally set it given a problem in hand. We propose schemes for adapting the migration interval according to whether fitness improvements have been found. As long as no improvement is found, the migration interval is increased to minimise communication. Once the best fitness has improved, the migration interval is decreased to spread new best solutions more quickly. We provide a method for obtaining upper bounds on the expected running time and the communication effort, defined as the expected number of migrants sent. Example applications of this method to common example functions show that our adaptive schemes are able to compete with, or even outperform, the optimal fixed choice of the migration interval, with regard to running time and communication effort.
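
    A schematic interpretation of such an adaptive rule on a toy island model (a (1+1) EA per island on OneMax, ring migration): the migration interval doubles while no improvement is found and shrinks when the best fitness improves. The concrete halving/doubling choices and all parameters are assumptions for illustration, not the exact schemes analysed in the paper.

```python
import random

random.seed(42)
N_BITS, N_ISLANDS, MAX_INTERVAL = 50, 4, 512

def fitness(bits):
    return sum(bits)                          # OneMax

def mutate(bits):
    return [b ^ (random.random() < 1.0 / len(bits)) for b in bits]

def migrate(i):
    """Send island i's individual to its ring neighbour; keep the better one."""
    j = (i + 1) % N_ISLANDS
    if fitness(islands[i]) > fitness(islands[j]):
        islands[j] = list(islands[i])

islands = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(N_ISLANDS)]
interval = [1] * N_ISLANDS                    # per-island migration interval
clock = [0] * N_ISLANDS                       # generations since last migration
migrants = 0

for gen in range(5000):
    for i in range(N_ISLANDS):
        child = mutate(islands[i])
        improved = fitness(child) > fitness(islands[i])
        if improved:
            islands[i] = child
        clock[i] += 1
        if improved:
            # Improvement: spread it quickly and shrink the interval
            interval[i] = max(1, interval[i] // 2)
            migrate(i); migrants += 1; clock[i] = 0
        elif clock[i] >= interval[i]:
            # No improvement: migrate on schedule, then back off
            interval[i] = min(2 * interval[i], MAX_INTERVAL)
            migrate(i); migrants += 1; clock[i] = 0
    if max(map(fitness, islands)) == N_BITS:
        break

print(f"best fitness {max(map(fitness, islands))}/{N_BITS} after {gen + 1} "
      f"generations, {migrants} migrants sent, intervals {interval}")
```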

  15. Determination of detection equipment dead time

    International Nuclear Information System (INIS)

    Sacha, J.

    1980-01-01

    A method is described for determining dead time from measurements of a short-lived source. It is based on measuring the sample count rates in different time intervals, during which only the dead time correction changes with the changing rate of recorded pulses. The dead time may be determined from the measured values by a numerical-graphical method. The advantage of the method is the minimization of errors and inaccuracies; the disadvantage is that the half-life of the source used must be known very accurately. (J.P.)
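
    The abstract does not spell out the numerical-graphical procedure, but the standard ingredients can be sketched: a non-paralyzable dead-time model m = n / (1 + n·τ) applied to a source decaying with a known half-life, with τ recovered by least squares from count rates measured in successive time intervals. All numerical values below are assumed for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(11)

T_HALF = 600.0      # half-life of the short-lived source, s (assumed known)
TAU_TRUE = 5e-6     # detector dead time to be recovered, s
N0_TRUE = 30_000.0  # true initial count rate, counts/s

def measured_rate(t, n0, tau):
    """Non-paralyzable dead-time model m = n / (1 + n*tau), n = n0*2^(-t/T)."""
    n = n0 * 2.0 ** (-t / T_HALF)
    return n / (1.0 + n * tau)

# Simulated count rates in successive time intervals, with mild noise
t = np.arange(0.0, 3600.0, 120.0)
m_obs = measured_rate(t, N0_TRUE, TAU_TRUE) * (1 + 0.002 * rng.standard_normal(t.size))

(n0_fit, tau_fit), _ = curve_fit(measured_rate, t, m_obs, p0=(2e4, 1e-6))
print(f"recovered dead time: {tau_fit * 1e6:.2f} microseconds "
      f"(true {TAU_TRUE * 1e6:.1f})")
```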

  16. Synchronization of Hierarchical Time-Varying Neural Networks Based on Asynchronous and Intermittent Sampled-Data Control.

    Science.gov (United States)

    Xiong, Wenjun; Patel, Ragini; Cao, Jinde; Zheng, Wei Xing

    In this brief, our purpose is to apply asynchronous and intermittent sampled-data control methods to achieve the synchronization of hierarchical time-varying neural networks. The asynchronous and intermittent sampled-data controllers are proposed for two reasons: 1) the controllers may not transmit the control information simultaneously and 2) the controllers cannot always exist at any time. The synchronization is then discussed for a kind of hierarchical time-varying neural networks based on the asynchronous and intermittent sampled-data controllers. Finally, the simulation results are given to illustrate the usefulness of the developed criteria.

  17. Analysis of Low Frequency Oscillation Using the Multi-Interval Parameter Estimation Method on a Rolling Blackout in the KEPCO System

    Directory of Open Access Journals (Sweden)

    Kwan-Shik Shim

    2017-04-01

    Full Text Available This paper describes a multiple time interval (“multi-interval” parameter estimation method. The multi-interval parameter estimation method estimates a parameter from a new multi-interval prediction error polynomial that can simultaneously consider multiple time intervals. The root of the multi-interval prediction error polynomial includes the effect on each time interval, and the important mode can be estimated by solving one polynomial for multiple time intervals or signals. The algorithm of the multi-interval parameter estimation method proposed in this paper is applied to the test function and the data measured from a PMU (phasor measurement unit installed in the KEPCO (Korea Electric Power Corporation system. The results confirm that the proposed multi-interval parameter estimation method accurately and reliably estimates important parameters.

  18. Growth Estimators and Confidence Intervals for the Mean of Negative Binomial Random Variables with Unknown Dispersion

    Directory of Open Access Journals (Sweden)

    David Shilane

    2013-01-01

    Full Text Available The negative binomial distribution becomes highly skewed under extreme dispersion. Even at moderately large sample sizes, the sample mean exhibits a heavy right tail. The standard normal approximation often does not provide adequate inferences about the data's expected value in this setting. In previous work, we have examined alternative methods of generating confidence intervals for the expected value. These methods were based upon Gamma and Chi Square approximations or tail probability bounds such as Bernstein's inequality. We now propose growth estimators of the negative binomial mean. Under high dispersion, zero values are likely to be overrepresented in the data. A growth estimator constructs a normal-style confidence interval by effectively removing a small, predetermined number of zeros from the data. We propose growth estimators based upon multiplicative adjustments of the sample mean and direct removal of zeros from the sample. These methods do not require estimating the nuisance dispersion parameter. We will demonstrate that the growth estimators' confidence intervals provide improved coverage over a wide range of parameter values and asymptotically converge to the sample mean. Interestingly, the proposed methods succeed despite adding both bias and variance to the normal approximation.
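
    A crude reading of the zero-removal idea, for illustration only: alongside the usual normal-approximation CI, a "growth-style" CI is formed after dropping up to a fixed number of zeros (equivalent to a multiplicative adjustment of the sample mean by n/(n-c)). This is not the authors' estimator, and the printed coverages describe only this toy simulation under the assumed parameters.

```python
import numpy as np

rng = np.random.default_rng(12)

# Highly dispersed negative binomial: small dispersion ("size") parameter.
size_param, mu = 0.1, 2.0
p = size_param / (size_param + mu)     # numpy's (n, p) parameterisation
n_obs, z, c = 30, 1.96, 2              # sample size, z value, zeros to drop

def normal_ci(x):
    half = z * x.std(ddof=1) / np.sqrt(x.size)
    return x.mean() - half, x.mean() + half

def growth_style_ci(x):
    """Normal-style CI after dropping up to c zeros (illustrative variant)."""
    drop = min(c, int(np.sum(x == 0)))
    xa = np.sort(x)[drop:]             # removes only zero values
    half = z * xa.std(ddof=1) / np.sqrt(xa.size)
    return xa.mean() - half, xa.mean() + half

cover_normal = cover_growth = 0
n_rep = 5000
for _ in range(n_rep):
    x = rng.negative_binomial(size_param, p, n_obs).astype(float)
    lo, hi = normal_ci(x)
    cover_normal += lo <= mu <= hi
    lo, hi = growth_style_ci(x)
    cover_growth += lo <= mu <= hi

print(f"coverage at nominal 95%: normal {cover_normal / n_rep:.3f}, "
      f"growth-style {cover_growth / n_rep:.3f}")
```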

  19. Studying DDT Susceptibility at Discriminating Time Intervals Focusing on Maximum Limit of Exposure Time Survived by DDT Resistant Phlebotomus argentipes (Diptera: Psychodidae): an Investigative Report.

    Science.gov (United States)

    Rama, Aarti; Kesari, Shreekant; Das, Pradeep; Kumar, Vijay

    2017-07-24

    Extensive application of the routine insecticide dichlorodiphenyltrichloroethane (DDT) to control Phlebotomus argentipes (Diptera: Psychodidae), the proven vector of visceral leishmaniasis in India, has evoked the problem of resistance/tolerance against DDT, eventually nullifying DDT-dependent strategies to control this vector. Because tolerating an hour-long exposure to DDT is not challenging enough for resistant P. argentipes, estimating susceptibility by exposing sand flies to the insecticide for just an hour becomes a trivial and futile task. Therefore, this bioassay study was carried out to investigate the maximum exposure time that DDT-resistant P. argentipes can endure and still survive. The mortality rate of a laboratory-reared DDT-resistant strain of P. argentipes exposed to DDT was studied at discriminating time intervals of 60 min, and it was concluded that highly resistant sand flies could withstand up to 420 min of exposure to this insecticide. Additionally, the lethal time for female P. argentipes was observed to be higher than for males, suggesting that they are highly resistant to DDT's toxicity. Our results support monitoring of the tolerance limit with respect to time and hence point towards an urgent need to change the World Health Organization's protocol for susceptibility identification in resistant P. argentipes.

  20. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    Science.gov (United States)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
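
    For context, the sketch below shows a minimal standard multiple time step (r-RESPA) integrator for a 1D toy system with a stiff "fast" force and a weak "slow" force. It is not the isokinetic Nosé-Hoover, resonance-free scheme described in the paper, and all parameters are arbitrary illustration values; it only shows the nested inner/outer time step structure that such schemes build on.

```python
# Toy 1D system with a stiff "fast" harmonic force and a weak "slow"
# anharmonic force, integrated with a standard r-RESPA multiple time step
# scheme (velocity Verlet nested inside velocity Verlet).
m = 1.0
k_fast = 100.0

def f_fast(x):
    return -k_fast * x           # stiff harmonic force (fast motion)

def f_slow(x):
    return -0.1 * x ** 3         # weak anharmonic force (slow motion)

def energy(x, v):
    return 0.5 * m * v ** 2 + 0.5 * k_fast * x ** 2 + 0.025 * x ** 4

def respa_step(x, v, dt_outer, n_inner):
    """Slow half kick, n_inner fast velocity-Verlet sub-steps, slow half kick."""
    dt = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / m
    for _ in range(n_inner):
        v += 0.5 * dt * f_fast(x) / m
        x += dt * v
        v += 0.5 * dt * f_fast(x) / m
    v += 0.5 * dt_outer * f_slow(x) / m
    return x, v

x, v = 1.0, 0.0
e0 = energy(x, v)
for _ in range(10_000):
    x, v = respa_step(x, v, dt_outer=0.05, n_inner=10)

print(f"relative energy drift after 10,000 outer steps: "
      f"{abs(energy(x, v) - e0) / e0:.2e}")
```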