WorldWideScience

Sample records for regular time intervals

  1. Interval matrices: Regularity generates singularity

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří; Shary, S.P.

    2018-01-01

    Roč. 540, 1 March (2018), s. 149-159 ISSN 0024-3795 Institutional support: RVO:67985807 Keywords: interval matrix * regularity * singularity * P-matrix * absolute value equation * diagonally singularizable matrix Subject RIV: BA - General Mathematics Impact factor: 0.973, year: 2016

  2. Timing intervals using population synchrony and spike timing dependent plasticity

    Directory of Open Access Journals (Sweden)

    Wei Xu

    2016-12-01

    Full Text Available We present a computational model by which ensembles of regularly spiking neurons can encode different time intervals through synchronous firing. We show that a neuron responding to a large population of convergent inputs has the potential to learn to produce an appropriately-timed output via spike-time dependent plasticity. We explain why temporal variability of this population synchrony increases with increasing time intervals. We also show that the scalar property of timing and its violation at short intervals can be explained by the spike-wise accumulation of jitter in the inter-spike intervals of timing neurons. We explore how the challenge of encoding longer time intervals can be overcome and conclude that this may involve a switch to a different population of neurons with lower firing rate, with the added effect of producing an earlier bias in response. Experimental data on human timing performance show features in agreement with the model’s output.

  3. A Characterization of Strong Regularity of Interval Matrices

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2010-01-01

    Roč. 20, - (2010), s. 717-722 E-ISSN 1081-3810 R&D Projects: GA ČR GA201/09/1957; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: interval matrix * strong regularity * spectral radius * matrix inequality * solvability Subject RIV: BA - General Mathematics Impact factor: 0.808, year: 2010 http://www.math.technion.ac.il/iic/ela/ela-articles/articles/vol20_pp717-722.pdf
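
    The record's keywords point to the spectral-radius condition for strong regularity: an interval matrix [A_c - Δ, A_c + Δ] with nonsingular midpoint A_c is strongly regular (and hence regular) when the spectral radius of |A_c^{-1}|Δ is below one. The sketch below checks that condition numerically on an illustrative 2x2 example; it is a minimal reading of the spectral-radius criterion, not the paper's full characterization.

```python
import numpy as np

def is_strongly_regular(A_center, A_delta):
    """Check the spectral-radius condition rho(|A_c^{-1}| * Delta) < 1.

    A_center : midpoint matrix A_c of the interval matrix [A_c - Delta, A_c + Delta]
    A_delta  : nonnegative radius matrix Delta
    Returns True when the condition holds (which implies regularity of the
    whole interval matrix), False otherwise.
    """
    inv_center = np.linalg.inv(A_center)            # requires a nonsingular midpoint
    M = np.abs(inv_center) @ A_delta                # nonnegative comparison matrix
    spectral_radius = np.max(np.abs(np.linalg.eigvals(M)))
    return spectral_radius < 1.0

# Illustrative interval matrix with a well-conditioned midpoint
A_c = np.array([[4.0, 1.0], [1.0, 3.0]])
Delta = np.array([[0.2, 0.1], [0.1, 0.2]])
print(is_strongly_regular(A_c, Delta))  # True for this modest radius
```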

  4. Entrained rhythmic activities of neuronal ensembles as perceptual memory of time interval.

    Science.gov (United States)

    Sumbre, Germán; Muto, Akira; Baier, Herwig; Poo, Mu-ming

    2008-11-06

    The ability to process temporal information is fundamental to sensory perception, cognitive processing and motor behaviour of all living organisms, from amoebae to humans. Neural circuit mechanisms based on neuronal and synaptic properties have been shown to process temporal information over the range of tens of microseconds to hundreds of milliseconds. How neural circuits process temporal information in the range of seconds to minutes is much less understood. Studies of working memory in monkeys and rats have shown that neurons in the prefrontal cortex, the parietal cortex and the thalamus exhibit ramping activities that linearly correlate with the lapse of time until the end of a specific time interval of several seconds that the animal is trained to memorize. Many organisms can also memorize the time interval of rhythmic sensory stimuli in the timescale of seconds and can coordinate motor behaviour accordingly, for example, by keeping the rhythm after exposure to the beat of music. Here we report a form of rhythmic activity among specific neuronal ensembles in the zebrafish optic tectum, which retains the memory of the time interval (in the order of seconds) of repetitive sensory stimuli for a duration of up to approximately 20 s. After repetitive visual conditioning stimulation (CS) of zebrafish larvae, we observed rhythmic post-CS activities among specific tectal neuronal ensembles, with a regular interval that closely matched the CS. Visuomotor behaviour of the zebrafish larvae also showed regular post-CS repetitions at the entrained time interval that correlated with rhythmic neuronal ensemble activities in the tectum. Thus, rhythmic activities among specific neuronal ensembles may act as an adjustable 'metronome' for time intervals in the order of seconds, and serve as a mechanism for the short-term perceptual memory of rhythmic sensory experience.

  5. Forty Necessary and Sufficient Conditions for Regularity of Interval Matrices: A survey

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2009-01-01

    Roč. 18, - (2009), s. 500-512 E-ISSN 1081-3810 R&D Projects: GA ČR GA201/09/1957; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: interval matrix * regularity * singularity * necessary and sufficient condition * algorithm Subject RIV: BA - General Mathematics Impact factor: 0.892, year: 2009 http://www.math.technion.ac.il/iic/ela/ela-articles/articles/vol18_pp500-512.pdf

  6. Time-Homogeneous Parabolic Wick-Anderson Model in One Space Dimension: Regularity of Solution

    OpenAIRE

    Kim, Hyun-Jung; Lototsky, Sergey V

    2017-01-01

    Even though the heat equation with random potential is a well-studied object, the particular case of time-independent Gaussian white noise in one space dimension has yet to receive the attention it deserves. The paper investigates the stochastic heat equation with space-only Gaussian white noise on a bounded interval. The main result is that the space-time regularity of the solution is the same for additive noise and for multiplicative noise in the Wick-Itô-Skorokhod interpretation.

  7. Temporal regularity of the environment drives time perception

    OpenAIRE

    van Rijn, H; Rhodes, D; Di Luca, M

    2016-01-01

    It’s reasonable to assume that a regularly paced sequence should be perceived as regular, but here we show that perceived regularity depends on the context in which the sequence is embedded. We presented one group of participants with perceptually regularly paced sequences, and another group of participants with mostly irregularly paced sequences (75% irregular, 25% regular). The timing of the final stimulus in each sequence could be varied. In one experiment, we asked whether the last stim...

  8. Regular transport dynamics produce chaotic travel times.

    Science.gov (United States)

    Villalobos, Jorge; Muñoz, Víctor; Rogan, José; Zarama, Roberto; Johnson, Neil F; Toledo, Benjamín; Valdivia, Juan Alejandro

    2014-06-01

    In the hope of making passenger travel times shorter and more reliable, many cities are introducing dedicated bus lanes (e.g., Bogota, London, Miami). Here we show that chaotic travel times are actually a natural consequence of individual bus function, and hence of public transport systems more generally, i.e., chaotic dynamics emerge even when the route is empty and straight, stops and lights are equidistant and regular, and loading times are negligible. More generally, our findings provide a novel example of chaotic dynamics emerging from a single object following Newton's laws of motion in a regularized one-dimensional system.

  9. Online evolution reconstruction from a single measurement record with random time intervals for quantum communication

    Science.gov (United States)

    Zhou, Hua; Su, Yang; Wang, Rong; Zhu, Yong; Shen, Huiping; Pu, Tao; Wu, Chuanxin; Zhao, Jiyong; Zhang, Baofu; Xu, Zhiyong

    2017-10-01

    Online reconstruction of a time-variant quantum state from the encoding/decoding results of quantum communication is addressed by developing a method of evolution reconstruction from a single measurement record with random time intervals. A time-variant two-dimensional state is reconstructed on the basis of recovering its expectation value functions of three nonorthogonal projectors from a random single measurement record, which is composed of the discarded qubits of the six-state protocol. The simulation results show that our method is robust to typical metro quantum channels. Our work extends the Fourier-based method of evolution reconstruction from the version for a regular single measurement record with equal time intervals to a unified one, which can be applied to arbitrary single measurement records. The proposed protocol of evolution reconstruction runs concurrently with that of quantum communication, which facilitates online quantum tomography.
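
    As an illustration of the general idea — recovering a smoothly varying expectation value from a binary measurement record taken at random times — the hedged sketch below fits a truncated Fourier series to simulated 0/1 outcomes by least squares evaluated at the actual (irregular) measurement times. The signal, frequency grid and sample sizes are made up for illustration; this is not the paper's six-state protocol.

```python
import numpy as np

rng = np.random.default_rng(4)

# A time-variant expectation value of one projector, to be reconstructed
def p_true(t):
    return 0.5 + 0.3 * np.sin(2 * np.pi * 0.8 * t) + 0.1 * np.cos(2 * np.pi * 2.1 * t)

# Single measurement record: random measurement times, one binary outcome each
t = np.sort(rng.uniform(0.0, 10.0, 4_000))
outcomes = (rng.random(t.size) < p_true(t)).astype(float)

# Least-squares fit of a truncated Fourier series evaluated at the actual
# (irregular) measurement times; no resampling onto a regular grid is needed.
freqs = np.arange(0.1, 3.0, 0.1)

def fourier_design(times):
    return np.column_stack([np.ones_like(times)]
                           + [np.cos(2 * np.pi * f * times) for f in freqs]
                           + [np.sin(2 * np.pi * f * times) for f in freqs])

coef, *_ = np.linalg.lstsq(fourier_design(t), outcomes, rcond=None)

t_grid = np.linspace(0, 10, 5)
recon = fourier_design(t_grid) @ coef
print(np.round(recon, 2))          # reconstructed expectation values
print(np.round(p_true(t_grid), 2)) # ground truth for comparison
```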

  10. Intact interval timing in circadian CLOCK mutants.

    Science.gov (United States)

    Cordes, Sara; Gallistel, C R

    2008-08-28

    While progress has been made in determining the molecular basis for the circadian clock, the mechanism by which mammalian brains time intervals measured in seconds to minutes remains a mystery. An obvious question is whether the interval-timing mechanism shares molecular machinery with the circadian timing mechanism. In the current study, we trained circadian CLOCK +/- and -/- mutant male mice in a peak-interval procedure with 10 and 20-s criteria. The mutant mice were more active than their wild-type littermates, but there were no reliable deficits in the accuracy or precision of their timing as compared with wild-type littermates. This suggests that expression of the CLOCK protein is not necessary for normal interval timing.

  11. Department of Defense Precise Time and Time Interval program improvement plan

    Science.gov (United States)

    Bowser, J. R.

    1981-01-01

    The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users are presented.

  12. Time-to-code converter with selection of time intervals on duration

    International Nuclear Information System (INIS)

    Atanasov, I.Kh.; Rusanov, I.R.

    2001-01-01

    Identification of elementary particles on the basis of time-of-flight is an important approach in the preliminary selection procedure. The paper describes a time-to-code converter with preliminary selection of the measured time intervals by duration. It consists of a time-to-amplitude converter, an analog-to-digital converter, a unit for selecting time intervals by duration, a total-reset unit and a CAMAC command decoder. The converter makes it possible to measure time intervals with 100 ns accuracy within the 0-100 ns range. The output code capacity is 10 bits. The selection time is 50 ns.

  13. Continuous time modelling with individually varying time intervals for oscillating and non-oscillating processes.

    Science.gov (United States)

    Voelkle, Manuel C; Oud, Johan H L

    2013-02-01

    When designing longitudinal studies, researchers often aim at equal intervals. In practice, however, this goal is hardly ever met, with different time intervals between assessment waves and different time intervals between individuals being more the rule than the exception. One of the reasons for the introduction of continuous time models by means of structural equation modelling has been to deal with irregularly spaced assessment waves (e.g., Oud & Delsing, 2010). In the present paper we extend the approach to individually varying time intervals for oscillating and non-oscillating processes. In addition, we show not only that equal intervals are unnecessary but also that it can be advantageous to use unequal sampling intervals, in particular when the sampling rate is low. Two examples are provided to support our arguments. In the first example we compare a continuous time model of a bivariate coupled process with varying time intervals to a standard discrete time model to illustrate the importance of accounting for the exact time intervals. In the second example the effect of different sampling intervals on estimating a damped linear oscillator is investigated by means of a Monte Carlo simulation. We conclude that it is important to account for individually varying time intervals, and encourage researchers to conceive of longitudinal studies with different time intervals within and between individuals as an opportunity rather than a problem. © 2012 The British Psychological Society.
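
    The key technical point — that a continuous-time drift matrix yields an exact discrete-time autoregressive matrix for any interval length via the matrix exponential — can be sketched as follows; the drift matrix and intervals are illustrative, not taken from the paper's examples.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical continuous-time drift matrix for a bivariate coupled process
A = np.array([[-0.3,  0.1],
              [ 0.2, -0.5]])

# Individually varying time intervals between assessment waves (e.g., in months)
intervals = [1.0, 2.5, 0.8, 4.0]

# Each interval gets its own discrete-time autoregressive matrix exp(A * dt),
# so unequal spacing is handled exactly rather than approximated.
for dt in intervals:
    Phi = expm(A * dt)
    print(f"dt = {dt:>4}:\n{np.round(Phi, 3)}")
```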

  14. Identifying factors associated with regular physical activity in leisure time among Canadian adolescents.

    Science.gov (United States)

    Godin, Gaston; Anderson, Donna; Lambert, Léo-Daniel; Desharnais, Raymond

    2005-01-01

    The purpose of this study was to identify the factors explaining regular physical activity among Canadian adolescents. A cohort study was conducted over a period of 2 years in a French-language high school located near Québec City. A cohort of 740 students (352 girls; 388 boys) aged 13.3 +/- 1.0 years at baseline took part. Psychosocial, life context, profile, and sociodemographic variables were assessed at baseline and 1 and 2 years after baseline. Exercising almost every day during leisure time at each measurement time was the dependent variable. The Generalized Estimating Equations (GEE) analysis indicated that exercising almost every day was significantly associated with a high intention to exercise (odds ratio [OR]: 8.33, confidence interval [CI] 95%: 5.26, 13.18), being satisfied with the activity practiced (OR: 2.07, CI 95%: 1.27, 3.38), perceived descriptive norm (OR: 1.82, CI 95%: 1.41, 2.35), being a boy (OR: 1.83, CI 95%: 1.37, 2.46), practicing "competitive" activities (OR: 1.80, CI 95%: 1.37, 2.36), eating a healthy breakfast (OR: 1.68, CI 95%: 1.09, 2.60), and normative beliefs (OR: 1.48, CI 95%: 1.14, 1.90). Specific GEE analysis by gender indicated slight but significant differences. This study provides evidence for the need to design interventions that are gender specific and that focus on increasing intention to exercise regularly.

  15. Delay-Dependent Guaranteed Cost Control of an Interval System with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Xiao Min

    2009-01-01

    Full Text Available This paper concerns the problem of delay-dependent robust stability and guaranteed cost control for an interval system with time-varying delay. The interval system is represented with matrix factorization, which leads to less conservative conclusions than solving a square root. The time-varying delay is assumed to belong to an interval and the derivative of the interval time-varying delay is not restricted, which allows a fast time-varying delay and broad applicability. Based on the Lyapunov-Krasovskii approach, a delay-dependent criterion for the existence of a state feedback controller, which guarantees the closed-loop system stability, the upper bound of the cost function, and the disturbance attenuation level for all admissible uncertainties as well as external perturbations, is proposed in terms of linear matrix inequalities (LMIs). The criterion is derived by free weighting matrices that can reduce the conservatism. The effectiveness has been verified in a numerical example and the computed results are presented to validate the proposed design method.
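
    The paper's criteria are full delay-dependent LMIs; as a much-simplified illustration of the Lyapunov machinery such criteria build on, the sketch below certifies stability of a delay-free system by finding a symmetric P > 0 satisfying A^T P + P A < 0, here obtained by solving a Lyapunov equation rather than by an LMI solver. The system matrix is an arbitrary stable example, not one from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Much-simplified, delay-free illustration of the Lyapunov machinery behind the
# paper's delay-dependent LMI criteria: certify stability of x' = A x by finding
# a symmetric P > 0 with A^T P + P A = -Q < 0.
A = np.array([[-2.0, 1.0],
              [ 0.5, -1.5]])
Q = np.eye(2)

P = solve_continuous_lyapunov(A.T, -Q)        # solves A^T P + P A = -Q
P = 0.5 * (P + P.T)                           # symmetrize against round-off
eigenvalues = np.linalg.eigvalsh(P)
print("P =\n", np.round(P, 3))
print("P positive definite -> stability certified:", bool(np.all(eigenvalues > 0)))
```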

  16. Randomness control of vehicular motion through a sequence of traffic signals at irregular intervals

    International Nuclear Information System (INIS)

    Nagatani, Takashi

    2010-01-01

    We study the regularization of the irregular motion of a vehicle moving through a sequence of traffic signals with a disordered configuration. Each traffic signal is controlled by both a cycle time and a phase shift. The cycle time is the same for all signals, while the phase shift varies from signal to signal, synchronized with the intervals between successive signals. The nonlinear dynamics of the vehicular motion are described by a stochastic nonlinear map. The vehicle exhibits very complex behavior as both the cycle time and the strength of the interval irregularity are varied. The irregular motion induced by the disordered configuration is regularized by adjusting the phase shift within the regularization regions.
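
    A toy version of such a model can be simulated directly: signals at irregular positions share a common cycle time, each has its own phase shift, and the vehicle cruises at constant speed and waits whenever it arrives on red. The sketch below is an illustrative caricature with made-up parameters, not Nagatani's exact stochastic map; it only shows how a phase shift synchronized with the signal intervals can remove the waiting induced by the disorder.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 60.0                 # common cycle time for all signals (s)
v = 10.0                 # cruising speed (m/s)
n_signals = 50
# Irregular spacing between consecutive signals (m); the spread sets the disorder strength
gaps = 400.0 + rng.uniform(-150.0, 150.0, n_signals)
positions = np.cumsum(gaps)
# Phase shift synchronized with the free-flow arrival time at each signal (one simple choice)
phases = (positions / v) % T

def travel_time(phases):
    """Arrival time at the last signal for a vehicle starting at x = 0, t = 0."""
    t, x = 0.0, 0.0
    for pos, phi in zip(positions, phases):
        t += (pos - x) / v                      # cruise to the signal
        x = pos
        if ((t - phi) % T) >= T / 2:            # signal is red during the second half-cycle
            t += T - ((t - phi) % T)            # wait until the next green
    return t

print(f"synchronized phase shifts: {travel_time(phases):.1f} s")
print(f"zero phase shifts:         {travel_time(np.zeros(n_signals)):.1f} s")
```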

  17. Reviewing interval cancers: Time well spent?

    International Nuclear Information System (INIS)

    Gower-Thomas, Kate; Fielder, Hilary M.P.; Branston, Lucy; Greening, Sarah; Beer, Helen; Rogers, Cerilan

    2002-01-01

    OBJECTIVES: To categorize interval cancers, and thus identify false-negatives, following prevalent and incident screens in the Welsh breast screening programme. SETTING: Breast Test Wales (BTW) Llandudno, Cardiff and Swansea breast screening units. METHODS: Five hundred and sixty interval breast cancers identified following negative mammographic screening between 1989 and 1997 were reviewed by eight screening radiologists. The blind review was achieved by mixing the screening films of women who subsequently developed an interval cancer with screen-negative films of women who did not develop cancer, in a ratio of 4:1. Another radiologist used patients' symptomatic films to record a reference against which the reviewers' reports of the screening films were compared. Interval cancers were categorized as 'true', 'occult', 'false-negative' or 'unclassified' interval cancers or interval cancers with minimal signs, based on the National Health Service breast screening programme (NHSBSP) guidelines. RESULTS: Of the classifiable interval films, 32% were false-negatives, 55% were true intervals and 12% occult. The proportion of false-negatives following incident screens was half that following prevalent screens (P = 0.004). Forty percent of the seed films were recalled by the panel. CONCLUSIONS: Low false-negative interval cancer rates following incident screens (18%) versus prevalent screens (36%) suggest that lower cancer detection rates at incident screens may have resulted from fewer cancers than expected being present, rather than from a failure to detect tumours. The panel method for categorizing interval cancers has significant flaws, as the results vary markedly with different protocols and it is no more accurate than other, quicker and more timely methods.

  18. Is high-intensity interval training a time-efficient exercise strategy to improve health and fitness?

    Science.gov (United States)

    Gillen, Jenna B; Gibala, Martin J

    2014-03-01

    Growing research suggests that high-intensity interval training (HIIT) is a time-efficient exercise strategy to improve cardiorespiratory and metabolic health. "All out" HIIT models such as Wingate-type exercise are particularly effective, but this type of training may not be safe, tolerable or practical for many individuals. Recent studies, however, have revealed the potential for other models of HIIT, which may be more feasible but are still time-efficient, to stimulate adaptations similar to more demanding low-volume HIIT models and high-volume endurance-type training. As little as 3 HIIT sessions per week, involving ≤10 min of intense exercise within a time commitment of ≤30 min per session, including warm-up, recovery between intervals and cool down, has been shown to improve aerobic capacity, skeletal muscle oxidative capacity, exercise tolerance and markers of disease risk after only a few weeks in both healthy individuals and people with cardiometabolic disorders. Additional research is warranted, as studies conducted have been relatively short-term, with a limited number of measurements performed on small groups of subjects. However, given that "lack of time" remains one of the most commonly cited barriers to regular exercise participation, low-volume HIIT is a time-efficient exercise strategy that warrants consideration by health practitioners and fitness professionals.

  19. Recall intervals and time used for examination and prevention by dentists in child dental care in Denmark, Iceland, Norway and Sweden in 1996 and 2014

    DEFF Research Database (Denmark)

    Wang, N J; Petersen, P E; Sveinsdóttir, E G

    2018-01-01

    OBJECTIVE: The purpose of the present study was to explore intervals between regular dental examination and the time dentists spent for examination and preventive dental care of children in 1996 and 2014. PARTICIPANTS AND METHODS: In Denmark, Norway and Sweden, random samples of dentists working...... examinations in three of the four countries in 2014 than in 1996. CONCLUSIONS: This study of trends in dental care delivered by dentists during recent decades showed moves towards extended recall intervals and preventive care individualized according to caries risk. In addition, extending intervals could...... dentists used ample time delivering preventive care to children. Dentists reported spending significantly more time providing preventive care for caries risk children than for other children both in 1996 and 2014. Concurrent with extended intervals, dentists reported spending longer performing routine...

  20. Continuous-time interval model identification of blood glucose dynamics for type 1 diabetes

    Science.gov (United States)

    Kirchsteiger, Harald; Johansson, Rolf; Renard, Eric; del Re, Luigi

    2014-07-01

    While good physiological models of the glucose metabolism in type 1 diabetic patients are well known, their parameterisation is difficult. The high intra-patient variability observed is a further major obstacle. This holds for data-based models too, so that no good patient-specific models are available. Against this background, this paper proposes the use of interval models to cover the different metabolic conditions. The control-oriented models contain a carbohydrate and insulin sensitivity factor to be used for insulin bolus calculators directly. Available clinical measurements were sampled on an irregular schedule which prompts the use of continuous-time identification, also for the direct estimation of the clinically interpretable factors mentioned above. An identification method is derived and applied to real data from 28 diabetic patients. Model estimation was done on a clinical data-set, whereas validation results shown were done on an out-of-clinic, everyday life data-set. The results show that the interval model approach allows a much more regular estimation of the parameters and avoids physiologically incompatible parameter estimates.

  1. Interval timing in genetically modified mice: a simple paradigm

    OpenAIRE

    Balci, F.; Papachristos, E. B.; Gallistel, C. R.; Brunner, D.; Gibson, J.; Shumyatsky, G. P.

    2007-01-01

    We describe a behavioral screen for the quantitative study of interval timing and interval memory in mice. Mice learn to switch from a short-latency feeding station to a long-latency station when the short latency has passed without a feeding. The psychometric function is the cumulative distribution of switch latencies. Its median measures timing accuracy and its interquartile interval measures timing precision. Next, using this behavioral paradigm, we have examined mice with a gene knockout ...

  2. A model of interval timing by neural integration.

    Science.gov (United States)

    Simen, Patrick; Balci, Fuat; de Souza, Laura; Cohen, Jonathan D; Holmes, Philip

    2011-06-22

    We show that simple assumptions about neural processing lead to a model of interval timing as a temporal integration process, in which a noisy firing-rate representation of time rises linearly on average toward a response threshold over the course of an interval. Our assumptions include: that neural spike trains are approximately independent Poisson processes, that correlations among them can be largely cancelled by balancing excitation and inhibition, that neural populations can act as integrators, and that the objective of timed behavior is maximal accuracy and minimal variance. The model accounts for a variety of physiological and behavioral findings in rodents, monkeys, and humans, including ramping firing rates between the onset of reward-predicting cues and the receipt of delayed rewards, and universally scale-invariant response time distributions in interval timing tasks. It furthermore makes specific, well-supported predictions about the skewness of these distributions, a feature of timing data that is usually ignored. The model also incorporates a rapid (potentially one-shot) duration-learning procedure. Human behavioral data support the learning rule's predictions regarding learning speed in sequences of timed responses. These results suggest that simple, integration-based models should play as prominent a role in interval timing theory as they do in theories of perceptual decision making, and that a common neural mechanism may underlie both types of behavior.
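
    One simplified way to see how a noisy ramp-to-threshold process yields scale-invariant timing is to let the ramp slope vary from trial to trial in proportion to its mean; the coefficient of variation of the resulting response times is then roughly constant across target durations. The sketch below illustrates this caricature only; it omits the Poisson spiking and balanced-inhibition components of the full model, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def timed_responses(target_s, n_trials=10_000, drift_cv=0.15):
    """Simplified ramp-to-threshold timer.

    On each trial the firing-rate ramp has slope 1/target_s perturbed by
    trial-to-trial noise proportional to its mean; the response is emitted
    when the ramp reaches a fixed threshold of 1.
    """
    mean_slope = 1.0 / target_s
    slopes = rng.normal(mean_slope, drift_cv * mean_slope, n_trials)
    slopes = slopes[slopes > 0]          # discard the rare non-positive slopes
    return 1.0 / slopes                  # first-passage (response) times

for target in (2.0, 8.0, 32.0):
    rt = timed_responses(target)
    print(f"target {target:>5.1f} s: mean {rt.mean():6.2f} s, CV {rt.std() / rt.mean():.3f}")
```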

  3. Time interval approach to the pulsed neutron logging method

    International Nuclear Information System (INIS)

    Zhao Jingwu; Su Weining

    1994-01-01

    The time interval between neighbouring neutrons emitted from a steady-state neutron source can be treated as that from a time-dependent neutron source. In the rock space, the neutron flux is given by the neutron diffusion equation and is composed of an infinite number of terms, each consisting of two die-away curves. The delay action is discussed and used to measure the time interval with only one detector in the experiment. Nuclear reactions with time distributions due to the different types of radiation observed in neutron well-logging methods are presented with a view to obtaining the rock nuclear parameters from the time interval technique.

  4. Delay-Dependent Guaranteed Cost H∞ Control of an Interval System with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Zhongke Shi

    2009-01-01

    Full Text Available This paper concerns the problem of delay-dependent robust stability and guaranteed cost H∞ control for an interval system with time-varying delay. The interval system is represented with matrix factorization, which leads to less conservative conclusions than solving a square root. The time-varying delay is assumed to belong to an interval and the derivative of the interval time-varying delay is not restricted, which allows a fast time-varying delay and broad applicability. Based on the Lyapunov-Krasovskii approach, a delay-dependent criterion for the existence of a state feedback controller, which guarantees the closed-loop system stability, the upper bound of the cost function, and the disturbance attenuation level for all admissible uncertainties as well as external perturbations, is proposed in terms of linear matrix inequalities (LMIs). The criterion is derived by free weighting matrices that can reduce the conservatism. The effectiveness has been verified in a numerical example and the computed results are presented to validate the proposed design method.

  5. Discrete-time optimal control and games on large intervals

    CERN Document Server

    Zaslavski, Alexander J

    2017-01-01

    Devoted to the structure of approximate solutions of discrete-time optimal control problems and approximate solutions of dynamic discrete-time two-player zero-sum games, this book presents results on properties of approximate solutions that are independent of the length of the interval, for all sufficiently large intervals. Results concerning the so-called turnpike property of optimal control problems and zero-sum games in the regions close to the endpoints of the time intervals are the main focus of this book. The description of the structure of approximate solutions on sufficiently large intervals and its stability will interest graduate students and mathematicians in optimal control and game theory, engineering, and economics. This book begins with a brief overview and moves on to analyze the structure of approximate solutions of autonomous nonconcave discrete-time optimal control Lagrange problems. Next the structures of approximate solutions of autonomous discrete-time optimal control problems that are discret...

  6. Specifying real-time systems with interval logic

    Science.gov (United States)

    Rushby, John

    1988-01-01

    Pure temporal logic makes no reference to time. An interval temporal logic and an extension to that logic which includes real time constraints are described. The application of this logic by giving a specification for the well-known lift (elevator) example is demonstrated. It is shown how interval logic can be extended to include a notion of process. How the specification language and verification environment of EHDM could be enhanced to support this logic is described. A specification of the alternating bit protocol in this extended version of the specification language of EHDM is given.

  7. Unpacking a time interval lengthens its perceived temporal distance

    Directory of Open Access Journals (Sweden)

    Yang eLiu

    2014-11-01

    Full Text Available In quantity estimation, people often perceive that the whole is less than the sum of its parts. The current study investigated such an unpacking effect in temporal distance judgment. Our results showed that participants in the unpacked condition judged a given time interval to be longer than those in the packed condition, even though the time interval was kept constant across the two conditions. Furthermore, this unpacking effect persists regardless of the unpacking method employed. The results suggest that unpacking a time interval may be a good strategy for lengthening its perceived temporal distance.

  8. Traces of times past : Representations of temporal intervals in memory

    NARCIS (Netherlands)

    Taatgen, Niels; van Rijn, Hedderik

    2011-01-01

    Theories of time perception typically assume that some sort of memory represents time intervals. This memory component is typically underdeveloped in theories of time perception. Following earlier work that suggested that representations of different time intervals contaminate each other (Grondin,

  9. Learned Interval Time Facilitates Associate Memory Retrieval

    Science.gov (United States)

    van de Ven, Vincent; Kochs, Sarah; Smulders, Fren; De Weerd, Peter

    2017-01-01

    The extent to which time is represented in memory remains underinvestigated. We designed a time paired associate task (TPAT) in which participants implicitly learned cue-time-target associations between cue-target pairs and specific cue-target intervals. During subsequent memory testing, participants showed increased accuracy of identifying…

  10. Interval-Censored Time-to-Event Data Methods and Applications

    CERN Document Server

    Chen, Ding-Geng

    2012-01-01

    Interval-Censored Time-to-Event Data: Methods and Applications collects the most recent techniques, models, and computational tools for interval-censored time-to-event data. Top biostatisticians from academia, biopharmaceutical industries, and government agencies discuss how these advances are impacting clinical trials and biomedical research. Divided into three parts, the book begins with an overview of interval-censored data modeling, including nonparametric estimation, survival functions, regression analysis, multivariate data analysis, competing risks analysis, and other models for interva

  11. Early diastolic time intervals during hypertensive pregnancy.

    Science.gov (United States)

    Spinelli, L; Ferro, G; Nappi, C; Farace, M J; Talarico, G; Cinquegrana, G; Condorelli, M

    1987-10-01

    Early diastolic time intervals have been assessed by means of the echopolycardiographic method in 17 pregnant women who developed hypertension during pregnancy (HP) and in 14 normal pregnant women (N). Systolic time intervals (STI), stroke volume (SV), ejection fraction (EF), and mean velocity of myocardial fiber shortening (VCF) were also evaluated. Recordings were performed in the left lateral decubitus (LLD) and then in the supine decubitus (SD). In LLD, the isovolumic relaxation period (IRP) was prolonged in the hypertensive pregnant women compared with the normal pregnant women (HP 51 +/- 12.5 ms, N 32.4 +/- 15 ms, p less than 0.05), whereas the time of mitral valve maximum opening (DE) was not different between the groups. There was no difference in SV, EF, and mean VCF, whereas STI showed only a significant (p less than 0.05) lengthening of the pre-ejection period (PEP) in HP. When the subjects shifted from the left lateral to the supine decubitus position, the left ventricular ejection time index (LVETi) and SV decreased significantly (p less than 0.05) in both normotensive and hypertensive pregnant women. IRP and PEP lengthened significantly (p less than 0.05) only in normals, whereas they were unchanged in HP. DE time did not vary in either group. In conclusion, hypertension superimposed on pregnancy induces lengthening of IRP, as well as of PEP, and minimizes the effects of postural changes in preload on the above-mentioned time intervals.

  12. Total variation regularization for a backward time-fractional diffusion problem

    International Nuclear Information System (INIS)

    Wang, Liyan; Liu, Jijun

    2013-01-01

    Consider a two-dimensional backward problem for a time-fractional diffusion process, which can be considered as image de-blurring where the blurring process is assumed to be slow diffusion. In order to avoid the over-smoothing effect for object images with edges and to construct a fast reconstruction scheme, the total variation regularizing term and the data residual error in the frequency domain are coupled to construct the cost functional. The well-posedness of this optimization problem is studied. The minimizer is sought approximately using the iteration process for a series of optimization problems with the Bregman distance as a penalty term. This iterative reconstruction scheme is essentially a new regularizing scheme with the coupling parameter in the cost functional and the iteration stopping time as two regularizing parameters. We give the choice strategy for the regularizing parameters in terms of the noise level of the measurement data, which yields the optimal error estimate on the iterative solution. The series of optimization problems is solved by alternating iteration with an explicit exact solution and therefore the computational cost is greatly reduced. Numerical implementations are given to support our theoretical analysis on the convergence rate and to show the significant reconstruction improvements. (paper)
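
    A minimal 1-D illustration of coupling a data-fidelity term with a (smoothed) total-variation penalty is sketched below, using plain gradient descent on a Gaussian-blur surrogate for the slow-diffusion forward operator; it is not the paper's Bregman iteration or fractional-diffusion setting, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Forward "blurring" operator: a symmetric Gaussian convolution standing in
# for the slow-diffusion forward map of the backward problem.
n = 200
x = np.linspace(0, 1, n)
kernel = np.exp(-0.5 * (np.arange(-20, 21) / 6.0) ** 2)
kernel /= kernel.sum()

def K(u):
    return np.convolve(u, kernel, mode="same")

u_true = (x > 0.3).astype(float) - 0.5 * (x > 0.7)       # piecewise-constant "image"
f = K(u_true) + 0.01 * rng.standard_normal(n)            # blurred, noisy data

lam, eps, step = 2e-3, 1e-3, 0.5
u = f.copy()
for _ in range(2000):
    grad_fid = K(K(u) - f)                                # gradient of the fidelity term (K symmetric)
    du = np.diff(u, append=u[-1])
    tv_term = du / np.sqrt(du ** 2 + eps)                 # smoothed TV derivative
    grad_tv = -np.diff(tv_term, prepend=tv_term[0])
    u -= step * (grad_fid + lam * grad_tv)

print(f"relative error: {np.linalg.norm(u - u_true) / np.linalg.norm(u_true):.3f}")
```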

  13. Timing of multiple overlapping intervals : How many clocks do we have?

    NARCIS (Netherlands)

    van Rijn, Hedderik; Taatgen, Niels A.

    2008-01-01

    Humans perceive and reproduce short intervals of time (e.g. 1-60 s) relatively accurately, and are capable of timing multiple overlapping intervals if these intervals are presented in different modalities [e.g., Rousseau, L., & Rousseau, RL (1996). Stop-reaction time and the internal clock.

  14. The Time Is Up: Compression of Visual Time Interval Estimations of Bimodal Aperiodic Patterns

    Science.gov (United States)

    Duarte, Fabiola; Lemus, Luis

    2017-01-01

    The ability to estimate time intervals subserves many of our behaviors and perceptual experiences. However, it is not clear how aperiodic (AP) stimuli affect our perception of time intervals across sensory modalities. To address this question, we evaluated the human capacity to discriminate between two acoustic (A), visual (V) or audiovisual (AV) time intervals of trains of scattered pulses. We first measured the periodicity of those stimuli and then sought for correlations with the accuracy and reaction times (RTs) of the subjects. We found that, for all time intervals tested in our experiment, the visual system consistently perceived AP stimuli as being shorter than the periodic (P) ones. In contrast, such a compression phenomenon was not apparent during auditory trials. Our conclusions are: first, the subjects exposed to P stimuli are more likely to measure their durations accurately. Second, perceptual time compression occurs for AP visual stimuli. Lastly, AV discriminations are determined by A dominance rather than by AV enhancement. PMID:28848406

  15. Monitoring molecular interactions using photon arrival-time interval distribution analysis

    Science.gov (United States)

    Laurence, Ted A [Livermore, CA]; Weiss, Shimon [Los Angeles, CA]

    2009-10-06

    A method for analyzing/monitoring the properties of species that are labeled with fluorophores. A detector is used to detect photons emitted from species that are labeled with one or more fluorophores and located in a confocal detection volume. The arrival time of each of the photons is determined. The interval of time between various photon pairs is then determined to provide photon pair intervals. The number of photons that have arrival times within the photon pair intervals is also determined. The photon pair intervals are then used in combination with the corresponding counts of intervening photons to analyze properties and interactions of the molecules including brightness, concentration, coincidence and transit time. The method can be used for analyzing single photon streams and multiple photon streams.
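
    The basic bookkeeping — intervals between photon pairs separated by a given number of intervening photons — can be sketched on a simulated Poisson photon stream as follows; the rates and sample sizes are illustrative, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated photon arrival times (s): a Poisson stream at roughly 10 kHz
arrival_times = np.cumsum(rng.exponential(1e-4, size=100_000))

def pair_intervals(times, n_intervening):
    """Time separations between photon pairs with a fixed number of
    intervening photons (n_intervening = 0 gives nearest-neighbour intervals)."""
    lag = n_intervening + 1
    return times[lag:] - times[:-lag]

# Summarize the pair intervals for a few intervening-photon counts; in a real
# analysis the shapes of these distributions carry information about
# brightness, concentration, coincidence and transit time.
for k in (0, 1, 5):
    d = pair_intervals(arrival_times, k)
    print(f"{k} intervening photons: mean interval {d.mean() * 1e6:.1f} us")
```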

  16. Interval timing in genetically modified mice: a simple paradigm.

    Science.gov (United States)

    Balci, F; Papachristos, E B; Gallistel, C R; Brunner, D; Gibson, J; Shumyatsky, G P

    2008-04-01

    We describe a behavioral screen for the quantitative study of interval timing and interval memory in mice. Mice learn to switch from a short-latency feeding station to a long-latency station when the short latency has passed without a feeding. The psychometric function is the cumulative distribution of switch latencies. Its median measures timing accuracy and its interquartile interval measures timing precision. Next, using this behavioral paradigm, we have examined mice with a gene knockout of the receptor for gastrin-releasing peptide that show enhanced (i.e. prolonged) freezing in fear conditioning. We have tested the hypothesis that the mutants freeze longer because they are more uncertain than wild types about when to expect the electric shock. The knockouts however show normal accuracy and precision in timing, so we have rejected this alternative hypothesis. Last, we conduct the pharmacological validation of our behavioral screen using d-amphetamine and methamphetamine. We suggest including the analysis of interval timing and temporal memory in tests of genetically modified mice for learning and memory and argue that our paradigm allows this to be done simply and efficiently.
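
    The two summary statistics described here are straightforward to compute from raw switch latencies, as in the hedged sketch below; the latency sample is simulated for illustration, not real mouse data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical switch latencies (s) for one animal tested with a short and a long feeding latency
switch_latencies = rng.normal(loc=6.0, scale=0.9, size=400)

median = np.median(switch_latencies)                     # timing accuracy
q25, q75 = np.percentile(switch_latencies, [25, 75])
iqr = q75 - q25                                          # timing precision (interquartile interval)
print(f"accuracy (median) = {median:.2f} s, precision (IQR) = {iqr:.2f} s")
```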

  17. Ratio-based lengths of intervals to improve fuzzy time series forecasting.

    Science.gov (United States)

    Huarng, Kunhuang; Yu, Tiffany Hui-Kuang

    2006-04-01

    The objective of this study is to explore ways of determining the useful lengths of intervals in fuzzy time series. It is suggested that ratios, instead of equal lengths of intervals, can more properly represent the intervals among observations. Ratio-based lengths of intervals are, therefore, proposed to improve fuzzy time series forecasting. Algebraic growth data, such as enrollments and the stock index, and exponential growth data, such as inventory demand, are chosen as the forecasting targets, before forecasting based on the various lengths of intervals is performed. Furthermore, sensitivity analyses are also carried out for various percentiles. The ratio-based lengths of intervals are found to outperform the effective lengths of intervals, as well as the arbitrary ones in regard to the different statistical measures. The empirical analysis suggests that the ratio-based lengths of intervals can also be used to improve fuzzy time series forecasting.
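
    One simple way to realize ratio-based interval lengths is to let successive interval widths grow geometrically from a chosen base length, so larger observations fall into proportionally wider intervals. The sketch below illustrates this idea only; the data range, base length and growth ratio are chosen arbitrarily rather than taken from the paper's procedure.

```python
import numpy as np

def ratio_based_intervals(lower, upper, first_length, ratio):
    """Partition [lower, upper] into intervals whose lengths grow by `ratio`.

    Unlike equal-length partitions, later (larger) observations fall into
    proportionally wider intervals, matching their larger absolute variation.
    """
    edges = [lower]
    length = first_length
    while edges[-1] < upper:
        edges.append(edges[-1] + length)
        length *= ratio
    edges[-1] = upper                     # clip the final edge to the data range
    return np.array(edges)

# Illustrative universe of discourse, e.g. for an enrolment series
edges = ratio_based_intervals(lower=13_000, upper=20_000, first_length=400, ratio=1.2)
print(np.round(edges).astype(int))
print(np.digitize([13_500, 16_200, 19_800], edges))   # fuzzify three observations
```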

  18. Time interval measurement between to emission: a systematics

    International Nuclear Information System (INIS)

    Bizard, G.; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Kerambrun, A.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lopez, O.; Louvel, M.; Mahi, M.; Meslin, C.; Steckmeyer, J.C.; Tamain, B.; Wieloch, A.

    1998-01-01

    A systematic study of the evolution of fragment emission time intervals as a function of the energy deposited in the compound system was performed. Several measurements are presented: Ne at 60 MeV/u, Ar at 30 and 60 MeV/u, and two measurements for Kr at 60 MeV/u (central and semi-peripheral collisions). In all the experiments the target was Au and the mass of the compound system was around A = 200. The excitation energies per nucleon reached in these heavy systems cover the range of 3 to 5.5 MeV/u. The method used to determine the emission time intervals is based on the correlation functions associated with the relative-angle distributions. The differences between the data and the simulations make it possible to evaluate the emission times. A rapid decrease of these time intervals was observed as the excitation energy increased. This variation starts at 500 fm/c, which corresponds to a sequential emission. This relatively long time, which indicates a weak interaction between fragments, corresponds practically to the measurement threshold. The shortest intervals (about 50 fm/c) are associated with spontaneous multifragmentation and were observed for central Ar+Au and Kr+Au collisions at 60 MeV/u. Two interpretations are possible: the multifragmentation process might be viewed as a sequential process with very short time separation, or one can distinguish two regimes, bearing in mind that multifragmentation is predominant above 4.5 MeV/u excitation energy. This question is still open and its study is under way at LPC. An answer could come from the study of the break-up process of an excited nucleus, notably through the determination of its lifetime.

  19. Time interval measurement between two emissions: Ar + Au

    International Nuclear Information System (INIS)

    Bizard, G.; Bougault, R.; Brou, R.; Buta, A.; Durand, D.; Genoux-Lubain, A.; Hamdani, T.; Horn, D.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Louvel, M.; Peter, J.; Regimbart, R.; Steckmeyer, J.C.; Tamain, B.

    1998-01-01

    The Ar + Au system was studied at two bombarding energies, 30 and 60 A.MeV. The comparison of the distributions of fragment emission angles in central collisions was carried out by means of a simulation in which the emission time interval was varied. It was found that this interval depends on the bombarding energy (i.e., the deposited excitation energy). For 30 A.MeV this interval is 500 fm/c (0.33 · 10^-23 s), while for 60 A.MeV it is so short that the multifragmentation concept can be used.

  20. Nonparametric Estimation of Interval Reliability for Discrete-Time Semi-Markov Systems

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Limnios, Nikolaos

    2016-01-01

    In this article, we consider a repairable discrete-time semi-Markov system with finite state space. The measure of the interval reliability is given as the probability of the system being operational over a given finite-length time interval. A nonparametric estimator is proposed for the interval...

  1. Foundation for a Time Interval Access Control Model

    National Research Council Canada - National Science Library

    Afinidad, Francis B; Levin, Timothy E; Irvine, Cynthia E; Nguyen, Thuy D

    2005-01-01

    A new model for representing temporal access control policies is introduced. In this model, temporal authorizations are represented by time attributes associated with both subjects and objects, and a time interval access graph...

  2. Finite-Time Stability of Large-Scale Systems with Interval Time-Varying Delay in Interconnection

    Directory of Open Access Journals (Sweden)

    T. La-inchua

    2017-01-01

    Full Text Available We investigate finite-time stability of a class of nonlinear large-scale systems with interval time-varying delays in interconnection. Time-delay functions are continuous but not necessarily differentiable. Based on Lyapunov stability theory and a new integral bounding technique, finite-time stability of large-scale systems with interval time-varying delays in interconnection is derived. The finite-time stability criteria are delay-dependent and are given in terms of linear matrix inequalities which can be solved by various available algorithms. Numerical examples are given to illustrate the effectiveness of the proposed method.

  3. Regular and chaotic dynamics in time-dependent relativistic mean-field theory

    International Nuclear Information System (INIS)

    Vretenar, D.; Ring, P.; Lalazissis, G.A.; Poeschl, W.

    1997-01-01

    Isoscalar and isovector monopole oscillations that correspond to giant resonances in spherical nuclei are described in the framework of time-dependent relativistic mean-field theory. Time-dependent and self-consistent calculations that reproduce experimental data on monopole resonances in 208 Pb show that the motion of the collective coordinate is regular for isoscalar oscillations, and that it becomes chaotic when initial conditions correspond to the isovector mode. Regular collective dynamics coexists with chaotic oscillations on the microscopic level. Time histories, Fourier spectra, state-space plots, Poincare sections, autocorrelation functions, and Lyapunov exponents are used to characterize the nonlinear system and to identify chaotic oscillations. Analogous considerations apply to higher multipolarities. copyright 1997 The American Physical Society

  4. Increasing work-time influence: consequences for flexibility, variability, regularity and predictability.

    Science.gov (United States)

    Nabe-Nielsen, Kirsten; Garde, Anne Helene; Aust, Birgit; Diderichsen, Finn

    2012-01-01

    This quasi-experimental study investigated how an intervention aiming at increasing eldercare workers' influence on their working hours affected the flexibility, variability, regularity and predictability of the working hours. We used baseline (n = 296) and follow-up (n = 274) questionnaire data and interviews with intervention-group participants (n = 32). The work units in the intervention group designed their own intervention comprising either implementation of computerised self-scheduling (subgroup A), collection of information about the employees' work-time preferences by questionnaires (subgroup B), or discussion of working hours (subgroup C). Only computerised self-scheduling changed the working hours and the way they were planned. These changes implied more flexible but less regular working hours and an experience of less predictability and less continuity in the care of clients and in the co-operation with colleagues. In subgroup B and C, the participants ended up discussing the potential consequences of more work-time influence without actually implementing any changes. Employee work-time influence may buffer the adverse effects of shift work. However, our intervention study suggested that while increasing the individual flexibility, increasing work-time influence may also result in decreased regularity of the working hours and less continuity in the care of clients and co-operation with colleagues.

  5. Learning About Time Within the Spinal Cord II: Evidence that Temporal Regularity is Encoded by a Spinal Oscillator

    Directory of Open Access Journals (Sweden)

    Kuan Hsien Lee

    2016-02-01

    Full Text Available How a stimulus impacts spinal cord function depends upon temporal relations. When intermittent noxious stimulation (shock) is applied and the interval between shock pulses is varied (unpredictable), it induces a lasting alteration that inhibits adaptive learning. If the same stimulus is applied in a temporally regular (predictable) manner, the capacity to learn is preserved and a protective/restorative effect is engaged that counters the adverse effect of variable stimulation. Sensitivity to temporal relations implies a capacity to encode time. This study explores how spinal neurons discriminate between variable and fixed spaced stimulation. Communication with the brain was blocked by means of a spinal transection and adaptive capacity was tested using an instrumental learning task. In this task, subjects must learn to maintain a hind limb in a flexed position to minimize shock exposure. To evaluate the possibility that a distinct class of afferent fibers provides a sensory cue for regularity, we manipulated the temporal relation between shocks given to two dermatomes (leg and tail). Evidence for timing emerged when the stimuli were applied in a coherent manner across dermatomes, implying that a central (spinal) process detects regularity. Next, we show that fixed spaced stimulation has a restorative effect when half the physical stimuli are randomly omitted, as long as the stimuli remain in phase, suggesting that stimulus regularity is encoded by an internal oscillator. Research suggests that the oscillator that drives the tempo of stepping depends upon neurons within the rostral lumbar (L1-L2) region. Disrupting communication with the L1-L2 tissue by means of an L3 transection eliminated the restorative effect of fixed spaced stimulation. Implications of the results for step training and rehabilitation after injury are discussed.

  6. Inactivation of the Medial-Prefrontal Cortex Impairs Interval Timing Precision, but Not Timing Accuracy or Scalar Timing in a Peak-Interval Procedure in Rats

    Directory of Open Access Journals (Sweden)

    Catalin V. Buhusi

    2018-06-01

    Full Text Available Motor sequence learning, planning and execution of goal-directed behaviors, and decision making rely on accurate time estimation and production of durations in the seconds-to-minutes range. The pathways involved in planning and execution of goal-directed behaviors include cortico-striato-thalamo-cortical circuitry modulated by dopaminergic inputs. A critical feature of interval timing is its scalar property, by which the precision of timing is proportional to the timed duration. We examined the role of medial prefrontal cortex (mPFC) in timing by evaluating the effect of its reversible inactivation on timing accuracy, timing precision and scalar timing. Rats were trained to time two durations in a peak-interval (PI) procedure. Reversible mPFC inactivation using the GABA agonist muscimol resulted in decreased timing precision, with no effect on timing accuracy and scalar timing. These results are partly at odds with studies suggesting that ramping prefrontal activity is crucial to timing, but closely match simulations with the Striatal Beat Frequency (SBF) model proposing that timing is coded by the coincidental activation of striatal neurons by cortical inputs. Computer simulations indicate that in SBF, gradual inactivation of cortical inputs results in a gradual decrease in timing precision with preservation of timing accuracy and scalar timing. Further studies are needed to differentiate between timing models based on coincidence detection and timing models based on ramping mPFC activity, and to clarify whether mPFC is specifically involved in timing, or more generally involved in attention, working memory, or response selection/inhibition.

  7. Hybrid integrated circuit for charge-to-time interval conversion

    Energy Technology Data Exchange (ETDEWEB)

    Basiladze, S.G.; Dotsenko, Yu.Yu.; Man'yakov, P.K.; Fedorchenko, S.N. (Joint Inst. for Nuclear Research, Dubna (USSR))

    A hybrid integrated circuit for charge-to-time-interval conversion with nanosecond input response is described. The circuit can be used in energy-measuring channels, in time-to-digital converters and, in a modified variant, in amplitude-to-digital converters. The converter consists of a buffer amplifier, a linear transmission circuit, a direct current source and a time-interval separation unit. The buffer amplifier is a current follower providing low input and high output resistance by means of current feedback. It is concluded that the described converter excels the analogous QT100B circuit in a number of parameters, especially in thermal stability.

  8. Diverse Regular Employees and Non-regular Employment (Japanese)

    OpenAIRE

    MORISHIMA Motohiro

    2011-01-01

    Currently there are high expectations for the introduction of policies related to diverse regular employees. These policies are a response to the problem of disparities between regular and non-regular employees (part-time, temporary, contract and other non-regular employees) and will make it more likely that workers can balance work and their private lives while companies benefit from the advantages of regular employment. In this paper, I look at two issues that underlie this discussion. The ...

  9. Perception of short time scale intervals in a hypnotic virtuoso

    NARCIS (Netherlands)

    Noreika, Valdas; Falter, Christine M.; Arstila, Valtteri; Wearden, John H.; Kallio, Sakari

    2012-01-01

    Previous studies showed that hypnotized individuals underestimate temporal intervals in the range of several seconds to tens of minutes. However, no previous work has investigated whether duration perception is equally disorderly when shorter time intervals are probed. In this study, duration

  10. Cardiac time intervals by tissue Doppler imaging M-mode echocardiography

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor

    2016-01-01

    for myocardial myocytes to achieve an LV pressure equal to that of aorta increases, resulting in a prolongation of the isovolumic contraction time (IVCT). Furthermore, the ability of myocardial myocytes to maintain the LV pressure decreases, resulting in reduction in the ejection time (ET). As LV diastolic...... of whether the LV is suffering from impaired systolic or diastolic function. A novel method of evaluating the cardiac time intervals has recently evolved. Using tissue Doppler imaging (TDI) M-mode through the mitral valve (MV) to estimate the cardiac time intervals may be an improved method reflecting global...

  11. Across-province standardization and comparative analysis of time-to-care intervals for cancer

    Directory of Open Access Journals (Sweden)

    Nugent Zoann

    2007-10-01

    Full Text Available Abstract Background A set of consistent, standardized definitions of intervals and populations on which to report across provinces is needed to inform the Provincial/Territorial Deputy Ministries of Health on progress of the Ten-Year Plan to Strengthen Health Care. The objectives of this project were to: (1) identify a set of criteria and variables needed to create comparable measures of important time-to-cancer-care intervals that could be applied across provinces and (2) use the measures to compare time-to-care across participating provinces for lung and colorectal cancer patients diagnosed in 2004. Methods A broad-based group of stakeholders from each of the three participating cancer agencies was assembled to identify criteria for time-to-care intervals to standardize, evaluate possible intervals and their corresponding start and end time points, and finalize the selection of intervals to pursue. Inclusion/exclusion criteria were identified for the patient population and the selected time points to reduce potential selection bias. The provincial 2004 colorectal and lung cancer data were used to illustrate across-province comparisons for the selected time-to-care intervals. Results Criteria identified as critical for time-to-care intervals and corresponding start and end points were: (1) relevant to patients, (2) relevant to clinical care, (3) unequivocally defined, and (4) currently captured consistently across cancer agencies. Time from diagnosis to first radiation or chemotherapy treatment and the smaller components, time from diagnosis to first consult with an oncologist and time from first consult to first radiation or chemotherapy treatment, were the only intervals that met all four criteria. Timeliness of care for the intervals evaluated was similar between the provinces for lung cancer patients but significant differences were found for colorectal cancer patients. Conclusion We identified criteria important for selecting time-to-care intervals
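
    Once the start and end points are standardized, the intervals themselves reduce to simple date arithmetic, as in the hedged sketch below on made-up records; the column names are hypothetical, not the registries' actual schema.

```python
import pandas as pd

# Illustrative records only; the three computed columns correspond to the three
# intervals that met all four criteria in the abstract.
records = pd.DataFrame({
    "patient_id":      [1, 2, 3],
    "diagnosis_date":  ["2004-02-01", "2004-03-15", "2004-05-20"],
    "first_consult":   ["2004-02-20", "2004-04-02", "2004-06-18"],
    "first_treatment": ["2004-03-05", "2004-04-20", "2004-07-30"],
})
for col in ("diagnosis_date", "first_consult", "first_treatment"):
    records[col] = pd.to_datetime(records[col])

records["dx_to_treatment_days"] = (records["first_treatment"] - records["diagnosis_date"]).dt.days
records["dx_to_consult_days"]   = (records["first_consult"]   - records["diagnosis_date"]).dt.days
records["consult_to_tx_days"]   = (records["first_treatment"] - records["first_consult"]).dt.days
print(records.filter(like="_days").median())
```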

  12. Interval timing under a behavioral microscope: Dissociating motivational and timing processes in fixed-interval performance.

    Science.gov (United States)

    Daniels, Carter W; Sanabria, Federico

    2017-03-01

    The distribution of latencies and interresponse times (IRTs) of rats was compared between two fixed-interval (FI) schedules of food reinforcement (FI 30 s and FI 90 s), and between two levels of food deprivation. Computational modeling revealed that latencies and IRTs were well described by mixture probability distributions embodying two-state Markov chains. Analysis of these models revealed that only a subset of latencies is sensitive to the periodicity of reinforcement, and prefeeding only reduces the size of this subset. The distribution of IRTs suggests that behavior in FI schedules is organized in bouts that lengthen and ramp up in frequency with proximity to reinforcement. Prefeeding slowed down the lengthening of bouts and increased the time between bouts. When concatenated, latency and IRT models adequately reproduced sigmoidal FI response functions. These findings suggest that behavior in FI schedules fluctuates in and out of schedule control; an account of such fluctuation suggests that timing and motivation are dissociable components of FI performance. These mixture-distribution models also provide novel insights on the motivational, associative, and timing processes expressed in FI performance. These processes may be obscured, however, when performance in timing tasks is analyzed in terms of mean response rates.
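
    A generative sketch of the mixture described here: a two-state (bout/pause) Markov chain emits short inter-response times while in a bout and long ones between bouts, so the marginal IRT distribution is a two-component mixture. The parameter values below are illustrative only, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_irts(n_responses, p_leave_bout=0.3, p_enter_bout=0.15,
                  within_rate=2.0, between_rate=0.1):
    """Generate inter-response times (s) from a two-state (bout/pause) Markov chain.

    Short IRTs are emitted while in the 'bout' state, long IRTs in the 'pause'
    state; the resulting marginal IRT distribution is a mixture of the two.
    """
    irts, in_bout = [], True
    for _ in range(n_responses):
        rate = within_rate if in_bout else between_rate
        irts.append(rng.exponential(1.0 / rate))
        if in_bout:
            in_bout = rng.random() >= p_leave_bout
        else:
            in_bout = rng.random() < p_enter_bout
    return np.array(irts)

irts = simulate_irts(5_000)
print(f"median IRT {np.median(irts):.2f} s, 95th percentile {np.percentile(irts, 95):.1f} s")
```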

  13. Optimal time interval for induction of immunologic adaptive response

    International Nuclear Information System (INIS)

    Ju Guizhi; Song Chunhua; Liu Shuzheng

    1994-01-01

    The optimal time interval between prior dose (D1) and challenge dose (D2) for the induction of immunologic adaptive response was investigated. Kunming mice were exposed to 75 mGy X-rays at a dose rate of 12.5 mGy/min. At 3, 6, 12, 24 or 60 h after the prior irradiation, the mice were challenged with a dose of 1.5 Gy at a dose rate of 0.33 Gy/min. 18 h after D2, the mice were sacrificed for examination of immunological parameters. The results showed that with an interval of 6 h between D1 and D2, the adaptive response of the reaction of splenocytes to LPS was induced, and with an interval of 12 h the adaptive responses of spontaneous incorporation of 3H-TdR into thymocytes and the reaction of splenocytes to Con A and LPS were induced with 75 mGy prior irradiation. The data suggested that the optimal time intervals between D1 and D2 for the induction of immunologic adaptive response were 6 h and 12 h with a D1 of 75 mGy and a D2 of 1.5 Gy. The mechanism of immunologic adaptation following low dose radiation is discussed.

  14. Major earthquakes occur regularly on an isolated plate boundary fault.

    Science.gov (United States)

    Berryman, Kelvin R; Cochran, Ursula A; Clark, Kate J; Biasi, Glenn P; Langridge, Robert M; Villamor, Pilar

    2012-06-29

    The scarcity of long geological records of major earthquakes, on different types of faults, makes testing hypotheses of regular versus random or clustered earthquake recurrence behavior difficult. We provide a fault-proximal major earthquake record spanning 8000 years on the strike-slip Alpine Fault in New Zealand. Cyclic stratigraphy at Hokuri Creek suggests that the fault ruptured to the surface 24 times, and event ages yield a 0.33 coefficient of variation in recurrence interval. We associate this near-regular earthquake recurrence with a geometrically simple strike-slip fault, with high slip rate, accommodating a high proportion of plate boundary motion that works in isolation from other faults. We propose that it is valid to apply time-dependent earthquake recurrence models for seismic hazard estimation to similar faults worldwide.

  15. Manifold-splitting regularization, self-linking, twisting, writhing numbers of space-time ribbons

    International Nuclear Information System (INIS)

    Tze, C.H.

    1988-01-01

    The authors present an alternative formulation of Polyakov's regularization of Gauss' integral formula for a single closed Feynman path. A key element in his proof of the D = 3 Fermi-Bose transmutations induced by topological gauge fields, this regularization is linked here with the existence and properties of a nontrivial topological invariant for a closed space ribbon. This self-linking coefficient, an integer, is the sum of two differential characteristics of the ribbon, its twisting and writhing numbers. These invariants form the basis for a physical interpretation of our regularization. Their connection to Polyakov's spinorization is discussed. The authors further generalize their construction to the self-linking, twisting and writhing of higher dimensional d = η (odd) submanifolds in D = (2η + 1) space-time.

  16. HYBRID APPROACHES TO THE FORMALISATION OF EXPERT KNOWLEDGE CONCERNING TEMPORAL REGULARITIES IN THE TIME SERIES GROUP OF A SYSTEM MONITORING DATABASE

    Directory of Open Access Journals (Sweden)

    E. S. Staricov

    2016-01-01

    Full Text Available Objectives. The research problem concerns the formalisation, by an expert, of knowledge about regularities in an arbitrary group of time series, so that this knowledge can be integrated into a decision-making mechanism. Method. A context-free grammar, a modification of a universal temporal grammar, is used to describe the regularities. Using the rules of the developed grammar, an expert can describe patterns in the group of time series. A multi-dimensional matrix pattern of the behaviour of the group of time series is then used by the expert system in a real-time decision-making regime, providing a universal approach to describing the dynamics of these changes. The multidimensional matrix pattern is specifically intended for decision-making in an expert system; the modified temporal grammar is used to identify patterns in the data. Results. It is proposed to use the temporal relations of the series and to fix observation values in the time interval as "From-To", "Before", "After", "Simultaneously" and "Duration". A syntactically oriented converter of descriptions is developed, and a schema for the creation and application of matrix patterns in expert systems is drawn up. Conclusion. The advantage of the proposed hybrid approaches is a reduction in the time taken to identify temporal patterns and the automation of the matrix patterns used by the decision-making system, based on expert descriptions verified against live monitoring data.

  17. Cerebellar Roles in Self-Timing for Sub- and Supra-Second Intervals.

    Science.gov (United States)

    Ohmae, Shogo; Kunimatsu, Jun; Tanaka, Masaki

    2017-03-29

    Previous studies suggest that the cerebellum and basal ganglia are involved in sub-second and supra-second timing, respectively. To test this hypothesis at the cellular level, we examined the activity of single neurons in the cerebellar dentate nucleus in monkeys performing the oculomotor version of the self-timing task. Animals were trained to report the passage of time of 400, 600, 1200, or 2400 ms following a visual cue by making self-initiated memory-guided saccades. We found a sizeable preparatory neuronal activity before self-timed saccades across delay intervals, while the time course of activity correlated with the trial-by-trial variation of saccade latency in different ways depending on the length of the delay intervals. For the shorter delay intervals, the ramping up of neuronal firing rate started just after the visual cue and the rate of rise of neuronal activity correlated with saccade timing. In contrast, for the longest delay (2400 ms), the preparatory activity started late during the delay period, and its onset time correlated with self-timed saccade latency. Because electrical microstimulation applied to the recording sites during saccade preparation advanced self-timed but not reactive saccades, regardless of their directions, the signals in the cerebellum may have a causal role in self-timing. We suggest that the cerebellum may regulate timing in both sub-second and supra-second ranges, although its relative contribution might be greater for sub-second than for supra-second time intervals. SIGNIFICANCE STATEMENT How we decide the timing of self-initiated movement is a fundamental question. According to the prevailing hypothesis, the cerebellum plays a role in monitoring sub-second timing, whereas the basal ganglia are important for supra-second timing. To verify this, we explored neuronal signals in the monkey cerebellum while animals reported the passage of time in the range 400-2400 ms by making eye movements. Contrary to our expectations, we

  18. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of a MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.
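
    The following sketch illustrates one way a regularization penalty can shut down spurious hidden dimensions, which is the spirit of rLDS, though not the paper's MAP/EM procedure: the transition matrix is estimated from an (assumed observed) state sequence by proximal gradient descent with a nuclear-norm penalty, i.e., singular value soft-thresholding. All data and values below are synthetic.

```python
import numpy as np

def estimate_low_rank_transition(X, lam=1.0, n_iter=100, step=None):
    """Estimate a low-rank transition matrix A for x_{t+1} ~ A x_t by proximal
    gradient on ||X1 - A X0||_F^2 + lam * ||A||_* (nuclear norm)."""
    X0, X1 = X[:, :-1], X[:, 1:]                      # states at t and t+1
    d = X.shape[0]
    A = np.zeros((d, d))
    if step is None:
        step = 1.0 / (np.linalg.norm(X0, 2) ** 2 + 1e-12)   # 1 / Lipschitz bound
    for _ in range(n_iter):
        grad = (A @ X0 - X1) @ X0.T                   # gradient of the quadratic fit term
        U, s, Vt = np.linalg.svd(A - step * grad, full_matrices=False)
        s = np.maximum(s - step * lam, 0.0)           # singular value soft-thresholding
        A = (U * s) @ Vt
    return A

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A_true = np.diag([0.9, 0.7, 0.0, 0.0])            # two "spurious" dimensions
    X = np.zeros((4, 300))
    X[:, 0] = rng.normal(size=4)
    for t in range(299):
        X[:, t + 1] = A_true @ X[:, t] + 0.05 * rng.normal(size=4)
    A_hat = estimate_low_rank_transition(X, lam=5.0)
    print(np.round(A_hat, 2))
    print("estimated rank ~", np.linalg.matrix_rank(A_hat, tol=1e-2))
```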

  19. Working time intervals and total work time on nursing positions in Poland

    Directory of Open Access Journals (Sweden)

    Danuta Kunecka

    2015-06-01

    Full Text Available Background: For the last few years the topic of overwork in nursing posts has given rise to strong discussions. The author set herself the goal of answering the question whether it is the result of real overwork in this particular profession or rather of the commonly assumed frustration of this professional group. The aim of this paper is to analyse working time in selected nursing positions in relation to the time used for intervals in the course of standard professional activities during one working day. Material and Methods: The research material consisted of work-time documentation for selected nursing workplaces, compiled between 2007 and 2012 within the framework of a nursing course at the Pomeranian Medical University in Szczecin. A photograph of a working day was used as the measurement method. Measurements were performed in institutions located in 6 voivodeships in Poland. Results: The results suggest that only 6.5% of the surveyed representatives of the nursing profession spend the proper amount of time (i.e., the time set by the applicable standards) on work intervals during a working day. Conclusions: The scale of the phenomenon indicates an excessive workload in nursing positions which, over a longer period of time and with longer working hours, may decrease work efficiency and cause a drop in the quality of the services provided. Med Pr 2015;66(2):165–172.

  20. Time-variant random interval natural frequency analysis of structures

    Science.gov (United States)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods so that the computational cost is dramatically reduced. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under the intrinsic creep effect of concrete, with both probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, progressing in terms of both structure type and uncertainty variables, are presented to demonstrate the applicability, accuracy and efficiency of the proposed method.

  1. Interval Timing Deficits Assessed by Time Reproduction Dual Tasks as Cognitive Endophenotypes for Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Hwang-Gu, Shoou-Lian; Gau, Susan Shur-Fen

    2015-01-01

    The literature has suggested timing processing as a potential endophenotype for attention deficit/hyperactivity disorder (ADHD); however, whether the subjective internal clock speed presented by verbal estimation and limited attention capacity presented by time reproduction could be endophenotypes for ADHD is still unknown. We assessed 223 youths with DSM-IV ADHD (age range: 10-17 years), 105 unaffected siblings, and 84 typically developing (TD) youths using psychiatric interviews, intelligence tests, verbal estimation and time reproduction tasks (single task and simple and difficult dual tasks) at 5-second, 12-second, and 17-second intervals. We found that youths with ADHD tended to overestimate time in verbal estimation more than their unaffected siblings and TD youths, implying that fast subjective internal clock speed might be a characteristic of ADHD, rather than an endophenotype for ADHD. Youths with ADHD and their unaffected siblings were less precise in time reproduction dual tasks than TD youths. The magnitude of estimated errors in time reproduction was greater in youths with ADHD and their unaffected siblings than in TD youths, with an increased time interval at the 17-second interval and with increased task demands on both simple and difficult dual tasks versus the single task. Increased impaired time reproduction in dual tasks with increased intervals and task demands were shown in youths with ADHD and their unaffected siblings, suggesting that time reproduction deficits explained by limited attention capacity might be a useful endophenotype of ADHD. PMID:25992899

  2. Neutrino stress tensor regularization in two-dimensional space-time

    International Nuclear Information System (INIS)

    Davies, P.C.W.; Unruh, W.G.

    1977-01-01

    The method of covariant point-splitting is used to regularize the stress tensor for a massless spin 1/2 (neutrino) quantum field in an arbitrary two-dimensional space-time. A thermodynamic argument is used as a consistency check. The result shows that the physical part of the stress tensor is identical with that of the massless scalar field (in the absence of Casimir-type terms) even though the formally divergent expression is equal to the negative of the scalar case. (author)

  3. Investigations of timing during the schedule and reinforcement intervals with wheel-running reinforcement.

    Science.gov (United States)

    Belke, Terry W; Christie-Fougere, Melissa M

    2006-11-01

    Across two experiments, a peak procedure was used to assess the timing of the onset and offset of an opportunity to run as a reinforcer. The first experiment investigated the effect of reinforcer duration on temporal discrimination of the onset of the reinforcement interval. Three male Wistar rats were exposed to fixed-interval (FI) 30-s schedules of wheel-running reinforcement and the duration of the opportunity to run was varied across values of 15, 30, and 60s. Each session consisted of 50 reinforcers and 10 probe trials. Results showed that as reinforcer duration increased, the percentage of postreinforcement pauses longer than the 30-s schedule interval increased. On probe trials, peak response rates occurred near the time of reinforcer delivery and peak times varied with reinforcer duration. In a second experiment, seven female Long-Evans rats were exposed to FI 30-s schedules leading to 30-s opportunities to run. Timing of the onset and offset of the reinforcement period was assessed by probe trials during the schedule interval and during the reinforcement interval in separate conditions. The results provided evidence of timing of the onset, but not the offset of the wheel-running reinforcement period. Further research is required to assess if timing occurs during a wheel-running reinforcement period.

  4. Infinite time interval backward stochastic differential equations with continuous coefficients.

    Science.gov (United States)

    Zong, Zhaojun; Hu, Feng

    2016-01-01

    In this paper, we study the existence theorem for [Formula: see text] [Formula: see text] solutions to a class of 1-dimensional infinite time interval backward stochastic differential equations (BSDEs) under the conditions that the coefficients are continuous and have linear growth. We also obtain the existence of a minimal solution. Furthermore, we study the existence and uniqueness theorem for [Formula: see text] [Formula: see text] solutions of infinite time interval BSDEs with non-uniformly Lipschitz coefficients. It should be pointed out that the assumptions of this result are weaker than those of Theorem 3.1 in Zong (Turkish J Math 37:704-718, 2013).

  5. Quantification of fetal heart rate regularity using symbolic dynamics

    Science.gov (United States)

    van Leeuwen, P.; Cysarz, D.; Lange, S.; Geue, D.; Groenemeyer, D.

    2007-03-01

    Fetal heart rate complexity was examined on the basis of RR interval time series obtained in the second and third trimester of pregnancy. In each fetal RR interval time series, short term beat-to-beat heart rate changes were coded in 8bit binary sequences. Redundancies of the 28 different binary patterns were reduced by two different procedures. The complexity of these sequences was quantified using the approximate entropy (ApEn), resulting in discrete ApEn values which were used for classifying the sequences into 17 pattern sets. Also, the sequences were grouped into 20 pattern classes with respect to identity after rotation or inversion of the binary value. There was a specific, nonuniform distribution of the sequences in the pattern sets and this differed from the distribution found in surrogate data. In the course of gestation, the number of sequences increased in seven pattern sets, decreased in four and remained unchanged in six. Sequences that occurred less often over time, both regular and irregular, were characterized by patterns reflecting frequent beat-to-beat reversals in heart rate. They were also predominant in the surrogate data, suggesting that these patterns are associated with stochastic heart beat trains. Sequences that occurred more frequently over time were relatively rare in the surrogate data. Some of these sequences had a high degree of regularity and corresponded to prolonged heart rate accelerations or decelerations which may be associated with directed fetal activity or movement or baroreflex activity. Application of the pattern classes revealed that those sequences with a high degree of irregularity correspond to heart rate patterns resulting from complex physiological activity such as fetal breathing movements. The results suggest that the development of the autonomic nervous system and the emergence of fetal behavioral states lead to increases in not only irregular but also regular heart rate patterns. Using symbolic dynamics to
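
    The core steps, binary coding of beat-to-beat RR changes into 8-bit words and quantification of complexity by approximate entropy, can be sketched as below. The word-level ApEn parameters, thresholds, and the synthetic RR series are illustrative assumptions; the study's redundancy-reduction procedures and pattern-class groupings are not reproduced.

```python
import numpy as np

def binary_words(rr, word_len=8):
    """Code each beat-to-beat change as 1 (RR increases) or 0 (RR decreases or is
    unchanged), then collect overlapping 8-bit words."""
    bits = (np.diff(rr) > 0).astype(int)
    return [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]

def approximate_entropy(x, m=2, r=None):
    """Standard ApEn(m, r) of a 1-D series; r defaults to 0.2 * SD."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def phi(m):
        emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (dist <= r).mean(axis=1)          # self-matches included
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    rr = 420 + np.cumsum(rng.normal(0, 3, 500))   # synthetic fetal RR series (ms)
    words = binary_words(rr)
    print("distinct 8-bit patterns:", len(set(words)))
    # Discrete ApEn values of the distinct words can be used to group them into pattern sets
    apen_values = sorted({round(approximate_entropy(np.array(w), m=1, r=0.5), 3)
                          for w in set(words)})
    print("distinct word-level ApEn values:", apen_values)
```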

  6. Evaluating Protocol Lifecycle Time Intervals in HIV/AIDS Clinical Trials

    Science.gov (United States)

    Schouten, Jeffrey T.; Dixon, Dennis; Varghese, Suresh; Cope, Marie T.; Marci, Joe; Kagan, Jonathan M.

    2014-01-01

    Background Identifying efficacious interventions for the prevention and treatment of human diseases depends on the efficient development and implementation of controlled clinical trials. Essential to reducing the time and burden of completing the clinical trial lifecycle is determining which aspects take the longest, delay other stages, and may lead to better resource utilization without diminishing scientific quality, safety, or the protection of human subjects. Purpose In this study we modeled time-to-event data to explore relationships between clinical trial protocol development and implementation times, as well as identify potential correlates of prolonged development and implementation. Methods We obtained time interval and participant accrual data from 111 interventional clinical trials initiated between 2006 and 2011 by NIH’s HIV/AIDS Clinical Trials Networks. We determined the time (in days) required to complete defined phases of clinical trial protocol development and implementation. Kaplan-Meier estimates were used to assess the rates at which protocols reached specified terminal events, stratified by study purpose (therapeutic, prevention) and phase group (pilot/phase I, phase II, and phase III/ IV). We also examined several potential correlates to prolonged development and implementation intervals. Results Even though phase grouping did not determine development or implementation times of either therapeutic or prevention studies, overall we observed wide variation in protocol development times. Moreover, we detected a trend toward phase III/IV therapeutic protocols exhibiting longer developmental (median 2 ½ years) and implementation times (>3years). We also found that protocols exceeding the median number of days for completing the development interval had significantly longer implementation. Limitations The use of a relatively small set of protocols may have limited our ability to detect differences across phase groupings. Some timing effects
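
    A minimal sketch of the Kaplan-Meier analysis described above, using the third-party lifelines package (assumed available) on synthetic protocol development times stratified by phase group; the variable names and data are illustrative only.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 111
# Synthetic protocol development times (days); ~15% administratively censored
durations = rng.gamma(shape=2.0, scale=300.0, size=n)
observed = rng.random(n) > 0.15
phase = rng.choice(["pilot/phase I", "phase II", "phase III/IV"], size=n)
df = pd.DataFrame({"days": durations, "event": observed, "phase": phase})

kmf = KaplanMeierFitter()
for grp, sub in df.groupby("phase"):
    # Fit the survival curve for this phase group and report its median time-to-event
    kmf.fit(sub["days"], event_observed=sub["event"], label=grp)
    print(grp, "median days to terminal event:", kmf.median_survival_time_)
```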

  7. [Estimation of the atrioventricular time interval by pulse Doppler in the normal fetal heart].

    Science.gov (United States)

    Hamela-Olkowska, Anita; Dangel, Joanna

    2009-08-01

    To assess normative values of the fetal atrioventricular (AV) time interval by pulse-wave Doppler methods on the 5-chamber view. Fetal echocardiography exams were performed using an Acuson Sequoia 512 in 140 singleton fetuses at 18 to 40 weeks of gestation with sinus rhythm and normal cardiac and extracardiac anatomy. Pulsed Doppler derived AV intervals were measured from the left ventricular inflow/outflow view using a transabdominal convex 3.5-6 MHz probe. The values of the AV time interval ranged from 100 to 150 ms (mean 123 +/- 11.2). The AV interval was negatively correlated with the heart rhythm and with the age of gestation (p=0.007). However, in the same subgroup of the fetal heart rate there was no relation between AV intervals and gestational age. Therefore, the AV intervals showed only the heart rate dependence. The 95th percentiles of AV intervals according to FHR ranged from 135 to 148 ms. 1. The AV interval duration was negatively correlated with the heart rhythm. 2. Measurement of the AV time interval is easy to perform and has good reproducibility. It may be used for fetal heart block screening in anti-Ro and anti-La positive pregnancies. 3. Normative values established in the study may help obstetricians in assessing fetal abnormalities of AV conduction.

  8. Optimizing Time Intervals of Meteorological Data Used with Atmospheric Dose Modeling at SRS

    International Nuclear Information System (INIS)

    Simpkins, A.A.

    1999-01-01

    Measured tritium oxide concentrations in air have been compared with calculated values using routine release Gaussian plume models for different time intervals of meteorological data. These comparisons determined an optimum time interval of meteorological data used with atmospheric dose models at the Savannah River Site (SRS). Meteorological data of varying time intervals (1-yr to 10-yr) were used for the comparison. Insignificant differences are seen in using a one-year database as opposed to a five-year database. Use of a ten-year database results in slightly more conservative results. For meteorological databases of length one to five years the mean ratio of predicted to measured tritium oxide concentrations is approximately 1.25, whereas for the ten-year meteorological database the ratio is closer to 1.35. Currently at the Savannah River Site a meteorological database of five years' duration is used for all dose models. This study suggests no substantially improved accuracy using meteorological files of shorter or longer time intervals.
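
    For context, a routine-release Gaussian plume calculation of the kind compared against measurements above can be sketched as follows. The dispersion parameters, wind-speed record, and release height are illustrative placeholders, not the SRS model or its meteorological database.

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration at crosswind offset y and height z
    for a continuous point source of strength Q (g/s), wind speed u (m/s), and
    effective release height H (m), with ground reflection. sigma_y and sigma_z are
    the lateral/vertical dispersion parameters at the downwind distance of interest."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative annual-average style calculation: average the plume over hourly
# wind speeds drawn from a synthetic one-year meteorological record.
rng = np.random.default_rng(4)
winds = rng.gamma(3.0, 1.2, size=8760)                    # hourly wind speeds (m/s)
conc = gaussian_plume(Q=1.0, u=winds, y=0.0, z=1.5,       # plume centerline, breathing height
                      H=60.0, sigma_y=80.0, sigma_z=40.0)
print("mean ground-level concentration:", conc.mean())
```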

  9. Cardiac Time Intervals Measured by Tissue Doppler Imaging M-mode

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Møgelvang, Rasmus; Schnohr, Peter

    2016-01-01

    function was evaluated in 1915 participants by using both conventional echocardiography and tissue Doppler imaging (TDI). The cardiac time intervals, including the isovolumic relaxation time (IVRT), isovolumic contraction time (IVCT), and ejection time (ET), were obtained by TDI M-mode through the mitral......). Additionally, they displayed a significant dose-response relationship, between increasing severity of elevated blood pressure and increasing left ventricular mass index (P

  10. Genus Ranges of 4-Regular Rigid Vertex Graphs.

    Science.gov (United States)

    Buck, Dorothy; Dolzhenko, Egor; Jonoska, Nataša; Saito, Masahico; Valencia, Karin

    2015-01-01

    A rigid vertex of a graph is one that has a prescribed cyclic order of its incident edges. We study orientable genus ranges of 4-regular rigid vertex graphs. The (orientable) genus range is a set of genera values over all orientable surfaces into which a graph is embedded cellularly, and the embeddings of rigid vertex graphs are required to preserve the prescribed cyclic order of incident edges at every vertex. The genus ranges of 4-regular rigid vertex graphs are sets of consecutive integers, and we address two questions: which intervals of integers appear as genus ranges of such graphs, and what types of graphs realize a given genus range. For graphs with 2 n vertices ( n > 1), we prove that all intervals [ a, b ] for all a genus ranges. For graphs with 2 n - 1 vertices ( n ≥ 1), we prove that all intervals [ a, b ] for all a genus ranges. We also provide constructions of graphs that realize these ranges.

  11. The synaptic properties of cells define the hallmarks of interval timing in a recurrent neural network.

    Science.gov (United States)

    Pérez, Oswaldo; Merchant, Hugo

    2018-04-03

    Extensive research has described two key features of interval timing. The bias property is associated with accuracy and implies that time is overestimated for short intervals and underestimated for long intervals. The scalar property is linked to precision and states that the variability of interval estimates increases as a function of interval duration. The neural mechanisms behind these properties are not well understood. Here we implemented a recurrent neural network that mimics a cortical ensemble and includes cells that show paired-pulse facilitation and slow inhibitory synaptic currents. The network produces interval selective responses and reproduces both bias and scalar properties when a Bayesian decoder reads its activity. Notably, the interval-selectivity, timing accuracy, and precision of the network showed complex changes as a function of the decay time constants of the modeled synaptic properties and the level of background activity of the cells. These findings suggest that physiological values of the time constants for paired-pulse facilitation and GABAb, as well as the internal state of the network, determine the bias and scalar properties of interval timing. Significance Statement Timing is a fundamental element of complex behavior, including music and language. Temporal processing in a wide variety of contexts shows two primary features: time estimates exhibit a shift towards the mean (the bias property) and are more variable for longer intervals (the scalar property). We implemented a recurrent neural network that includes long-lasting synaptic currents, which can not only produce interval selective responses but also follow the bias and scalar properties. Interestingly, only physiological values of the time constants for paired-pulse facilitation and GABAb, as well as intermediate background activity within the network, can reproduce the two key features of interval timing. Copyright © 2018 the authors.

  12. Mean Square Exponential Stability of Stochastic Switched System with Interval Time-Varying Delays

    Directory of Open Access Journals (Sweden)

    Manlika Rajchakit

    2012-01-01

    Full Text Available This paper is concerned with mean square exponential stability of switched stochastic system with interval time-varying delays. The time delay is any continuous function belonging to a given interval, but not necessary to be differentiable. By constructing a suitable augmented Lyapunov-Krasovskii functional combined with Leibniz-Newton’s formula, a switching rule for the mean square exponential stability of switched stochastic system with interval time-varying delays and new delay-dependent sufficient conditions for the mean square exponential stability of the switched stochastic system are first established in terms of LMIs. Numerical example is given to show the effectiveness of the obtained result.
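
    The delay-dependent LMIs of the paper are considerably more involved, but the basic pattern of certifying stability via LMI feasibility can be sketched for a single, delay-free discrete-time linear system using cvxpy (assumed available): find a Lyapunov matrix P ≻ 0 with AᵀPA − P ≺ 0. The system matrix below is illustrative.

```python
import numpy as np
import cvxpy as cp

# Basic LMI certificate of exponential stability for x_{k+1} = A x_k:
# find P > 0 (positive definite) such that A' P A - P < 0.
A = np.array([[0.6, 0.2],
              [-0.1, 0.5]])
n = A.shape[0]
eps = 1e-6   # small margin to enforce strict definiteness numerically

P = cp.Variable((n, n), symmetric=True)
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)

print("status:", prob.status)
print("Lyapunov matrix P:\n", np.round(P.value, 3))
```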

  13. Does regular practice of physical activity reduce the risk of dysphonia?

    Science.gov (United States)

    Assunção, Ada Avila; de Medeiros, Adriane Mesquita; Barreto, Sandhi Maria; Gama, Ana Cristina Cortes

    2009-12-01

    The purpose of this study was to investigate the association between regular physical activity and the prevalence of dysphonia. A cross-sectional study was conducted with 3142 teachers from 129 municipal public schools in the city of Belo Horizonte, Brazil. The dependent variable, dysphonia, was classified (absent or present) according to reported symptoms (fatigue when speaking and loss of voice quality), their frequency (occasionally and daily), and duration (past 15 days). The independent variable was regular physical activity. The degree of association was estimated based on the prevalence ratio and a 95% confidence interval obtained by the Poisson regression adapted for cross-sectional studies. In the study sample, the prevalence of dysphonia in teachers was 15.63%. Nearly half (47.52%) of the teachers reported no regular practice of physical exercises. The remaining teachers (52.48%) walked and did physical exercises, sports, and other activities; 31.25% undertook these activities once or twice a week, and 21.23% exercised three or more times a week. Teachers who did not practice physical activity were more likely to present dysphonia compared to those that exercised three or more times a week. Regular physical activity was associated positively with the prevalence of dysphonia.

  14. INTRINSIC TOPOLOGY AND REFINEMENT OF HUTTON UNIT INTERVAL

    Institute of Scientific and Technical Information of China (English)

    王国俊; 徐罗山

    1992-01-01

    This paper introduces the theory of continuous lattices into the study of the Hutton unit interval I(L). Some theorems related to I(L) are concisely proved. A kind of intrinsic topology is applied to refine the topology of I(L), and a new fuzzy unit interval, called the H(λ) unit interval, is defined. Based on the H(λ) unit interval, H(λ)-complete regularity is introduced. Also, the theory of H(λ)-Stone-Čech compactifications is established.

  15. Fault detection for discrete-time LPV systems using interval observers

    Science.gov (United States)

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-10-01

    This paper is concerned with the fault detection (FD) problem for discrete-time linear parameter-varying systems subject to bounded disturbances. A parameter-dependent FD interval observer is designed based on parameter-dependent Lyapunov and slack matrices. The design method is presented by translating the parameter-dependent linear matrix inequalities (LMIs) into finite ones. In contrast to the existing results based on parameter-independent and diagonal Lyapunov matrices, the derived disturbance attenuation, fault sensitivity and nonnegative conditions lead to less conservative LMI characterisations. Furthermore, without the need to design the residual evaluation functions and thresholds, the residual intervals generated by the interval observers are used directly for FD decision. Finally, simulation results are presented for showing the effectiveness and superiority of the proposed method.
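
    A much-simplified sketch of the interval idea is given below: if the dynamics matrix is (or has been transformed to be) nonnegative, lower and upper state bounds can be propagated so that they always enclose the true state, and FD residual intervals would be built on top of such bounds. The paper's parameter-dependent observer design with output injection and LMI conditions is not reproduced; all names and values are illustrative.

```python
import numpy as np

def interval_observer_step(A, B, x_lo, x_hi, u, w_lo, w_hi):
    """One step of a naive interval bound propagation for x+ = A x + B u + w,
    assuming every entry of A is nonnegative so the ordering of bounds is preserved."""
    assert np.all(A >= 0), "this simple bound propagation needs a nonnegative A"
    x_lo_next = A @ x_lo + B @ u + w_lo
    x_hi_next = A @ x_hi + B @ u + w_hi
    return x_lo_next, x_hi_next

rng = np.random.default_rng(5)
A = np.array([[0.5, 0.1], [0.2, 0.4]])       # nonnegative and Schur stable
B = np.array([[1.0], [0.0]])
w_lo, w_hi = -0.05 * np.ones(2), 0.05 * np.ones(2)

x = np.array([0.3, -0.2])                     # true (unknown) state
x_lo, x_hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
for k in range(30):
    u = np.array([np.sin(0.2 * k)])
    w = rng.uniform(w_lo, w_hi)
    x = A @ x + B @ u + w
    x_lo, x_hi = interval_observer_step(A, B, x_lo, x_hi, u, w_lo, w_hi)
    assert np.all(x_lo <= x) and np.all(x <= x_hi)   # bounds always enclose the state
print("final interval width:", x_hi - x_lo)
```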

  16. New precession expressions, valid for long time intervals

    Science.gov (United States)

    Vondrák, J.; Capitaine, N.; Wallace, P.

    2011-10-01

    Context. The present IAU model of precession, like its predecessors, is given as a set of polynomial approximations of various precession parameters intended for high-accuracy applications over a limited time span. Earlier comparisons with numerical integrations have shown that this model is valid only for a few centuries around the basic epoch, J2000.0, while for more distant epochs it rapidly diverges from the numerical solution. In our preceding studies we also obtained preliminary developments for the precessional contribution to the motion of the equator: coordinates X,Y of the precessing pole and precession parameters ψA,ωA, suitable for use over long time intervals. Aims: The goal of the present paper is to obtain upgraded developments for various sets of precession angles that would fit modern observations near J2000.0 and at the same time fit numerical integration of the motions of solar system bodies on scales of several thousand centuries. Methods: We used the IAU 2006 solutions to represent the precession of the ecliptic and of the equator close to J2000.0 and, for more distant epochs, a numerical integration using the Mercury 6 package and solutions by Laskar et al. (1993, A&A, 270, 522) with upgraded initial conditions and constants to represent the ecliptic, and general precession and obliquity, respectively. From them, different precession parameters were calculated in the interval ± 200 millennia from J2000.0, and analytical expressions are found that provide a good fit for the whole interval. Results: Series for the various precessional parameters, comprising a cubic polynomial plus from 8 to 14 periodic terms, are derived that allow precession to be computed with an accuracy comparable to IAU 2006 around the central epoch J2000.0, a few arcseconds throughout the historical period, and a few tenths of a degree at the ends of the ± 200 millennia time span. Computer algorithms are provided that compute the ecliptic and mean equator poles and the

  17. Cardiac Time Intervals by Tissue Doppler Imaging M-Mode

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Mogelvang, Rasmus; de Knegt, Martina Chantal

    2016-01-01

    PURPOSE: To define normal values of the cardiac time intervals obtained by tissue Doppler imaging (TDI) M-mode through the mitral valve (MV). Furthermore, to evaluate the association of the myocardial performance index (MPI) obtained by TDI M-mode (MPITDI) and the conventional method of obtaining...

  18. Processing of sub- and supra-second intervals in the primate brain results from the calibration of neuronal oscillators via sensory, motor, and feedback processes

    Science.gov (United States)

    Gupta, Daya S.

    2014-01-01

    The processing of time intervals in the sub- to supra-second range by the brain is critical for the interaction of primates with their surroundings in activities such as foraging and hunting. For accurate processing of time intervals by the brain, a representation of physical time within neuronal circuits is necessary. I propose that the time dimension of the physical surroundings is represented in the brain by different types of neuronal oscillators, generating spikes or spike bursts at regular intervals. The proposed oscillators include pacemaker neurons, tonic inputs, and the synchronized excitation and inhibition of interconnected neurons. Oscillators, which are built inside various circuits of the brain, help to form modular clocks that process time intervals or other temporal characteristics specific to the functions of a circuit. Relative or absolute duration is represented within neuronal oscillators by a “neural temporal unit,” defined as the interval between regularly occurring spikes or spike bursts. Oscillator output is processed to produce changes in the activity of neurons, named frequency modulator neurons, wired within a separate module; the rate of change in frequency and the frequency of activity of these neurons are proposed to encode time intervals. Inbuilt oscillators are calibrated by (a) feedback processes, (b) input of time intervals resulting from rhythmic external sensory stimulation, and (c) synchronous effects of feedback processes and evoked sensory activity. A single active clock is proposed per circuit, calibrated by one or more of these mechanisms. Multiple calibration mechanisms, inbuilt oscillators, and the presence of modular connections prevent a complete loss of the interval timing functions of the brain. PMID:25136321

  19. Discriminator/time interval meter system evaluation report

    Energy Technology Data Exchange (ETDEWEB)

    Condreva, K. J.

    1976-04-12

    The purpose of this report is to discuss the evaluation of a modular prototype Discriminator/Time Interval Meter data acquisition unit as a useful tool in a digital diagnostics system. The characteristics, operation and calibration of each of the hardware components are discussed in some detail. A discussion of the system calibration, operation, and data ingestion and reduction is also given. System test results to date are given and discussed. Finally, recommendations and conclusions concerning the capabilities of the Discriminator/T.I.M. system based on test and calibration results to date are given.

  20. Discriminator/time interval meter system evaluation report

    International Nuclear Information System (INIS)

    Condreva, K.J.

    1976-01-01

    The purpose of this report is to discuss the evaluation of a modular prototype Discriminator/Time Interval Meter data acquisition unit as a useful tool in a digital diagnostics system. The characteristics, operation and calibration of each of the hardware components are discussed in some detail. A discussion of the system calibration, operation, and data ingestion and reduction is also given. System test results to date are given and discussed. Finally, recommendations and conclusions concerning the capabilities of the Discriminator/T.I.M. system based on test and calibration results to date are given

  1. The 22nd Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting

    International Nuclear Information System (INIS)

    Sydnor, R.L.

    1990-05-01

    Papers presented at the 22nd Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting are compiled. The following subject areas are covered: Rb, Cs, and H-based frequency standards and cryogenic and trapped-ion technology; satellite laser tracking networks, GLONASS timing, intercomparison of national time scales and international telecommunications; telecommunications, power distribution, platform positioning, and geophysical survey industries; military communications and navigation systems; and dissemination of precise time and frequency by means of GPS, GLONASS, MIL STAR, LORAN, and synchronous communication satellites

  2. Probing interval timing with scalp-recorded electroencephalography (EEG).

    Science.gov (United States)

    Ng, Kwun Kei; Penney, Trevor B

    2014-01-01

    Humans, and other animals, are able to easily learn the durations of events and the temporal relationships among them in spite of the absence of a dedicated sensory organ for time. This chapter summarizes the investigation of timing and time perception using scalp-recorded electroencephalography (EEG), a non-invasive technique that measures brain electrical potentials on a millisecond time scale. Over the past several decades, much has been learned about interval timing through the examination of the characteristic features of averaged EEG signals (i.e., event-related potentials, ERPs) elicited in timing paradigms. For example, the mismatch negativity (MMN) and omission potential (OP) have been used to study implicit and explicit timing, respectively, the P300 has been used to investigate temporal memory updating, and the contingent negative variation (CNV) has been used as an index of temporal decision making. In sum, EEG measures provide biomarkers of temporal processing that allow researchers to probe the cognitive and neural substrates underlying time perception.

  3. Regular exercisers have stronger pelvic floor muscles than nonregular exercisers at midpregnancy.

    Science.gov (United States)

    Bø, Kari; Ellstrøm Engh, Marie; Hilde, Gunvor

    2018-04-01

    Today all healthy pregnant women are encouraged to be physically active throughout pregnancy, with recommendations to participate in at least 30 minutes of aerobic activity on most days of the week in addition to performing strength training of the major muscle groups 2-3 days per week and also pelvic floor muscle training. There is, however, an ongoing debate whether general physical activity enhances or impairs pelvic floor muscle function. The objectives of the study were to compare vaginal resting pressure, pelvic floor muscle strength, and endurance in regular exercisers (exercise ≥30 minutes 3 or more times per week) and nonexercisers at midpregnancy. Furthermore, another objective was to assess whether regular general exercise or pelvic floor muscle strength was associated with urinary incontinence. This was a cross-sectional study at mean gestational week 20.9 (±1.4) including 218 nulliparous pregnant women, with a mean age of 28.6 years (range, 19-40 years) and prepregnancy body mass index of 23.9 kg/m2 (SD, 4.0). Vaginal resting pressure, pelvic floor muscle strength, and pelvic floor muscle endurance were measured by a high-precision pressure transducer connected to a vaginal balloon. The International Consultation on Incontinence Questionnaire Urinary Incontinence Short Form was used to assess urinary incontinence. Differences between groups were analyzed using an independent-sample Student t test. Linear regression analysis was conducted to adjust for prepregnancy body mass index, age, smoking during pregnancy, and regular pelvic floor muscle training during pregnancy. The significance value was set to P ≤ .05. Regular exercisers had statistically significantly stronger (mean 6.4 cm H2O [95% confidence interval, 1.7-11.2]) and more enduring (mean 39.9 cm H2O·s [95% confidence interval, 42.2-75.7]) pelvic floor muscles. Only pelvic floor muscle strength remained statistically significant when adjusting for possible confounders. Pelvic floor

  4. Frequency interval balanced truncation of discrete-time bilinear systems

    DEFF Research Database (Denmark)

    Jazlan, Ahmad; Sreeram, Victor; Shaker, Hamid Reza

    2016-01-01

    This paper presents the development of a new model reduction method for discrete-time bilinear systems based on the balanced truncation framework. In many model reduction applications, it is advantageous to analyze the characteristics of the system with emphasis on particular frequency intervals...... are the solution to a pair of new generalized Lyapunov equations. The conditions for solvability of these new generalized Lyapunov equations are derived and a numerical solution method for solving these generalized Lyapunov equations is presented. Numerical examples which illustrate the usage of the new...... generalized frequency interval controllability and observability gramians as part of the balanced truncation framework are provided to demonstrate the performance of the proposed method....
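
    As a baseline for the idea, the sketch below performs standard (full-frequency) square-root balanced truncation of a discrete-time linear system, computing the controllability and observability gramians from discrete Lyapunov equations with SciPy; the paper's frequency interval gramians for bilinear systems instead solve generalized Lyapunov equations. The test system is synthetic.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Standard square-root balanced truncation of a stable discrete-time LTI system."""
    Wc = solve_discrete_lyapunov(A, B @ B.T)          # A Wc A' - Wc + B B' = 0
    Wo = solve_discrete_lyapunov(A.T, C.T @ C)        # A' Wo A - Wo + C' C = 0
    jitter = 1e-12 * np.eye(len(A))                   # numerical safety for Cholesky
    Lc = cholesky(Wc + jitter, lower=True)
    Lo = cholesky(Wo + jitter, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                         # Hankel singular values in s
    S_r = np.diag(s[:r] ** -0.5)
    T = Lc @ Vt[:r].T @ S_r                           # right projection
    Ti = S_r @ U[:, :r].T @ Lo.T                      # left projection
    return Ti @ A @ T, Ti @ B, C @ T, s

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    A = np.diag([0.9, 0.5, 0.1, 0.05]) + 0.01 * rng.normal(size=(4, 4))
    B = rng.normal(size=(4, 1))
    C = rng.normal(size=(1, 4))
    Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
    print("Hankel singular values:", np.round(hsv, 4))
```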

  5. Count-to-count time interval distribution analysis in a fast reactor

    International Nuclear Information System (INIS)

    Perez-Navarro Gomez, A.

    1973-01-01

    The most important kinetic parameters have been measured at the zero power fast reactor CORAL-I by means of the reactor noise analysis in the time domain, using measurements of the count-to-count time intervals. (Author) 69 refs

  6. More consistent, yet less sensitive : Interval timing in autism spectrum disorders

    NARCIS (Netherlands)

    Falter, Christine M.; Noreika, Valdas; Wearden, John H.; Bailey, Anthony J.

    2012-01-01

    Even though phenomenological observations and anecdotal reports suggest atypical time processing in individuals with an autism spectrum disorder (ASD), very few psychophysical studies have investigated interval timing, and the obtained results are contradictory. The present study aimed to clarify

  7. Relativistic time-dependent Fermion-mass renormalization using statistical regularization

    Science.gov (United States)

    Kutnink, Timothy; McMurray, Christian; Santrach, Amelia; Hockett, Sarah; Barcus, Scott; Petridis, Athanasios

    2017-09-01

    The time-dependent electromagnetically self-coupled Dirac equation is solved numerically by means of the staggered-leap-frog algorithm with reflecting boundary conditions. The stability region of the method versus the interaction strength and the spatial-grid size over time-step ratio is established. The expectation values of several dynamic operators are then evaluated as functions of time. These include the fermion and electromagnetic energies and the fermion dynamic mass. There is a characteristic, non-exponential, oscillatory dependence leading to asymptotic constants of these expectation values. In the case of the fermion mass this amounts to renormalization. The dependence of the expectation values on the spatial-grid size is evaluated in detail. Furthermore, the contribution of positive and negative energy states to the asymptotic values and the gauge fields is analyzed. Statistical regularization, employing a canonical ensemble whose temperature is the inverse of the grid size, is used to remove the grid-size and momentum-dependence and produce a finite result in the continuum limit.

  8. Semiparametric regression analysis of failure time data with dependent interval censoring.

    Science.gov (United States)

    Chen, Chyong-Mei; Shen, Pao-Sheng

    2017-09-20

    Interval-censored failure-time data arise when subjects are examined or observed periodically such that the failure time of interest is not examined exactly but only known to be bracketed between two adjacent observation times. The commonly used approaches assume that the examination times and the failure time are independent or conditionally independent given covariates. In many practical applications, patients who are already in poor health or have a weak immune system before treatment usually tend to visit physicians more often after treatment than those with better health or immune systems. In this situation, the visiting rate is positively correlated with the risk of failure due to the health status, which results in dependent interval-censored data. While some measurable factors affecting health status, such as age, gender, and physical symptoms, can be included in the covariates, some health-related latent variables cannot be observed or measured. To deal with dependent interval censoring involving an unobserved latent variable, we characterize the visiting/examination process as a recurrent event process and propose a joint frailty model to account for the association of the failure time and visiting process. A shared gamma frailty is incorporated into the Cox model and the proportional intensity model for the failure time and visiting process, respectively, in a multiplicative way. We propose a semiparametric maximum likelihood approach for estimating model parameters and show the asymptotic properties, including consistency and weak convergence. Extensive simulation studies are conducted and a data set of bladder cancer is analyzed for illustrative purposes. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Travel time tomography with local image regularization by sparsity constrained dictionary learning

    Science.gov (United States)

    Bianco, M.; Gerstoft, P.

    2017-12-01

    We propose a regularization approach for 2D seismic travel time tomography which models small rectangular groups of slowness pixels, within an overall or `global' slowness image, as sparse linear combinations of atoms from a dictionary. The groups of slowness pixels are referred to as patches and a dictionary corresponds to a collection of functions or `atoms' describing the slowness in each patch. These functions could for example be wavelets. The patch regularization is incorporated into the global slowness image. The global image models the broad features, while the local patch images incorporate prior information from the dictionary. Further, high resolution slowness within patches is permitted if the travel times from the global estimates support it. The proposed approach is formulated as an algorithm, which is repeated until convergence is achieved: 1) From travel times, find the global slowness image with a minimum energy constraint on the pixel variance relative to a reference. 2) Find the patch level solutions to fit the global estimate as a sparse linear combination of dictionary atoms. 3) Update the reference as the weighted average of the patch level solutions. This approach relies on the redundancy of the patches in the seismic image. Redundancy means that the patches are repetitions of a finite number of patterns, which are described by the dictionary atoms. Redundancy in the earth's structure was demonstrated in previous works in seismics where dictionaries of wavelet functions regularized inversion. We further exploit redundancy of the patches by using dictionary learning algorithms, a form of unsupervised machine learning, to estimate optimal dictionaries from the data in parallel with the inversion. We demonstrate our approach on densely, but irregularly sampled synthetic seismic images.
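
    The patch-level sparse coding step can be illustrated with scikit-learn (assumed available): a dictionary is learned from patches of a synthetic slowness-like image and each patch is re-approximated as a sparse combination of atoms. This covers only the local regularization ingredient, not the travel time inversion or the alternating global/patch algorithm.

```python
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

rng = np.random.default_rng(7)
# Synthetic "slowness" image: smooth background plus a sharp anomaly
xg, yg = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
slowness = 0.5 + 0.1 * np.sin(4 * xg) * np.cos(3 * yg)
slowness[20:28, 30:38] += 0.2

patch_size = (8, 8)
patches = extract_patches_2d(slowness, patch_size)
X = patches.reshape(len(patches), -1)
mean = X.mean(axis=1, keepdims=True)
X0 = X - mean                                           # learn atoms on zero-mean patches

dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, batch_size=64,
                                   random_state=0).fit(X0)
codes = sparse_encode(X0, dico.components_, algorithm="omp", n_nonzero_coefs=3)
recon_patches = (codes @ dico.components_ + mean).reshape(patches.shape)
recon = reconstruct_from_patches_2d(recon_patches, slowness.shape)
print("patch-level RMS error:", np.sqrt(np.mean((recon - slowness) ** 2)))
```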

  10. Robust stability analysis of uncertain stochastic neural networks with interval time-varying delay

    International Nuclear Information System (INIS)

    Feng Wei; Yang, Simon X.; Fu Wei; Wu Haixia

    2009-01-01

    This paper addresses the stability analysis problem for uncertain stochastic neural networks with interval time-varying delays. The parameter uncertainties are assumed to be norm bounded, and the delay factor is assumed to be time-varying and belong to a given interval, which means that the lower and upper bounds of interval time-varying delays are available. A sufficient condition is derived such that for all admissible uncertainties, the considered neural network is robustly, globally, asymptotically stable in the mean square. Some stability criteria are formulated by means of the feasibility of a linear matrix inequality (LMI), which can be effectively solved by some standard numerical packages. Finally, numerical examples are provided to demonstrate the usefulness of the proposed criteria.

  11. Short-time regularity assessment of fibrillatory waves from the surface ECG in atrial fibrillation

    International Nuclear Information System (INIS)

    Alcaraz, Raúl; Martínez, Arturo; Hornero, Fernando; Rieta, José J

    2012-01-01

    This paper proposes the first non-invasive method for direct and short-time regularity quantification of atrial fibrillatory (f) waves from the surface ECG in atrial fibrillation (AF). Regularity is estimated by computing individual morphological variations among f waves, which are delineated and extracted from the atrial activity (AA) signal, making use of an adaptive signed correlation index. The algorithm was tested on real AF surface recordings in order to discriminate atrial signals with different organization degrees, providing a notably higher global accuracy (90.3%) than the two non-invasive AF organization estimates defined to date: the dominant atrial frequency (70.5%) and sample entropy (76.1%). Furthermore, due to its ability to assess AA regularity wave to wave, the proposed method is also able to pursue AF organization time course more precisely than the aforementioned indices. As a consequence, this work opens a new perspective in the non-invasive analysis of AF, such as the individualized study of each f wave, that could improve the understanding of AF mechanisms and become useful for its clinical treatment. (paper)

  12. Internal representations of temporal statistics and feedback calibrate motor-sensory interval timing.

    Directory of Open Access Journals (Sweden)

    Luigi Acerbi

    Full Text Available Humans have been shown to adapt to the temporal statistics of timing tasks so as to optimize the accuracy of their responses, in agreement with the predictions of Bayesian integration. This suggests that they build an internal representation of both the experimentally imposed distribution of time intervals (the prior and of the error (the loss function. The responses of a Bayesian ideal observer depend crucially on these internal representations, which have only been previously studied for simple distributions. To study the nature of these representations we asked subjects to reproduce time intervals drawn from underlying temporal distributions of varying complexity, from uniform to highly skewed or bimodal while also varying the error mapping that determined the performance feedback. Interval reproduction times were affected by both the distribution and feedback, in good agreement with a performance-optimizing Bayesian observer and actor model. Bayesian model comparison highlighted that subjects were integrating the provided feedback and represented the experimental distribution with a smoothed approximation. A nonparametric reconstruction of the subjective priors from the data shows that they are generally in agreement with the true distributions up to third-order moments, but with systematically heavier tails. In particular, higher-order statistical features (kurtosis, multimodality seem much harder to acquire. Our findings suggest that humans have only minor constraints on learning lower-order statistical properties of unimodal (including peaked and skewed distributions of time intervals under the guidance of corrective feedback, and that their behavior is well explained by Bayesian decision theory.
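
    A minimal simulation of the Bayesian-observer account discussed above: measurements corrupted by scalar (Weber-law) noise are combined with a discrete prior over the experimental intervals, and the posterior mean is taken as the reproduction, producing the characteristic regression toward the prior mean. The prior, Weber fraction, and implicit squared-error loss are illustrative assumptions; the paper's full observer-actor model and feedback integration are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(8)
wm = 0.15                                     # Weber fraction of the measurement noise
sample_intervals = np.linspace(0.6, 1.0, 5)   # assumed prior: uniform over 600-1000 ms
prior = np.ones_like(sample_intervals) / len(sample_intervals)

def reproduce(t_true, n_trials=2000):
    """Posterior-mean estimates of an interval under scalar measurement noise."""
    m = t_true * (1 + wm * rng.standard_normal(n_trials))   # noisy measurements
    # Gaussian likelihood of each candidate interval, with sd proportional to the interval
    sd = wm * sample_intervals[None, :]
    like = np.exp(-(m[:, None] - sample_intervals[None, :]) ** 2 / (2 * sd ** 2)) / sd
    post = like * prior
    post /= post.sum(axis=1, keepdims=True)
    return post @ sample_intervals                            # posterior mean per trial

for t in sample_intervals:
    est = reproduce(t)
    print(f"true {t*1000:4.0f} ms -> mean estimate {est.mean()*1000:5.0f} ms, "
          f"sd {est.std()*1000:4.0f} ms")
```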

  13. The importance of time interval to development of second tumor in metachronous bilateral wilms' tumor

    International Nuclear Information System (INIS)

    Paulino, Arnold C.; Thakkar, Bharat; Henderson, William G.

    1997-01-01

    Purpose: To determine whether the time interval to development of second tumor is a prognostic factor for overall survival in children with metachronous bilateral Wilms' tumor and to give a recommendation regarding screening of the contralateral kidney in patients with Wilms' tumor. Materials and Management: A literature search using MEDLINE was performed of manuscripts in the English language from 1950-1996 and identified 108 children with metachronous bilateral Wilms' tumor. Children were classified according to time interval to development of a contralateral Wilms' tumor ( 78 mos (2), 78 - < 84 mos (1), 84 - < 90 mos (0), 90 - < 96 mos (1), ≥ 96 mos (0). Analysis of overall survival in patients with a time interval of < 18 months and ≥ 18 months showed a 10 year survival of 39.6% and 55.2%, respectively (p = 0.024, log-rank test). Conclusions: Children with metachronous bilateral Wilms' tumor who develop a contralateral tumor at a time interval of ≥ 18 months from the initial Wilms' tumor had a better overall survival than children with a time interval of < 18 months. Screening by abdominal ultrasound of the contralateral kidney for more than 5 years after initial diagnosis of Wilms' tumor may not be necessary since 102/106 (96.2%) of children had a time interval to second tumor of < 60 months

  14. Procedure prediction from symbolic Electronic Health Records via time intervals analytics.

    Science.gov (United States)

    Moskovitch, Robert; Polubriaginof, Fernanda; Weiss, Aviram; Ryan, Patrick; Tatonetti, Nicholas

    2017-11-01

    Prediction of medical events, such as clinical procedures, is essential for preventing disease, understanding disease mechanisms, and increasing patient quality of care. Although longitudinal clinical data from Electronic Health Records provide opportunities to develop predictive models, the use of these data faces significant challenges. Primarily, while the data are longitudinal and represent thousands of conceptual events having duration, they are also sparse, complicating the application of traditional analysis approaches. Furthermore, the framework presented here takes advantage of the events' durations and gaps. International standards for electronic healthcare data represent data elements, such as procedures, conditions, and drug exposures, using eras, or time intervals. Such eras contain both an event and a duration and enable the application of time intervals mining - a relatively new subfield of data mining. In this study, we present Maitreya, a framework for time intervals analytics in longitudinal clinical data. Maitreya discovers frequent time intervals related patterns (TIRPs), which we use as prognostic markers for modelling clinical events. We introduce three novel TIRP metrics that are normalized versions of the horizontal-support, which represents the number of TIRP instances per patient. We evaluate Maitreya on 28 frequent and clinically important procedures, using the three novel TIRP representation metrics in comparison to no temporal representation and previous TIRP metrics. We also evaluate the epsilon value that makes Allen's relations more flexible, with settings of 30, 60, 90, and 180 days in comparison to the default of zero. For twenty-two of these procedures, the use of temporal patterns as predictors was superior to non-temporal features, and the use of the vertically normalized horizontal support metric to represent TIRPs as features was most effective. The use of the epsilon value of thirty days was slightly better than the zero
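
    The flexible temporal relations mentioned above can be sketched as a small classifier of Allen's relations between two eras, with an epsilon tolerance that treats near-equal endpoints as equal; the era values are hypothetical and the TIRP mining itself is not implemented.

```python
from typing import Tuple

Interval = Tuple[float, float]   # (start, end), e.g. days since the first record

def allen_relation(a: Interval, b: Interval, eps: float = 0.0) -> str:
    """Allen's relation of interval a to interval b (a is assumed to start no later
    than b), with an epsilon tolerance on endpoint equality."""
    a_s, a_e = a
    b_s, b_e = b
    def eq(x, y):
        return abs(x - y) <= eps
    if eq(a_s, b_s) and eq(a_e, b_e): return "equals"
    if eq(a_e, b_s):                  return "meets"
    if a_e < b_s:                     return "before"
    if eq(a_s, b_s):                  return "starts" if a_e < b_e else "started-by"
    if eq(a_e, b_e):                  return "finished-by" if a_s < b_s else "finishes"
    if a_e < b_e:                     return "overlaps"
    return "contains"

if __name__ == "__main__":
    drug_era = (0.0, 95.0)           # e.g. a drug exposure era (days)
    condition_era = (90.0, 180.0)    # e.g. a condition era
    print(allen_relation(drug_era, condition_era))             # overlaps
    print(allen_relation(drug_era, condition_era, eps=30.0))   # meets (30-day tolerance)
```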

  15. Opposite Distortions in Interval Timing Perception for Visual and Auditory Stimuli with Temporal Modulations.

    Science.gov (United States)

    Yuasa, Kenichi; Yotsumoto, Yuko

    2015-01-01

    When an object is presented visually and moves or flickers, the perception of its duration tends to be overestimated. Such an overestimation is called time dilation. Perceived time can also be distorted when a stimulus is presented aurally as an auditory flutter, but the mechanisms and their relationship to visual processing remain unclear. In the present study, we measured interval timing perception while modulating the temporal characteristics of visual and auditory stimuli, and investigated whether the interval timing of visually and aurally presented objects shares a common mechanism. In these experiments, participants compared the durations of flickering or fluttering stimuli to standard stimuli, which were presented continuously. Perceived durations for auditory flutters were underestimated, while perceived durations of visual flickers were overestimated. When auditory flutters and visual flickers were presented simultaneously, these distortion effects were cancelled out. When auditory flutters were presented with a constantly presented visual stimulus, the interval timing perception of the visual stimulus was affected by the auditory flutters. These results indicate that interval timing perception is governed by independent mechanisms for visual and auditory processing, and that there are some interactions between the two processing systems.

  16. Changes in crash risk following re-timing of traffic signal change intervals.

    Science.gov (United States)

    Retting, Richard A; Chapline, Janella F; Williams, Allan F

    2002-03-01

    More than 1 million motor vehicle crashes occur annually at signalized intersections in the USA. The principal method used to prevent crashes associated with routine changes in signal indications is employment of a traffic signal change interval--a brief yellow and all-red period that follows the green indication. No universal practice exists for selecting the duration of change intervals, and little is known about the influence of the duration of the change interval on crash risk. The purpose of this study was to estimate potential crash effects of modifying the duration of traffic signal change intervals to conform with values associated with a proposed recommended practice published by the Institute of Transportation Engineers. A sample of 122 intersections was identified and randomly assigned to experimental and control groups. Of 51 eligible experimental sites, 40 (78%) needed signal timing changes. For the 3-year period following implementation of signal timing changes, there was an 8% reduction in reportable crashes at experimental sites relative to those occurring at control sites (P = 0.08). For injury crashes, a 12% reduction at experimental sites relative to those occurring at control sites was found (P = 0.03). Pedestrian and bicycle crashes at experimental sites decreased 37% (P = 0.03) relative to controls. Given these results and the relatively low cost of re-timing traffic signals, modifying the duration of traffic signal change intervals to conform with values associated with the Institute of Transportation Engineers' proposed recommended practice should be strongly considered by transportation agencies to reduce the frequency of urban motor vehicle crashes.

  17. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    Science.gov (United States)

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  18. A polynomial time algorithm for checking regularity of totally normed process algebra

    NARCIS (Netherlands)

    Yang, F.; Huang, H.

    2015-01-01

    A polynomial algorithm for the regularity problem of weak and branching bisimilarity on totally normed process algebra (PA) processes is given. Its time complexity is O(n³ + mn), where n is the number of transition rules and m is the maximal length of the rules. The algorithm works for...

  19. Periodontal Disease, Regular Dental Care Use, and Incident Ischemic Stroke.

    Science.gov (United States)

    Sen, Souvik; Giamberardino, Lauren D; Moss, Kevin; Morelli, Thiago; Rosamond, Wayne D; Gottesman, Rebecca F; Beck, James; Offenbacher, Steven

    2018-02-01

    Periodontal disease is independently associated with cardiovascular disease. Identification of periodontal disease as a risk factor for incident ischemic stroke raises the possibility that regular dental care utilization may reduce the stroke risk. In the ARIC (Atherosclerosis Risk in Communities) study, patterns of dental visits were used to classify participants as regular or episodic dental care users. In the ancillary dental ARIC study, selected subjects from ARIC underwent full-mouth periodontal measurements collected at 6 sites per tooth and classified into 7 periodontal profile classes (PPCs). Of the 10 362 stroke-free participants in the ARIC study, 584 had incident ischemic strokes over a 15-year period. In the dental ARIC study, 6736 dentate subjects were assessed for periodontal disease status using PPC with a total of 299 incident ischemic strokes over the 15-year period. The 7 levels of PPC showed a trend toward an increased stroke risk (χ² trend P periodontal disease). Periodontal disease was significantly associated with cardioembolic (hazard ratio, 2.6; 95% confidence interval, 1.2-5.6) and thrombotic (hazard ratio, 2.2; 95% confidence interval, 1.3-3.8) stroke subtypes. Regular dental care utilization was associated with lower adjusted stroke risk (hazard ratio, 0.77; 95% confidence interval, 0.63-0.94). We confirm an independent association between periodontal disease and incident stroke risk, particularly the cardioembolic and thrombotic stroke subtypes. Further, we report that regular dental care utilization may lower this risk for stroke. © 2018 American Heart Association, Inc.

  20. Embodiment and the origin of interval timing: kinematic and electromyographic data.

    Science.gov (United States)

    Addyman, Caspar; Rocha, Sinead; Fautrelle, Lilian; French, Robert M; Thomas, Elizabeth; Mareschal, Denis

    2017-03-01

    Recent evidence suggests that interval timing (the judgment of durations lasting from approximately 500 ms to a few minutes) is closely coupled to the action control system. We used surface electromyography (EMG) and motion capture technology to explore the emergence of this coupling in 4-, 6-, and 8-month-olds. We engaged infants in an active and socially relevant arm-raising task with seven cycles and a response period. In one condition, cycles were slow (every 4 s); in another, they were fast (every 2 s). In the slow condition, we found evidence of time-locked sub-threshold EMG activity even in the absence of any observed overt motor responses at all three ages. This study shows that EMGs can be a more sensitive measure of interval timing in early development than overt behavior.

  1. Decoding Complex Cognitive States Online by Manifold Regularization in Real-Time fMRI

    DEFF Research Database (Denmark)

    Hansen, Toke Jansen; Hansen, Lars Kai; Madsen, Kristoffer Hougaard

    2011-01-01

    Human decision making is complex and influenced by many factors on multiple time scales, reflected in the numerous brain networks and connectivity patterns involved as revealed by fMRI. We address mislabeling issues in paradigms involving complex cognition, by considering a manifold regularizing...

  2. Beat-to-beat systolic time-interval measurement from heart sounds and ECG

    International Nuclear Information System (INIS)

    Paiva, R P; Carvalho, P; Couceiro, R; Henriques, J; Antunes, M; Quintal, I; Muehlsteff, J

    2012-01-01

    Systolic time intervals are highly correlated to fundamental cardiac functions. Several studies have shown that these measurements have significant diagnostic and prognostic value in heart failure condition and are adequate for long-term patient follow-up and disease management. In this paper, we investigate the feasibility of using heart sound (HS) to accurately measure the opening and closing moments of the aortic heart valve. These moments are crucial to define the main systolic timings of the heart cycle, i.e. pre-ejection period (PEP) and left ventricular ejection time (LVET). We introduce an algorithm for automatic extraction of PEP and LVET using HS and electrocardiogram. PEP is estimated with a Bayesian approach using the signal's instantaneous amplitude and patient-specific time intervals between atrio-ventricular valve closure and aortic valve opening. As for LVET, since the aortic valve closure corresponds to the start of the S2 HS component, we base LVET estimation on the detection of the S2 onset. A comparative assessment of the main systolic time intervals is performed using synchronous signal acquisitions of the current gold standard in cardiac time-interval measurement, i.e. echocardiography, and HS. The algorithms were evaluated on a healthy population, as well as on a group of subjects with different cardiovascular diseases (CVD). In the healthy group, from a set of 942 heartbeats, the proposed algorithm achieved 7.66 ± 5.92 ms absolute PEP estimation error. For LVET, the absolute estimation error was 11.39 ± 8.98 ms. For the CVD population, 404 beats were used, leading to 11.86 ± 8.30 and 17.51 ± 17.21 ms absolute PEP and LVET errors, respectively. The results achieved in this study suggest that HS can be used to accurately estimate LVET and PEP. (paper)

  3. Time interval measurement between two emissions: Kr + Au

    International Nuclear Information System (INIS)

    Aboufirassi, M; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lefebvres, F.; Lopez, O.; Louvel, M.; Mahi, M.; Steckmeyer, J.C.; Tamain, B.

    1998-01-01

    To illustrate the method allowing the determination of the emission intervals, the results obtained with the Kr + Au system at 43 and 60 A.MeV are presented. The experiments were performed with the NAUTILUS exclusive detectors. Central collisions were selected by means of a relative velocity criterion to reject the events containing a forward emitted fragment. For the two bombardment energies the data analysis shows the formation of a compound of mass around A = 200. Comparison of the fragment dynamical variables with simulations allows conclusions about the simultaneity of the compound deexcitation processes. It was found that a value of 5 MeV/A is able to reproduce the characteristics of the detected fragments. Also, it was found that to reproduce the dynamical characteristics of the fragments issued from central collisions it was not necessary to superimpose a radial collective energy upon the Coulomb and thermal motion. The distribution of the relative angles between detected fragments is used here as a chronometer. For simultaneous ruptures the small relative angles are forbidden by the Coulomb repulsion, while for sequential processes this suppression is lifted progressively as the interval between the two emissions increases. For the system discussed here the comparison between simulation and data has been carried out for the extreme cases, i.e. for a vanishing and an infinite time interval between the two emissions, respectively. More sophisticated simulations to describe angular distributions between the emitted fragments were also developed.

  4. Systolic time intervals vs invasive predictors of fluid responsiveness after coronary artery bypass surgery(dagger)

    NARCIS (Netherlands)

    Smorenberg, A.; Lust, E.J.; Beishuizen, A.; Meijer, J.H.; Verdaasdonk, R.M.; Groeneveld, A.B.J.

    2013-01-01

    OBJECTIVES: Haemodynamic parameters for predicting fluid responsiveness in intensive care patients are invasive, technically challenging or not universally applicable. We compared the initial systolic time interval (ISTI), a non-invasive measure of the time interval between the electrical and

  5. Measuring time series regularity using nonlinear similarity-based sample entropy

    International Nuclear Information System (INIS)

    Xie Hongbo; He Weixing; Liu Hui

    2008-01-01

    Sample Entropy (SampEn), a measure quantifying regularity and complexity, is believed to be an effective method for analyzing diverse settings that include both deterministic chaotic and stochastic processes, and is particularly useful in the analysis of physiological signals that involve relatively small amounts of data. However, the similarity definition of vectors is based on the Heaviside function, whose hard, discontinuous boundary may cause some problems in the validity and accuracy of SampEn. The Sigmoid function is a smoothed and continuous version of the Heaviside function. To overcome the problems SampEn encountered, a modified SampEn (mSampEn) based on the nonlinear Sigmoid function was proposed. The performance of mSampEn was tested on independent identically distributed (i.i.d.) uniform random numbers, the MIX stochastic model, the Rössler map, and the Hénon map. The results showed that mSampEn was superior to SampEn in several aspects, including giving an entropy definition in the case of small parameters, better relative consistency, robustness to noise, and greater independence of record length when characterizing time series generated from either deterministic or stochastic systems with different regularities.
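
    The sigmoid replacement described above can be sketched as follows; the slope of the sigmoid and the use of the Chebyshev distance are assumptions made for illustration, not the paper's exact choices (Python):

      import numpy as np

      def msampen(x, m=2, r=0.2, slope=10.0):
          # Modified sample entropy: the hard Heaviside match (distance < r -> 1)
          # is replaced by a smooth sigmoid of the distance, so near-threshold
          # pairs contribute partially.
          x = np.asarray(x, dtype=float)
          tol = r * np.std(x)

          def phi(mm):
              n = len(x) - mm
              templates = np.array([x[i:i + mm] for i in range(n)])
              total = 0.0
              for i in range(n):
                  d = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
                  s = 1.0 / (1.0 + np.exp(slope * (d - tol)))           # sigmoid similarity
                  total += s.sum() - s[i]                               # exclude self-match
              return total

          b, a = phi(m), phi(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      rng = np.random.default_rng(0)
      print(msampen(rng.standard_normal(300)))          # irregular signal: larger value
      print(msampen(np.sin(np.linspace(0, 30, 300))))   # regular signal: smaller value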

  6. Characterization of Cardiac Time Intervals in Healthy Bonnet Macaques (Macaca radiata) by Using an Electronic Stethoscope

    Science.gov (United States)

    Kamran, Haroon; Salciccioli, Louis; Pushilin, Sergei; Kumar, Paraag; Carter, John; Kuo, John; Novotney, Carol; Lazar, Jason M

    2011-01-01

    Nonhuman primates are used frequently in cardiovascular research. Cardiac time intervals derived by phonocardiography have long been used to assess left ventricular function. Electronic stethoscopes are simple low-cost systems that display heart sound signals. We assessed the use of an electronic stethoscope to measure cardiac time intervals in 48 healthy bonnet macaques (age, 8 ± 5 y) based on recorded heart sounds. Technically adequate recordings were obtained from all animals and required 1.5 ± 1.3 min. The following cardiac time intervals were determined by simultaneously recording acoustic and single-lead electrocardiographic data: electromechanical activation time (QS1), electromechanical systole (QS2), the time interval between the first and second heart sounds (S1S2), and the time interval between the second and first sounds (S2S1). QS2 was correlated with heart rate, mean arterial pressure, diastolic blood pressure, and left ventricular ejection time determined by using echocardiography. S1S2 correlated with heart rate, mean arterial pressure, diastolic blood pressure, left ventricular ejection time, and age. S2S1 correlated with heart rate, mean arterial pressure, diastolic blood pressure, systolic blood pressure, and left ventricular ejection time. QS1 did not correlate with any anthropometric or echocardiographic parameter. The relation S1S2/S2S1 correlated with systolic blood pressure. On multivariate analyses, heart rate was the only independent predictor of QS2, S1S2, and S2S1. In conclusion, determination of cardiac time intervals is feasible and reproducible by using an electronic stethoscope in nonhuman primates. Heart rate is a major determinant of QS2, S1S2, and S2S1 but not QS1; regression equations for reference values for cardiac time intervals in bonnet macaques are provided. PMID:21439218
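
    For concreteness, the four intervals named above can be computed from one beat's fiducial times; the example values below are illustrative only, not measured data (Python):

      def cardiac_time_intervals(q_onset, s1_onset, s2_onset, next_s1_onset):
          # All times in seconds, measured from an arbitrary common origin.
          return {
              "QS1":  s1_onset - q_onset,        # electromechanical activation time
              "QS2":  s2_onset - q_onset,        # electromechanical systole
              "S1S2": s2_onset - s1_onset,       # first to second heart sound
              "S2S1": next_s1_onset - s2_onset,  # second sound to next first sound
          }

      print(cardiac_time_intervals(q_onset=0.00, s1_onset=0.04,
                                   s2_onset=0.32, next_s1_onset=0.78))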

  7. High-intensity interval training improves insulin sensitivity in older individuals

    DEFF Research Database (Denmark)

    Søgaard, D; Lund, M T; Scheuer, C M

    2017-01-01

    AIM: Metabolic health may deteriorate with age as a result of altered body composition and decreased physical activity. Endurance exercise is known to counter these changes delaying or even preventing onset of metabolic diseases. High-intensity interval training (HIIT) is a time efficient alternative to regular endurance exercise, and the aim of this study was to investigate the metabolic benefit of HIIT in older subjects. METHODS: Twenty-two sedentary male (n = 11) and female (n = 11) subjects aged 63 ± 1 years performed HIIT training three times/week for 6 weeks on a bicycle ergometer. Each HIIT session consisted of five 1-minute intervals interspersed with 1½-minute rest. Prior to the first and after the last HIIT session whole-body insulin sensitivity, measured by a hyperinsulinaemic-euglycaemic clamp, plasma lipid levels, HbA1c, glycaemic parameters, body composition and maximal oxygen...

  8. 75 FR 76006 - Regular Meeting

    Science.gov (United States)

    2010-12-07

    ... FARM CREDIT SYSTEM INSURANCE CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. ACTION: Regular meeting. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). Date and Time: The meeting of the Board will be held...

  9. The time interval distribution of sand–dust storms in theory: testing with observational data for Yanchi, China

    International Nuclear Information System (INIS)

    Liu, Guoliang; Zhang, Feng; Hao, Lizhen

    2012-01-01

    We previously introduced a time record model for use in studying the duration of sand–dust storms. In the model, X is the normalized wind speed and Xr is the normalized wind speed threshold for the sand–dust storm. X is represented by a random signal with a normal Gaussian distribution. The storms occur when X ≥ Xr. From this model, the time interval distribution of N = Aexp(−bt) can be deduced, wherein N is the number of time intervals with length greater than t, A and b are constants, and b is related to Xr. In this study, sand–dust storm data recorded in spring at the Yanchi meteorological station in China were analysed to verify whether the time interval distribution of the sand–dust storms agrees with the above time interval distribution. We found that the distribution of the time interval between successive sand–dust storms in April agrees well with the above exponential equation. However, the interval distribution for the sand–dust storm data for the entire spring period displayed a better fit to the Weibull equation and depended on the variation of the sand–dust storm threshold wind speed. (paper)
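
    A small sketch of fitting the stated N = A·exp(−bt) form to interval data; the gap values below are made up for illustration and are not the Yanchi record (Python):

      import numpy as np

      # Hypothetical gaps (days) between successive storms; not the Yanchi data.
      gaps = np.array([1, 2, 2, 3, 5, 6, 8, 10, 13, 17, 21, 30], dtype=float)

      # N(t): number of intervals with length greater than t.
      t = np.linspace(0, gaps.max() - 1, 40)
      N = np.array([(gaps > ti).sum() for ti in t])

      # Fit N = A * exp(-b t) by log-linear least squares on the non-zero counts.
      mask = N > 0
      slope, intercept = np.polyfit(t[mask], np.log(N[mask]), 1)
      A_fit, b_fit = np.exp(intercept), -slope
      print(f"A = {A_fit:.2f}, b = {b_fit:.3f} per day")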

  10. Design of time interval generator based on hybrid counting method

    International Nuclear Information System (INIS)

    Yao, Yuan; Wang, Zhaoqi; Lu, Houbing; Chen, Lian; Jin, Ge

    2016-01-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some “off-the-shelf” TIGs can be employed, the need for a custom test or control system makes TIGs implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on the architecture of the Tapped Delay Line (TDL), whose delay cells are down to a few tens of picoseconds. In this case, FPGA-based TIGs with high delay step are preferable, allowing the implementation of customized particle physics instrumentation and other utilities on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods realizing an integratable TIG is described in detail. A specially designed multiplexer for tap selection is introduced with particular emphasis. The special structure of the multiplexer is devised to minimize the different additional delays caused by the unpredictable routings from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution up to 11 ps and an interval range up to 8 s.

  11. Design of time interval generator based on hybrid counting method

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yuan [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Zhaoqi [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Lu, Houbing [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Hefei Electronic Engineering Institute, Hefei 230037 (China); Chen, Lian [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Jin, Ge, E-mail: goldjin@ustc.edu.cn [State Key Laboratory of Particle Detection and Electronics, University of Science and Technology of China, Hefei, Anhui 230026 (China)

    2016-10-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some “off-the-shelf” TIGs can be employed, the need for a custom test or control system makes TIGs implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on the architecture of the Tapped Delay Line (TDL), whose delay cells are down to a few tens of picoseconds. In this case, FPGA-based TIGs with high delay step are preferable, allowing the implementation of customized particle physics instrumentation and other utilities on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods realizing an integratable TIG is described in detail. A specially designed multiplexer for tap selection is introduced with particular emphasis. The special structure of the multiplexer is devised to minimize the different additional delays caused by the unpredictable routings from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution up to 11 ps and an interval range up to 8 s.
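
    The hybrid idea — a coarse binary counter covering the wide range plus a fine tapped-delay step covering the remainder — can be illustrated arithmetically; the clock frequency below is an assumption, while the 11 ps step is taken from the abstract (Python):

      def hybrid_interval(target_s, clock_hz=250e6, tap_delay_s=11e-12):
          # Coarse part: whole clock periods counted by a plain binary counter.
          period = 1.0 / clock_hz
          coarse_cycles = int(target_s // period)
          # Fine part: the remainder covered by taps of the delay line.
          remainder = target_s - coarse_cycles * period
          fine_taps = round(remainder / tap_delay_s)
          achieved = coarse_cycles * period + fine_taps * tap_delay_s
          return coarse_cycles, fine_taps, achieved

      cycles, taps, achieved = hybrid_interval(1.2345678e-3)
      print(cycles, taps, f"error = {abs(achieved - 1.2345678e-3) * 1e12:.1f} ps")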

  12. Dimensional Stability of Two Polyvinyl Siloxane Impression Materials in Different Time Intervals

    Directory of Open Access Journals (Sweden)

    Aalaei Sh

    2015-12-01

    Full Text Available Statement of the Problem: Dental prostheses are usually made indirectly; therefore dimensional stability of the impression material is very important. Every few years, new impression materials with different manufacturers’ claims regarding their better properties are introduced to the dental market, which requires more research to evaluate their true dimensional changes. Objectives: The aim of this study was to evaluate the dimensional stability of addition silicone impression materials (Panasil® and Affinis®) at different time intervals. Materials and Methods: In this experimental study, using two addition silicones (Panasil® and Affinis®), we made sixty impressions of a standard die under similar conditions of 23 °C and 59% relative humidity using a special tray. The die included three horizontal and two vertical lines that were parallel. The vertical line crossed the horizontal ones at a point that served as the reference for measurement. All impressions were poured with high-strength dental stone. The dimensions were measured by stereo-microscope by two examiners at three storage times (1, 24 and 168 hours). The data were statistically analyzed using t-test and ANOVA. Results: All of the stone casts were larger than the standard die. Dimensional changes of Panasil and Affinis were 0.07%, 0.24%, 0.27% and 0.02%, 0.07%, 0.16% after 1, 24 and 168 hours, respectively. Dimensional change for the two impression materials was not significant over the time intervals, except for Panasil after one week (p = 0.004). Conclusions: Within the limitations of this study, Affinis impressions were dimensionally more stable than Panasil ones, but the difference was not significant. Dimensional change of Panasil impressions showed a statistically significant difference after one week. Dimensional changes of both impression materials were within the ADA standard limit in all time intervals (< 0.5%); therefore, the dimensional stability of these impressions was accepted at least

  13. Automatic, time-interval traffic counts for recreation area management planning

    Science.gov (United States)

    D. L. Erickson; C. J. Liu; H. K. Cordell

    1980-01-01

    Automatic, time-interval recorders were used to count directional vehicular traffic on a multiple entry/exit road network in the Red River Gorge Geological Area, Daniel Boone National Forest. Hourly counts of entering and exiting traffic differed according to recorder location, but an aggregated distribution showed a delayed peak in exiting traffic thought to be...

  14. UNFOLDED REGULAR AND SEMI-REGULAR POLYHEDRA

    Directory of Open Access Journals (Sweden)

    IONIŢĂ Elena

    2015-06-01

    Full Text Available This paper proposes a presentation of the unfolding of regular and semi-regular polyhedra. Regular polyhedra are convex polyhedra whose faces are regular and equal polygons, with the same number of sides, and whose polyhedral angles are also regular and equal. Semi-regular polyhedra are convex polyhedra with regular polygon faces of several types and equal solid angles of the same type. A net of a polyhedron is a collection of edges in the plane which are the unfolded edges of the solid. Modeling and unfolding of Platonic and Archimedean polyhedra will be carried out using the 3dsMAX program. This paper is intended as an example of descriptive geometry applications.

  15. Estimation of sojourn time in chronic disease screening without data on interval cases.

    Science.gov (United States)

    Chen, T H; Kuo, H S; Yen, M F; Lai, M S; Tabar, L; Duffy, S W

    2000-03-01

    Estimation of the sojourn time of the preclinical detectable period in disease screening, or of transition rates for the natural history of chronic disease, usually relies on interval cases (diagnosed between screens). However, to ascertain such cases might be difficult in developing countries due to incomplete registration systems and difficulties in follow-up. To overcome this problem, we propose three Markov models to estimate parameters without using interval cases. A three-state Markov model, a five-state Markov model related to regional lymph node spread, and a five-state Markov model pertaining to tumor size are applied to data on breast cancer screening in female relatives of breast cancer cases in Taiwan. Results based on a three-state Markov model give a mean sojourn time (MST) of 1.90 (95% CI: 1.18-4.86) years for this high-risk group. Validation of these models on the basis of data on breast cancer screening in the age groups 50-59 and 60-69 years from the Swedish Two-County Trial shows the estimates from a three-state Markov model that does not use interval cases are very close to those from previous Markov models taking interval cancers into account. For the five-state Markov model, a reparameterized procedure using auxiliary information on clinically detected cancers is performed to estimate relevant parameters. A good fit of internal and external validation demonstrates the feasibility of using these models to estimate parameters that have previously required interval cancers. This method can be applied to other screening data in which there are no data on interval cases.
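
    Under the usual exponential sojourn-time assumption of the three-state model (normal → preclinical → clinical), the MST is simply the inverse of the preclinical-to-clinical transition rate; a toy check, using an illustrative rate rather than the paper's estimate (Python):

      import numpy as np

      rng = np.random.default_rng(1)
      lam = 1 / 1.90                          # illustrative transition rate out of the preclinical state (per year)
      sojourn = rng.exponential(1 / lam, size=100_000)

      print("mean sojourn time (years):", round(sojourn.mean(), 2))         # ~1.90 = 1/lam
      print("P(still preclinical after 2 y):", round(np.exp(-lam * 2), 3))  # model survival function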

  16. HIV intertest interval among MSM in King County, Washington.

    Science.gov (United States)

    Katz, David A; Dombrowski, Julia C; Swanson, Fred; Buskin, Susan E; Golden, Matthew R; Stekler, Joanne D

    2013-02-01

    The authors examined temporal trends and correlates of HIV testing frequency among men who have sex with men (MSM) in King County, Washington. The authors evaluated data from MSM testing for HIV at the Public Health-Seattle & King County (PHSKC) STD Clinic and Gay City Health Project (GCHP) and testing history data from MSM in PHSKC HIV surveillance. The intertest interval (ITI) was defined as the number of days between the last negative HIV test and the current testing visit or first positive test. Correlates of the log(10)-transformed ITI were determined using generalised estimating equations linear regression. Between 2003 and 2010, the median ITI among MSM seeking HIV testing at the STD Clinic and GCHP were 215 (IQR: 124-409) and 257 (IQR: 148-503) days, respectively. In multivariate analyses, younger age, having only male partners and reporting ≥10 male sex partners in the last year were associated with shorter ITIs at both testing sites (pGCHP attendees, having a regular healthcare provider, seeking a test as part of a regular schedule and inhaled nitrite use in the last year were also associated with shorter ITIs (pGCHP (median 359 vs 255 days, p=0.02). Although MSM in King County appear to be testing at frequent intervals, further efforts are needed to reduce the time that HIV-infected persons are unaware of their status.
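
    The intertest interval and its log10 transform, as defined above, reduce to a few lines; the dates below are illustrative only (Python):

      from datetime import date
      import math

      def intertest_intervals(test_dates):
          # Days between each HIV test and the previous one (the ITI),
          # plus the log10 transform used as the regression outcome.
          ds = sorted(test_dates)
          itis = [(b - a).days for a, b in zip(ds, ds[1:])]
          return itis, [math.log10(d) for d in itis if d > 0]

      dates = [date(2009, 1, 5), date(2009, 8, 10), date(2010, 3, 1)]
      print(intertest_intervals(dates))   # ([217, 203], [~2.34, ~2.31])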

  17. Discrete maximal regularity of time-stepping schemes for fractional evolution equations.

    Science.gov (United States)

    Jin, Bangti; Li, Buyang; Zhou, Zhi

    2018-01-01

    In this work, we establish the maximal [Formula: see text]-regularity for several time stepping schemes for a fractional evolution model, which involves a fractional derivative of order [Formula: see text], [Formula: see text], in time. These schemes include convolution quadratures generated by backward Euler method and second-order backward difference formula, the L1 scheme, explicit Euler method and a fractional variant of the Crank-Nicolson method. The main tools for the analysis include operator-valued Fourier multiplier theorem due to Weis (Math Ann 319:735-758, 2001. doi:10.1007/PL00004457) and its discrete analogue due to Blunck (Stud Math 146:157-176, 2001. doi:10.4064/sm146-2-3). These results generalize the corresponding results for parabolic problems.

  18. Effects of the lateral amplitude and regularity of upper body fluctuation on step time variability evaluated using return map analysis.

    Science.gov (United States)

    Chidori, Kazuhiro; Yamamoto, Yuji

    2017-01-01

    The aim of this study was to evaluate the effects of the lateral amplitude and regularity of upper body fluctuation on step time variability. Return map analysis was used to clarify the relationship between step time variability and a history of falling. Eleven healthy, community-dwelling older adults and twelve younger adults participated in the study. All of the subjects walked 25 m at a comfortable speed. Trunk acceleration was measured using triaxial accelerometers attached to the third lumbar vertebra (L3) and the seventh cervical vertebra (C7). The normalized average magnitude of acceleration, the coefficient of determination (R²) of the return map, and the step time variability were calculated. Cluster analysis using the average fluctuation and the regularity of C7 fluctuation identified four walking patterns in the mediolateral (ML) direction. The participants with higher fluctuation and lower regularity showed significantly greater step time variability compared with the others. Additionally, elderly participants who had fallen in the past year had higher amplitude and a lower regularity of fluctuation during walking. In conclusion, by focusing on the time evolution of each step, it is possible to understand the cause of stride and/or step time variability that is associated with a risk of falls.
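
    One simple way to realize a return-map R², assuming the map is built from successive step times (step n+1 against step n); the construction the authors used may differ in detail (Python):

      import numpy as np

      def return_map_r2(step_times):
          # Coefficient of determination of the linear fit to the return map
          # (step time n+1 plotted against step time n).
          x, y = np.asarray(step_times[:-1]), np.asarray(step_times[1:])
          slope, intercept = np.polyfit(x, y, 1)
          residuals = y - (slope * x + intercept)
          ss_res = np.sum(residuals ** 2)
          ss_tot = np.sum((y - y.mean()) ** 2)
          return 1.0 - ss_res / ss_tot

      steps = 0.55 + 0.02 * np.random.default_rng(0).standard_normal(60)  # toy step times (s)
      print(return_map_r2(steps))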

  19. Infant rats can learn time intervals before the maturation of the striatum: evidence from odor fear conditioning

    Directory of Open Access Journals (Sweden)

    Julie eBoulanger Bertolus

    2014-05-01

    Full Text Available Interval timing refers to the ability to perceive, estimate and discriminate durations in the range of seconds to minutes. Very little is currently known about the ontogeny of interval timing throughout development. On the other hand, even though the neural circuit sustaining interval timing is a matter of debate, the striatum has been suggested to be an important component of the system, and its maturation occurs around the third post-natal week in rats. The global aim of the present study was to investigate interval timing abilities at an age at which the striatum is not yet mature. We used odor fear conditioning, as it can be applied to very young animals. In odor fear conditioning, an odor is presented to the animal and a mild footshock is delivered after a fixed interval. Adult rats have been shown to learn the temporal relationships between the odor and the shock after a few associations. The first aim of the present study was to assess the activity of the striatum during odor fear conditioning using 2-Deoxyglucose autoradiography during development in rats. The data showed that although fear learning was displayed at all tested ages, activation of the striatum was observed in adults but not in juvenile animals. Next, we assessed the presence of evidence of interval timing at ages before and after the inclusion of the striatum into the fear conditioning circuit. We used an experimental setup allowing the simultaneous recording of freezing and respiration, which have been demonstrated to be sensitive to interval timing in adult rats. This enabled the detection of duration-related temporal patterns for freezing and/or respiration curves in infants as young as 12 days post-natal during odor-fear conditioning. This suggests that infants are able to encode time durations as well as, and as quickly as, adults while their striatum is not yet functional. Alternative networks possibly sustaining interval timing in infant rats are discussed.

  20. Dead-time corrections on long-interval measurements of short-lived activities

    International Nuclear Information System (INIS)

    Irfan, M.

    1977-01-01

    A method has been proposed to make correction for counting losses due to dead time where the counting interval is comparable to or larger than the half-life of the activity under investigation. Counts due to background and any long-lived activity present in the source have been taken into consideration. The method is, under certain circumstances, capable of providing a valuable check on the accuracy of the dead time of the counting system. (Auth.)
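
    The size of the effect being corrected can be illustrated with a non-paralysable dead-time model and a decaying source plus constant background integrated over a long counting interval; all numbers are invented for the sketch, and the paper's actual correction formula is not reproduced here (Python):

      import numpy as np
      from scipy.integrate import quad

      half_life = 60.0                 # s
      lam = np.log(2) / half_life
      n0, bkg = 5_000.0, 50.0          # initial true rate and long-lived/background rate (counts/s)
      tau = 5e-6                       # dead time (s)
      T = 120.0                        # counting interval (s), two half-lives long

      true_rate = lambda t: n0 * np.exp(-lam * t) + bkg
      meas_rate = lambda t: true_rate(t) / (1.0 + true_rate(t) * tau)   # non-paralysable losses

      true_counts, _ = quad(true_rate, 0, T)
      obs_counts, _ = quad(meas_rate, 0, T)
      print(f"true {true_counts:.0f}, observed {obs_counts:.0f}, "
            f"loss {100 * (1 - obs_counts / true_counts):.2f}%")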

  1. Robust stability analysis for Markovian jumping interval neural networks with discrete and distributed time-varying delays

    International Nuclear Information System (INIS)

    Balasubramaniam, P.; Lakshmanan, S.; Manivannan, A.

    2012-01-01

    Highlights: ► Robust stability analysis for Markovian jumping interval neural networks is considered. ► Both linear fractional and interval uncertainties are considered. ► A new LKF is constructed with triple integral terms. ► MATLAB LMI control toolbox is used to validate theoretical results. ► Numerical examples are given to illustrate the effectiveness of the proposed method. - Abstract: This paper investigates robust stability analysis for Markovian jumping interval neural networks with discrete and distributed time-varying delays. The parameter uncertainties are assumed to be bounded in given compact sets. The delay is assumed to be time-varying and to belong to a given interval, which means that the lower and upper bounds of interval time-varying delays are available. Based on the new Lyapunov–Krasovskii functional (LKF), some inequality techniques and stochastic stability theory, new delay-dependent stability criteria have been obtained in terms of linear matrix inequalities (LMIs). Finally, two numerical examples are given to illustrate the reduced conservatism and the effectiveness of our theoretical results.

  2. Regularization and renormalization of quantum field theory in curved space-time

    International Nuclear Information System (INIS)

    Bernard, C.; Duncan, A.

    1977-01-01

    It is proposed that field theories quantized in a curved space-time manifold can be conveniently regularized and renormalized with the aid of Pauli-Villars regulator fields. The method avoids the conceptual difficulties of covariant point-separation approaches, by starting always from a manifestly generally covariant action, and the technical limitations of the dimensional regularization approach, which requires solution of the theory in arbitrary dimension in order to go beyond a weak-field expansion. An action is constructed which renormalizes the weak-field perturbation theory of a massive scalar field in two space-time dimensions--it is shown that the trace anomaly previously found in dimensional regularization and some point-separation calculations also arises in perturbation theory when the theory is Pauli-Villars regulated. One then studies a specific solvable two-dimensional model of a massive scalar field in a Robertson-Walker asymptotically flat universe. It is shown that the action previously considered leads, in this model, to a well defined finite expectation value for the stress-energy tensor. The particle production (⟨0 in|θ^{μν}(x,t)|0 in⟩ for t → +∞) is computed explicitly. Finally, the validity of weak-field perturbation theory (in the appropriate range of parameters) is checked directly in the solvable model, and the trace anomaly computed in the asymptotic regions t → ±∞ independently of any weak field approximation. The extension of the model to higher dimensions and the renormalization of interacting (scalar) field theories are briefly discussed.

  3. Explicit Inverse of an Interval Matrix with Unit Midpoint

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří

    2011-01-01

    Roč. 22, - (2011), s. 138-150 E-ISSN 1081-3810 R&D Projects: GA ČR GA201/09/1957; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords : interval matrix * unit midpoint * inverse interval matrix * regularity Subject RIV: BA - General Mathematics Impact factor: 0.808, year: 2010 http://www.math.technion.ac.il/iic/ ela / ela -articles/articles/vol22_pp138-150.pdf

  4. A comparison of systolic time intervals measured by impedance cardiography and carotid pulse tracing

    DEFF Research Database (Denmark)

    Mehlsen, J; Bonde, J; Rehling, Michael

    1990-01-01

    The purpose of this study was to compare the systolic time intervals (STI) obtained by impedance cardiography and by the conventional carotid technique. This comparison was done with respect to: 1) correlations between variables obtained by the two methods, 2) ability to reflect drug-induced chan...

  5. The Effect of Integration Policies on the Time until Regular Employment of Newly Arrived Immigrants:

    DEFF Research Database (Denmark)

    Clausen, Jens; Heinesen, Eskil; Hummelgaard, Hans

    We analyse the effect of active labour-market programmes on the hazard rate into regular employment for newly arrived immigrants using the timing-of-events duration model. We take account of language course participation and progression in destination country language skills. We use rich administrative data from Denmark. We find substantial lock-in effects of participation in active labour-market programmes. Post programme effects on the hazard rate to regular employment are significantly positive for wage subsidy programmes, but not for other types of programmes. For language course participants, improvement in language proficiency has significant and substantial positive effects on the hazard rate to employment.

  6. Regression analysis of case K interval-censored failure time data in the presence of informative censoring.

    Science.gov (United States)

    Wang, Peijie; Zhao, Hui; Sun, Jianguo

    2016-12-01

    Interval-censored failure time data occur in many fields such as demography, economics, medical research, and reliability, and many inference procedures on them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For the problem, an estimated sieve maximum-likelihood approach is proposed for the data arising from the proportional hazards frailty model and, for estimation, a two-step procedure is presented. In addition, the asymptotic properties of the proposed estimators of regression parameters are established and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study. © 2016, The International Biometric Society.

  7. Parameter identification for continuous point emission source based on Tikhonov regularization method coupled with particle swarm optimization algorithm.

    Science.gov (United States)

    Ma, Denglong; Tan, Wei; Zhang, Zaoxiao; Hu, Jun

    2017-03-05

    In order to identify the parameters of a hazardous gas emission source in the atmosphere with little prior information and reliable probability estimation, a hybrid algorithm coupling Tikhonov regularization with particle swarm optimization (PSO) was proposed. When the source location is known, the source strength can be estimated successfully by the common Tikhonov regularization method, but this is invalid when the information about both source strength and location is absent. Therefore, a hybrid method combining linear Tikhonov regularization and the PSO algorithm was designed. With this method, the nonlinear inverse dispersion model was transformed to a linear form under some assumptions, and the source parameters, including source strength and location, were identified simultaneously by the linear Tikhonov-PSO regularization method. The regularization parameters were selected by the L-curve method. The estimation results with different regularization matrices showed that the confidence interval with a high-order regularization matrix is narrower than that with a zero-order regularization matrix, but the estimates of the different source parameters are close to each other with different regularization matrices. A nonlinear Tikhonov-PSO hybrid regularization was also designed with the primary nonlinear dispersion model to estimate the source parameters. The comparison of simulation and experimental cases showed that the linear Tikhonov-PSO method with the transformed linear inverse model has higher computational efficiency than the nonlinear Tikhonov-PSO method. The confidence intervals from linear Tikhonov-PSO are more reasonable than those from the nonlinear method. The estimation results from the linear Tikhonov-PSO method are similar to those from the single PSO algorithm, and a reasonable confidence interval with some probability levels can additionally be given by the Tikhonov-PSO method. Therefore, the presented linear Tikhonov-PSO regularization method is a good potential method for hazardous emission...
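
    The linear Tikhonov step for the known-location case (estimating source strength from sensor readings) reduces to a regularized normal-equations solve; the forward matrix A and data y below are synthetic placeholders, and in practice the regularization parameter would be chosen by the L-curve as described above (Python):

      import numpy as np

      def tikhonov(A, y, alpha, L=None):
          # Solve min ||A x - y||^2 + alpha^2 ||L x||^2 via the normal equations.
          # L = identity corresponds to a zero-order regularization matrix.
          n = A.shape[1]
          L = np.eye(n) if L is None else L
          return np.linalg.solve(A.T @ A + alpha**2 * (L.T @ L), A.T @ y)

      rng = np.random.default_rng(0)
      A = rng.random((20, 5)) @ np.diag([1.0, 0.5, 0.1, 1e-3, 1e-5])   # ill-conditioned toy forward model
      x_true = np.array([2.0, -1.0, 0.5, 1.0, 3.0])
      y = A @ x_true + 1e-3 * rng.standard_normal(20)

      for alpha in (1e-6, 1e-3, 1e-1):
          print(alpha, np.round(tikhonov(A, y, alpha), 3))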

  8. Pre-hospital care time intervals among victims of road traffic injuries in Iran. A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Bigdeli Maryam

    2010-07-01

    Full Text Available Abstract Background Road traffic injuries (RTIs are a major public health problem, requiring concerted efforts both for their prevention and a reduction of their consequences. Timely arrival of the Emergency Medical Service (EMS at the crash scene followed by speedy victim transportation by trained personnel may reduce the RTIs' consequences. The first 60 minutes after injury occurrence - referred to as the "golden hour"- are vital for the saving of lives. The present study was designed to estimate the average of various time intervals occurring during the pre-hospital care process and to examine the differences between these time intervals as regards RTIs on urban and interurban roads. Method A retrospective cross-sectional study was designed and various time intervals in relation to pre-hospital care of RTIs identified in the ambulance dispatch centre in Urmia, Iran from 20 March 2005 to 20 March 2007. All cases which resulted in ambulance dispatches were reviewed and those that had complete data on time intervals were analyzed. Results In total, the cases of 2027 RTI victims were analysed. Of these, 61.5 % of the subjects were injured in city areas. The mean response time for city locations was 5.0 minutes, compared with 10.6 minutes for interurban road locations. The mean on-scene time on the interurban roads was longer than on city roads (9.2 vs. 6.1 minutes, p Conclusion The response, transport and total time intervals among EMS responding to RTI incidents were longer for interurban roads, compared to the city areas. More research should take place on needs-to and access-for EMS on city and interurban roads. The notification interval seems to be a hidden part of the post-crash events and indirectly affects the "golden hour" for victim management and it needs to be measured through the establishment of the surveillance systems.

  9. [Processing acoustically presented time intervals of seconds duration: an expression of the phonological loop of the working memory?].

    Science.gov (United States)

    Grube, D

    1996-01-01

    Working memory has been proposed to contribute to the processing of time, rhythm and music; the question which component of working memory is involved is under discussion. The present study tests the hypothesis that the phonological loop component (Baddeley, 1986) is involved in the processing of auditorily presented time intervals of a few seconds' duration. Typical effects well known with short-term retention of verbal material could be replicated with short-term retention of temporal intervals: The immediate reproduction of time intervals was impaired under conditions of background music and articulatory suppression. Neither the accuracy nor the speed of responses in a (non-phonological) mental rotation task were diminished under these conditions. Processing of auditorily presented time intervals seems to be constrained by the capacity of the phonological loop: The immediate serial recall of sequences of time intervals was shown to be related to the immediate serial recall of words (memory span). The results confirm the notion that working memory resources, and especially the phonological loop component, underlie the processing of auditorily presented temporal information with a duration of a few seconds.

  10. Hawking fluxes and anomalies in rotating regular black holes with a time-delay

    International Nuclear Information System (INIS)

    Takeuchi, Shingo

    2016-01-01

    Based on the anomaly cancellation method we compute the Hawking fluxes (the Hawking thermal flux and the total flux of the energy-momentum tensor) from a four-dimensional rotating regular black hole with a time-delay. To this purpose, for the three metrics proposed in [1], we try to perform the dimensional reduction in which the anomaly cancellation method is feasible in the near-horizon region in a general scalar field theory. As a result we can demonstrate that the dimensional reduction is possible in two of those metrics. Hence we perform the anomaly cancellation method and compute the Hawking fluxes in those two metrics. Our Hawking fluxes involve three effects: (1) the quantum gravity effect regularizing the core of the black holes, (2) the rotation of the black hole, (3) the time-delay. Further, for the metric in which the dimensional reduction could not be performed, we argue that it is a problematic metric and discuss the cause. The Hawking fluxes we compute in this study could be considered to correspond to more realistic Hawking fluxes. Furthermore, which Hawking fluxes can be obtained from the anomaly cancellation method is an interesting question in terms of the relation between the consistency of quantum field theories and black hole thermodynamics. (paper)

  11. A new variable interval schedule with constant hazard rate and finite time range.

    Science.gov (United States)

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2 T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2 T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.

  12. Critical phenomena of regular black holes in anti-de Sitter space-time

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Zhong-Ying [Peking University, Center for High Energy Physics, Beijing (China)

    2017-04-15

    In General Relativity, addressing coupling to a non-linear electromagnetic field, together with a negative cosmological constant, we obtain the general static spherical symmetric black hole solution with magnetic charges, which is asymptotic to anti-de Sitter (AdS) space-times. In particular, for a degenerate case the solution becomes a Hayward-AdS black hole, which is regular everywhere in the full space-time. The existence of such a regular black hole solution preserves the weak energy condition, while the strong energy condition is violated. We then derive the first law and the Smarr formula of the black hole solution. We further discuss its thermodynamic properties and study the critical phenomena in the extended phase space where the cosmological constant is treated as a thermodynamic variable as well as the parameter associated with the non-linear electrodynamics. We obtain many interesting results such as: the Maxwell equal area law in the P-V (or S-T) diagram is violated and consequently the critical point (T*, P*) of the first order small-large black hole transition does not coincide with the inflection point (T_c, P_c) of the isotherms; the Clapeyron equation describing the coexistence curve of the Van der Waals (vdW) fluid is no longer valid; the heat capacity at constant pressure is finite at the critical point; the various exponents near the critical point are also different from those of the vdW fluid. (orig.)

  13. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance and runs based types of descriptors of RR intervals time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to the set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features in both magnitude, with α+ physiological data after shuffling or with a group of symmetric synthetic time series.

  14. Method to measure autonomic control of cardiac function using time interval parameters from impedance cardiography

    International Nuclear Information System (INIS)

    Meijer, Jan H; Boesveldt, Sanne; Elbertse, Eskeline; Berendse, H W

    2008-01-01

    The time difference between the electrocardiogram and impedance cardiogram can be considered as a measure for the time delay between the electrical and mechanical activities of the heart. This time interval, characterized by the pre-ejection period (PEP), is related to the sympathetic autonomous nervous control of cardiac activity. PEP, however, is difficult to measure in practice. Therefore, a novel parameter, the initial systolic time interval (ISTI), is introduced to provide a more practical measure. The use of ISTI instead of PEP was evaluated in three groups: young healthy subjects, patients with Parkinson's disease, and a group of elderly, healthy subjects of comparable age. PEP and ISTI were studied under two conditions: at rest and after an exercise stimulus. Under both conditions, PEP and ISTI behaved largely similarly in the three groups and were significantly correlated. It is concluded that ISTI can be used as a substitute for PEP and, therefore, to evaluate autonomic neuropathy both in clinical and extramural settings. Measurement of ISTI can also be used to non-invasively monitor the electromechanical cardiac time interval, and the associated autonomic activity, under physiological circumstances
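
    For illustration, treating the ISTI as the interval from each ECG R peak to the next peak of the impedance cardiogram (dZ/dt, the C point) — an assumed reading of the definition above — the per-beat computation is straightforward; the fiducial times below are toy values (Python):

      import numpy as np

      def isti_per_beat(r_peaks_s, c_points_s):
          # Interval from each ECG R peak to the next dZ/dt peak (C point).
          c_points_s = np.asarray(c_points_s, dtype=float)
          intervals = []
          for r in r_peaks_s:
              later = c_points_s[c_points_s > r]
              if later.size:
                  intervals.append(later[0] - r)
          return np.array(intervals)

      r_peaks = [0.40, 1.38, 2.35]    # toy fiducial times in seconds
      c_points = [0.55, 1.54, 2.50]
      print(isti_per_beat(r_peaks, c_points))   # [0.15 0.16 0.15]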

  15. Incremental projection approach of regularization for inverse problems

    Energy Technology Data Exchange (ETDEWEB)

    Souopgui, Innocent, E-mail: innocent.souopgui@usm.edu [The University of Southern Mississippi, Department of Marine Science (United States); Ngodock, Hans E., E-mail: hans.ngodock@nrlssc.navy.mil [Naval Research Laboratory (United States); Vidard, Arthur, E-mail: arthur.vidard@imag.fr; Le Dimet, François-Xavier, E-mail: ledimet@imag.fr [Laboratoire Jean Kuntzmann (France)

    2016-10-15

    This paper presents an alternative approach to the regularized least squares solution of ill-posed inverse problems. Instead of solving a minimization problem with an objective function composed of a data term and a regularization term, the regularization information is used to define a projection onto a convex subspace of regularized candidate solutions. The objective function is modified to include the projection of each iterate in the place of the regularization. Numerical experiments based on the problem of motion estimation for geophysical fluid images show the improvement of the proposed method compared with regularization methods. For the presented test case, the incremental projection method uses 7 times less computation time than the regularization method to reach the same error target. Moreover, at convergence, the incremental projection is two orders of magnitude more accurate than the regularization method.
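
    A minimal stand-in for the projection idea — replacing the regularization term with a projection of each iterate onto a convex set of admissible solutions — using projected gradient descent on a toy least-squares problem; the paper's projection operator and data term are more elaborate (Python):

      import numpy as np

      def projected_gradient(A, y, radius, iters=200):
          # Minimize 0.5 * ||A x - y||^2 subject to ||x|| <= radius.
          step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(iters):
              x = x - step * (A.T @ (A @ x - y))      # gradient step on the data term
              nrm = np.linalg.norm(x)
              if nrm > radius:                        # projection onto the convex ball
                  x *= radius / nrm
          return x

      rng = np.random.default_rng(0)
      A = rng.standard_normal((30, 10))
      x_true = rng.standard_normal(10)
      y = A @ x_true + 0.05 * rng.standard_normal(30)
      x_hat = projected_gradient(A, y, radius=np.linalg.norm(x_true))
      print(np.linalg.norm(x_hat - x_true))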

  16. The existence and regularity of time-periodic solutions to the three-dimensional Navier–Stokes equations in the whole space

    International Nuclear Information System (INIS)

    Kyed, Mads

    2014-01-01

    The existence, uniqueness and regularity of time-periodic solutions to the Navier–Stokes equations in the three-dimensional whole space are investigated. We consider the Navier–Stokes equations with a non-zero drift term corresponding to the physical model of a fluid flow around a body that moves with a non-zero constant velocity. The existence of a strong time-periodic solution is shown for small time-periodic data. It is further shown that this solution is unique in a large class of weak solutions that can be considered physically reasonable. Finally, we establish regularity properties for any strong solution regardless of its size. (paper)

  17. The Regularity and Irregularity of Travel: an Analysis of the Consistency of Travel Times Associated with Subsistence, Maintenance and Discretionary Activities

    OpenAIRE

    Longden, Thomas

    2016-01-01

    Regular and irregular travel patterns coincide with different underlying purposes of travel and days of the week. Within this paper, it is shown that the balance between subsistence (i.e. work) and discretionary (i.e. leisure) activities is related to differences in travel patterns and explains consistency across years. Using eight years of time use diary entries this paper finds that travel time related to subsistence activities tends to be regular and stable. In contrast, travel time associ...

  18. Corticostriatal field potentials are modulated at delta and theta frequencies during interval-timing task in rodents

    Directory of Open Access Journals (Sweden)

    Eric B Emmons

    2016-04-01

    Organizing movements in time is a critical and highly conserved feature of mammalian behavior. Temporal control of action requires corticostriatal networks. We investigated these networks in rodents using a two-interval timing task while recording local field potentials in the medial frontal cortex or dorsomedial striatum. Consistent with prior work, we found cue-triggered delta (1-4 Hz) and theta (4-8 Hz) activity primarily in the rodent medial frontal cortex. We observed delta activity across temporal intervals in the medial frontal cortex and dorsomedial striatum. Rewarded responses were associated with increased delta activity in the medial frontal cortex. Activity in theta bands in the medial frontal cortex and in delta bands in the striatum was linked with the timing of responses. These data suggest that both delta and theta activity in frontostriatal networks are modulated during interval timing and that activity in these bands may be involved in the temporal control of action.

  19. Pre-hospital care time intervals among victims of road traffic injuries in Iran. A cross-sectional study.

    Science.gov (United States)

    Bigdeli, Maryam; Khorasani-Zavareh, Davoud; Mohammadi, Reza

    2010-07-09

    Road traffic injuries (RTIs) are a major public health problem, requiring concerted efforts both for their prevention and a reduction of their consequences. Timely arrival of the Emergency Medical Service (EMS) at the crash scene, followed by speedy victim transportation by trained personnel, may reduce the consequences of RTIs. The first 60 minutes after injury occurrence, referred to as the "golden hour", are vital for the saving of lives. The present study was designed to estimate the average of the various time intervals occurring during the pre-hospital care process and to examine the differences between these time intervals for RTIs on urban and interurban roads. A retrospective cross-sectional study was designed, and the various time intervals related to pre-hospital care of RTIs were identified in the ambulance dispatch centre in Urmia, Iran, from 20 March 2005 to 20 March 2007. All cases which resulted in ambulance dispatches were reviewed and those that had complete data on time intervals were analyzed. In total, the cases of 2027 RTI victims were analysed. Of these, 61.5% of the subjects were injured in city areas. The mean response time for city locations was 5.0 minutes, compared with 10.6 minutes for interurban road locations. The mean on-scene time on the interurban roads was significantly longer than on city roads (9.2 vs. 6.1 minutes), and transport times from the scene to the hospital were also significantly longer for interurban incidents (17.1 vs. 6.3 minutes). Overall, the on-scene, transport and total time intervals among EMS responding to RTI incidents were longer for interurban roads compared to the city areas. More research is needed on the needs for, and access to, EMS on city and interurban roads. The notification interval seems to be a hidden part of the post-crash events, indirectly affects the "golden hour" for victim management, and needs to be measured through the establishment of surveillance systems.

  20. Particle dynamics around time conformal regular black holes via Noether symmetries

    Science.gov (United States)

    Jawad, Abdul; Umair Shahzad, M.

    The time conformal regular black hole (RBH) solutions admitting the time conformal factor e^{εg(t)}, where g(t) is an arbitrary function of time and ε is the perturbation parameter, are considered. The approximate Noether symmetry technique is used to find the function g(t), which turns out to be linear in time, g(t) = t/α with α a constant. The dynamics of particles around the RBHs is then discussed through the symmetry generators, which provide the approximate energy and angular momentum of the particles. In addition, we analyze the motion of neutral and charged particles around two well-known RBHs, the charged RBH obtained using the Fermi-Dirac distribution and the Kehagias-Sfetsos asymptotically flat RBH. We obtain the innermost stable circular orbit and the corresponding approximate energy and angular momentum. The behavior of the effective potential, the effective force and the escape velocity of the particles in the presence/absence of a magnetic field, for different values of angular momentum near the horizons, is also analyzed. The stable and unstable regions of particles near the horizons due to the effect of angular momentum and the magnetic field are also explained.

  1. Robust stability of interval bidirectional associative memory neural network with time delays.

    Science.gov (United States)

    Liao, Xiaofeng; Wong, Kwok-wo

    2004-04-01

    In this paper, the conventional bidirectional associative memory (BAM) neural network with signal transmission delay is intervalized in order to study the bounded effect of deviations in network parameters and external perturbations. The resultant model is referred to as a novel interval dynamic BAM (IDBAM) model. By combining a number of different Lyapunov functionals with the Razumikhin technique, some sufficient conditions for the existence of unique equilibrium and robust stability are derived. These results are fairly general and can be verified easily. To go further, we extend our investigation to the time-varying delay case. Some robust stability criteria for BAM with perturbations of time-varying delays are derived. Besides, our approach for the analysis allows us to consider several different types of activation functions, including piecewise linear sigmoids with bounded activations as well as the usual C1-smooth sigmoids. We believe that the results obtained have leading significance in the design and application of BAM neural networks.

  2. 75 FR 53966 - Regular Meeting

    Science.gov (United States)

    2010-09-02

    ... FARM CREDIT SYSTEM INSURANCE CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). DATE AND TIME: The meeting of the Board will be held at the offices of the Farm...

  3. Self-produced Time Intervals Are Perceived as More Variable and/or Shorter Depending on Temporal Context in Subsecond and Suprasecond Ranges

    Directory of Open Access Journals (Sweden)

    Keita eMitani

    2016-06-01

    The processing of time intervals is fundamental for sensorimotor and cognitive functions. Perceptual and motor timing are often performed concurrently (e.g., playing a musical instrument). Although previous studies have shown the influence of body movements on time perception, how we perceive self-produced time intervals has remained unclear. Furthermore, it has been suggested that the timing mechanisms are distinct for the sub- and suprasecond ranges. Here, we compared perceptual performances for self-produced and passively presented time intervals in random contexts (i.e., multiple target intervals presented in a session) across the sub- and suprasecond ranges (Experiment 1) and within the sub- (Experiment 2) and suprasecond (Experiment 3) ranges, and in a constant context (i.e., a single target interval presented in a session) in the sub- and suprasecond ranges (Experiment 4). We show that self-produced time intervals were perceived as shorter and more variable across the sub- and suprasecond ranges and within the suprasecond range, but not within the subsecond range, in a random context. In a constant context, the self-produced time intervals were perceived as more variable in the suprasecond range but not in the subsecond range. The impairing effects indicate that motor timing interferes with perceptual timing. The dependence of impairment on temporal contexts suggests multiple timing mechanisms for the subsecond and suprasecond ranges. In addition, violation of the scalar property (i.e., a constant variability to target interval ratio) was observed between the sub- and suprasecond ranges. The violation was clearer for motor timing than for perceptual timing. This suggests that the multiple timing mechanisms for the sub- and suprasecond ranges overlap more for perception than for motor timing. Moreover, the central tendency effect (i.e., where shorter base intervals are overestimated and longer base intervals are underestimated) disappeared with subsecond

  4. Data warehousing technologies for large-scale and right-time data

    DEFF Research Database (Denmark)

    Xiufeng, Liu

    heterogeneous sources into a central data warehouse (DW) by Extract-Transform-Load (ETL) at regular time intervals, e.g., monthly, weekly, or daily. This now becomes challenging for large-scale data, and it is hard to meet the demands of near-real-time/right-time business decisions. This thesis considers some

  5. Discrete time interval measurement system: fundamentals, resolution and errors in the measurement of angular vibrations

    International Nuclear Information System (INIS)

    Gómez de León, F C; Meroño Pérez, P A

    2010-01-01

    The traditional method for measuring the velocity and the angular vibration in the shaft of rotating machines using incremental encoders is based on counting the pulses at given time intervals. This method is generically called the time interval measurement system (TIMS). A variant of this method that we have developed in this work consists of measuring the corresponding time of each pulse from the encoder and sampling the signal by means of an A/D converter as if it were an analog signal, that is to say, in discrete time. For this reason, we have named this method the discrete time interval measurement system (DTIMS). This measurement system provides a substantial improvement in precision and frequency resolution compared with the traditional method of counting pulses. In addition, this method permits modification of the width of some pulses in order to obtain a mark-phase on every lap. This paper explains the theoretical fundamentals of the DTIMS and its application for measuring the angular vibrations of rotating machines. It also displays the required relationship between the sampling rate of the signal, the number of pulses of the encoder and the rotating velocity in order to obtain the required resolution and to delimit the methodological errors in the measurement.
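
    The difference between the two measurement principles can be illustrated with a small simulation (assumed encoder resolution, shaft speed and vibration; not the authors' system): the traditional approach counts pulses in fixed windows, while the discrete-time approach converts every measured pulse time into one velocity sample.

```python
import numpy as np

PULSES_PER_REV = 1024          # assumed incremental-encoder resolution

def omega_from_counts(pulse_times, window):
    """Traditional TIMS: count pulses in fixed windows of length `window` (s)."""
    edges = np.arange(0.0, pulse_times[-1], window)
    counts, _ = np.histogram(pulse_times, bins=edges)
    return edges[:-1], 2.0 * np.pi * counts / (PULSES_PER_REV * window)

def omega_from_pulse_times(pulse_times):
    """DTIMS idea: use the measured time of every pulse; one velocity sample per encoder interval."""
    dt = np.diff(pulse_times)                      # time between consecutive pulses
    return pulse_times[1:], 2.0 * np.pi / (PULSES_PER_REV * dt)

# Simulated shaft: 25 Hz rotation with a small 50 Hz angular vibration superimposed.
t_dense = np.linspace(0.0, 1.0, 500_000)
theta = 2 * np.pi * 25 * t_dense + 0.01 * np.sin(2 * np.pi * 50 * t_dense)   # shaft angle (rad)
pulse_angles = np.arange(0.0, theta[-1], 2 * np.pi / PULSES_PER_REV)
pulse_times = np.interp(pulse_angles, theta, t_dense)   # time at which each encoder pulse occurs

t_c, w_c = omega_from_counts(pulse_times, window=0.01)
t_p, w_p = omega_from_pulse_times(pulse_times)
print("counting method  :", len(w_c), "velocity samples per second")
print("per-pulse method :", len(w_p), "velocity samples per second (much finer time resolution)")
```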

  6. The Acute Effects of Interval-Type Exercise on Glycemic Control in Type 2 Diabetes Subjects: Importance of Interval Length. A Controlled, Counterbalanced, Crossover Study.

    Directory of Open Access Journals (Sweden)

    Ida Jakobsen

    Interval-type exercise is effective for improving glycemic control, but the optimal approach is unknown. The purpose of this study was to determine the importance of the interval length on changes in postprandial glycemic control following a single exercise bout. Twelve subjects with type 2 diabetes completed a cross-over study with three 1-hour interventions performed in a non-randomized but counter-balanced order: 1) interval walking consisting of repeated cycles of 3 min slow (aiming for 54% of peak oxygen consumption rate [VO2peak]) and 3 min fast (aiming for 89% of VO2peak) walking (IW3); 2) interval walking consisting of repeated cycles of 1 min slow and 1 min fast walking (IW1); and 3) no walking (CON). The exercise interventions were matched with regard to walking speed, and VO2 and heart rate were assessed throughout all interventions. A 4-hour liquid mixed meal tolerance test (MMTT) commenced 30 min after each intervention, with blood samples taken regularly. IW3 and IW1 resulted in comparable mean VO2 and heart rates. Overall mean postprandial blood glucose levels were lower after IW3 compared to CON (10.3±3.0 vs. 11.1±3.3 mmol/L), whereas overall levels after IW1 did not differ significantly from CON or IW3 (P > 0.05 for both). Conversely, blood glucose levels at specific time points during the MMTT differed significantly following both IW3 and IW1 as compared to CON. Our findings support the previously found blood glucose lowering effect of IW3 and suggest that reducing the interval length, while keeping the walking speed and the time spent on fast and slow walking constant, does not result in additional improvements. ClinicalTrials.gov NCT02257190.

  7. Tonic and Phasic Dopamine Fluctuations as Reflected in Beta-power Predict Interval Timing Behavior

    NARCIS (Netherlands)

    Kononowicz, Tadeusz; van Rijn, Hedderik

    It has been repeatedly shown that dopamine impacts interval timing in humans and animals (for a review, see Coull, Cheng, & Meck, 2012). Particularly, administration of dopamine agonists or antagonists speeds-up or slows down internal passage of time, respectively (Meck, 1996). This co-variations in

  8. Determination and identification of naturally occurring decay series using milli-second order pulse time interval analysis (TIA)

    International Nuclear Information System (INIS)

    Hashimoto, T.; Sanada, Y.; Uezu, Y.

    2003-01-01

    A delayed coincidence method, called the time interval analysis (TIA) method, has been successfully applied to the selective determination of correlated α-α decay events with millisecond-order lifetimes. The main decay process applicable to TIA treatment is ²²⁰Rn → ²¹⁶Po (T₁/₂ = 145 ms) in the thorium decay series. TIA is fundamentally based on the difference in time-interval distribution between correlated decay events and non-correlated events, such as background or random events, when the time-interval data are compiled within a fixed time window (for example, a tenth of the half-life concerned). The sensitivity of the TIA analysis of correlated α-α decay events could subsequently be improved with respect to background elimination by using the pulse shape discrimination technique (PSD with a PERALS counter) to reject β/γ pulses, purging nitrogen gas into the extra scintillator, and applying solvent extraction of Ra. (author)
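
    The principle can be mimicked with a short simulation (assumed rates and measurement time; not the authors' procedure): correlated ²²⁰Rn-²¹⁶Po pairs produce an excess of short inter-event intervals over the exponential distribution expected from uncorrelated counts.

```python
import numpy as np

rng = np.random.default_rng(1)
T_HALF = 0.145                    # s, half-life of Po-216
MEAS_TIME = 3600.0                # s, assumed measurement time
BKG_RATE = 0.5                    # counts/s, assumed uncorrelated (random/background) rate
N_PAIRS = 500                     # assumed number of Rn-220 -> Po-216 decay pairs

# Uncorrelated events (Poisson process) plus correlated parent/daughter pairs.
bkg = rng.uniform(0.0, MEAS_TIME, rng.poisson(BKG_RATE * MEAS_TIME))
parents = rng.uniform(0.0, MEAS_TIME, N_PAIRS)
daughters = parents + rng.exponential(T_HALF / np.log(2), N_PAIRS)
events = np.sort(np.concatenate([bkg, parents, daughters]))

# Time-interval analysis: intervals between successive events vs. the pure-random expectation.
intervals = np.diff(events)
rate = len(events) / MEAS_TIME
cut = 0.3                         # s, interval window of roughly two half-lives
observed = np.sum(intervals < cut)
expected = len(intervals) * (1.0 - np.exp(-rate * cut))   # expectation for a purely random train
print(f"intervals shorter than {cut} s: observed {observed}, expected from random events {expected:.0f}")
```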

  9. Usability of a new multiple high-speed pulse time data registration, processing and real-time display system for pulse time interval analysis

    International Nuclear Information System (INIS)

    Yawata, Takashi; Sakaue, Hisanobu; Hashimoto, Tetsuo; Itou, Shigeki

    2006-01-01

    A new high-speed multiple pulse time data registration, processing and real-time display system for time interval analysis (TIA) was developed for counting either β-α or α-α correlated decay events. The TIA method had so far been limited to the selective extraction of successive α-α decay events within the millisecond time scale, owing to the use of the original electronic hardware. In the present pulse-processing system, three different high-speed α/β(γ) pulses can be fed quickly to an original 32-bit PCI board (ZN-HTS2) within 1 μs. This PCI board consists of a timing-control IC (HTS-A) and a 28-bit counting IC (HTS-B). All channel and pulse time data are stored in a FIFO RAM and then transferred into temporary CPU RAM (32 MB) by DMA. Both data registration (into main RAM, 200 MB) and the calculation of pulse time intervals, together with a real-time TIA-distribution display, are processed simultaneously using two dedicated software programs. The system has proven able to display the TIA distribution spectrum in real time even when pulses at 1.6×10⁵ cps from a pulse generator were fed to it. By using this new system combined with a liquid scintillation counting (LSC) apparatus, both natural microsecond-order β-α correlated decay events and millisecond-order α-α correlated decay events could be selectively extracted from a mixture of natural radionuclides. (author)

  10. Efficient Estimation for Diffusions Sampled at High Frequency Over a Fixed Time Interval

    DEFF Research Database (Denmark)

    Jakobsen, Nina Munkholt; Sørensen, Michael

    Parametric estimation for diffusion processes is considered for high frequency observations over a fixed time interval. The processes solve stochastic differential equations with an unknown parameter in the diffusion coefficient. We find easily verified conditions on approximate martingale...

  11. The geometric $\beta$-function in curved space-time under operator regularization

    OpenAIRE

    Agarwala, Susama

    2009-01-01

    In this paper, I compare the generators of the renormalization group flow, or the geometric $\beta$-functions, for dimensional regularization and operator regularization. I then extend the analysis to show that the geometric $\beta$-function for a scalar field theory on a closed compact Riemannian manifold is defined on the entire manifold. I then extend the analysis to find the generator of the renormalization group flow for conformal scalar-field theories on the same manifolds. The geometr...

  12. Optimization of Allowed Outage Time and Surveillance Test Intervals

    Energy Technology Data Exchange (ETDEWEB)

    Al-Dheeb, Mujahed; Kang, Sunkoo; Kim, Jonghyun [KEPCO international nuclear graduate school, Ulsan (Korea, Republic of)

    2015-10-15

    The primary purpose of surveillance testing is to assure that the components of standby safety systems will be operable when they are needed in an accident. By testing these components, failures can be detected that may have occurred since the last test or the time when the equipment was last known to be operational. The probability that a system or system component performs a specified function or mission under given conditions at a prescribed time is called availability (A). Unavailability (U), as a risk measure, is simply the complementary probability to A(t); an increase in U means the risk is increased as well. The allowed outage time (D) and the surveillance test interval (T) have an important impact on component, or system, unavailability. The extension of D lengthens the maintenance duration distributions for at-power operations, which in turn increases the unavailability due to maintenance in the systems analysis. As for T, overly frequent surveillances can result in high system unavailability, because the system may be taken out of service often due to the surveillance itself and due to the repair of test-caused failures of the component; the test-caused failures include those incurred by wear and tear of the component due to the surveillances. On the other hand, as the surveillance interval increases, the component's unavailability will grow because of increased occurrences of time-dependent random failures. In that situation, the component cannot be relied upon, and accordingly the system unavailability will increase. Thus, there should be an optimal component surveillance interval in terms of the corresponding system availability. This paper aims at finding the optimal T and D which result in minimum unavailability, which in turn reduces the risk. The methodology in Section 2 is applied to find the optimal values of T and D for two components, i.e., the safety injection pump (SIP) and the turbine-driven auxiliary feedwater pump (TDAFP). Section 4 addresses the interaction between D and T. In general
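
    A common textbook-style approximation (an illustrative sketch, not the paper's methodology) shows why an optimal T exists: time-dependent random failures contribute roughly λT/2 to the average unavailability, while testing itself contributes a per-test downtime term τ/T, so the minimum lies near T* = sqrt(2τ/λ). The rate and downtime values below are assumptions.

```python
import numpy as np

lam = 1.0e-5      # assumed standby failure rate per hour
tau = 2.0         # assumed unavailable hours per test (test duration / test-caused outage)

def mean_unavailability(T, lam=lam, tau=tau):
    """Simple averaged unavailability model: random-failure term + test-downtime term."""
    return lam * T / 2.0 + tau / T

T_grid = np.linspace(100.0, 10000.0, 1000)          # candidate test intervals, hours
T_opt_numeric = T_grid[np.argmin(mean_unavailability(T_grid))]
T_opt_analytic = np.sqrt(2.0 * tau / lam)

print(f"numeric optimum  : {T_opt_numeric:7.0f} h, U = {mean_unavailability(T_opt_numeric):.2e}")
print(f"analytic optimum : {T_opt_analytic:7.0f} h (sqrt(2*tau/lambda))")
```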

  13. Optimization of Allowed Outage Time and Surveillance Test Intervals

    International Nuclear Information System (INIS)

    Al-Dheeb, Mujahed; Kang, Sunkoo; Kim, Jonghyun

    2015-01-01

    The primary purpose of surveillance testing is to assure that the components of standby safety systems will be operable when they are needed in an accident. By testing these components, failures can be detected that may have occurred since the last test or the time when the equipment was last known to be operational. The probability that a system or system component performs a specified function or mission under given conditions at a prescribed time is called availability (A). Unavailability (U), as a risk measure, is simply the complementary probability to A(t); an increase in U means the risk is increased as well. The allowed outage time (D) and the surveillance test interval (T) have an important impact on component, or system, unavailability. The extension of D lengthens the maintenance duration distributions for at-power operations, which in turn increases the unavailability due to maintenance in the systems analysis. As for T, overly frequent surveillances can result in high system unavailability, because the system may be taken out of service often due to the surveillance itself and due to the repair of test-caused failures of the component; the test-caused failures include those incurred by wear and tear of the component due to the surveillances. On the other hand, as the surveillance interval increases, the component's unavailability will grow because of increased occurrences of time-dependent random failures. In that situation, the component cannot be relied upon, and accordingly the system unavailability will increase. Thus, there should be an optimal component surveillance interval in terms of the corresponding system availability. This paper aims at finding the optimal T and D which result in minimum unavailability, which in turn reduces the risk. The methodology in Section 2 is applied to find the optimal values of T and D for two components, i.e., the safety injection pump (SIP) and the turbine-driven auxiliary feedwater pump (TDAFP). Section 4 addresses the interaction between D and T. In general

  14. Electric power demand forecasting using interval time series. A comparison between VAR and iMLP

    International Nuclear Information System (INIS)

    Garcia-Ascanio, Carolina; Mate, Carlos

    2010-01-01

    Electric power demand forecasts play an essential role in the electric industry, as they provide the basis for making decisions in power system planning and operation. A great variety of mathematical methods have been used for demand forecasting. The development and improvement of appropriate mathematical tools will lead to more accurate demand forecasting techniques. In order to forecast the monthly electric power demand per hour in Spain for 2 years, this paper presents a comparison between a new forecasting approach considering vector autoregressive (VAR) forecasting models applied to interval time series (ITS) and the iMLP, the multi-layer perceptron model adapted to interval data. In the proposed comparison, for the VAR approach two models are fitted for every hour: one composed of the centre (mid-point) and radius (half-range), and another one of the lower and upper bounds, according to the interval representation assumed by the ITS in the learning set. In the case of the iMLP, only the model composed of the centre and radius is fitted; the other interval representation, composed of the lower and upper bounds, is obtained from the linear combination of the two. This novel approach, obtaining two bivariate models for each hour, makes it possible to establish, for different periods in the day, which interval representation is more accurate. Furthermore, the comparison between two different techniques adapted to interval time series allows us to determine the efficiency of these models in forecasting electric power demand. It is important to note that the iMLP technique has been selected for the comparison as it has shown its accuracy in forecasting daily electricity price intervals. This work shows ITS forecasting methods as a potential tool that will lead to a reduction in risk when making power system planning and operational decisions. (author)
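
    The two interval representations mentioned above can be sketched as follows (assumed demand data and a naive bivariate AR(1) fit in place of the paper's VAR and iMLP models): convert lower/upper bounds to centre/radius and produce a one-step-ahead interval forecast.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed hourly demand intervals [lower, upper] in MW for 200 days at a fixed hour.
centre = 30000 + 2000 * np.sin(np.linspace(0, 8 * np.pi, 200)) + rng.normal(0, 300, 200)
radius = 1500 + rng.normal(0, 100, 200)
lower, upper = centre - radius, centre + radius

def bounds_to_centre_radius(lower, upper):
    """Switch from the (lower, upper) representation to the (centre, radius) representation."""
    return (lower + upper) / 2.0, (upper - lower) / 2.0

def fit_ar1(series):
    """Least-squares AR(1) with intercept for a bivariate series: x_t = c + A x_{t-1}."""
    X = np.column_stack([np.ones(len(series) - 1), series[:-1]])
    coef, *_ = np.linalg.lstsq(X, series[1:], rcond=None)
    return coef                     # shape (3, 2): intercept row plus 2x2 transition matrix

series = np.column_stack(bounds_to_centre_radius(lower, upper))
coef = fit_ar1(series)
centre_next, radius_next = np.concatenate([[1.0], series[-1]]) @ coef
print(f"forecast interval: [{centre_next - radius_next:.0f}, {centre_next + radius_next:.0f}] MW")
```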

  15. Assessing cardiac preload by the Initial Systolic Time Interval obtained from impedance cardiography

    Directory of Open Access Journals (Sweden)

    Jan H Meijer

    2010-01-01

    The Initial Systolic Time Interval (ISTI), obtained from the electrocardiogram (ECG) and impedance cardiogram (ICG), is considered to be a measure of the time delay between the electrical and mechanical activity of the heart and reflects an early active period of the cardiac cycle. The clinical relevance of this time interval is the subject of study. This paper presents preliminary results of a pilot study investigating the use of ISTI in evaluating and predicting the circulatory response to fluid administration in patients after coronary artery bypass graft surgery, by comparing ISTI with cardiac output (CO) responsiveness. The use of the pulse transit time (PTT), earlier recommended for this purpose, is also investigated. The results show an inverse relationship between ISTI and CO at all moments of fluid administration and also an inverse relationship between the changes ΔISTI and ΔCO before and after full fluid administration. No relationships between PTT and CO or ΔPTT and ΔCO were found. It is concluded that ISTI is dependent upon preload, and that ISTI has the potential to be used as a clinical parameter for assessing preload.

  16. Evaluation of the down-motion time interval of molten materials to the core catcher during core disruptive accidents postulated in LMFR

    International Nuclear Information System (INIS)

    Voronov, S.A.; Kiryushin, A.I.; Kuzavkov, N.G.; Vlasichev, G.N.

    1994-01-01

    Hypothetical core disruptive accidents are postulated to clarify the potential of a reactor plant to withstand extreme conditions and to generate measures for the management and mitigation of accident consequences. In Russian advanced reactors there is a core catcher below the diagrid to prevent vessel bottom melting and to localize fuel debris. In this paper, the calculation technique and the estimation of the relocation time of molten fuel and materials are presented for the core disruptive accidents postulated for an LMFR reactor. To evaluate the minimum fuel relocation time, calculations for different initial data are provided. The large mass of material between the core and the catcher in an LMFR reactor hinders the relocation of molten materials toward the vessel bottom, which increases the time needed for molten fuel to reach the core catcher. The computations performed allowed evaluation of the minimum relocation time of molten materials from the core to the core catcher. This time interval is in the range of 3.5-5.5 hours. (author)

  17. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process. There is therefore a continuous need to improve the monitoring process through the use of new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours, and for generating the necessary alerts during the workflow monitoring of such systems. Interval-based or period-based theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals. There have also been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the relationships between such intervals grow exponentially. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states as well as inferring them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between similar large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory
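
    For reference, Allen's 13 relations between two intervals can be enumerated directly; the helper below is a hypothetical sketch, not the RISMA implementation.

```python
def allen_relation(a, b):
    """Return Allen's relation between intervals a=(a1,a2) and b=(b1,b2), with a1<a2 and b1<b2.
    The 13 relations: before/after, meets/met-by, overlaps/overlapped-by, starts/started-by,
    during/contains, finishes/finished-by, equals."""
    a1, a2 = a
    b1, b2 = b
    if a2 < b1:                return "before"
    if b2 < a1:                return "after"
    if a2 == b1:               return "meets"
    if b2 == a1:               return "met-by"
    if a1 == b1 and a2 == b2:  return "equals"
    if a1 == b1:               return "starts" if a2 < b2 else "started-by"
    if a2 == b2:               return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:    return "during"
    if a1 < b1 and b2 < a2:    return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"   # remaining cases: proper overlap

print(allen_relation((0, 5), (5, 9)))   # meets
print(allen_relation((1, 4), (0, 9)))   # during
print(allen_relation((0, 6), (3, 9)))   # overlaps
```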

  18. Robust L2-L∞ Filtering of Time-Delay Jump Systems with Respect to the Finite-Time Interval

    Directory of Open Access Journals (Sweden)

    Shuping He

    2011-01-01

    This paper studies the problem of stochastic finite-time boundedness and disturbance attenuation for a class of linear time-delayed systems with Markov jumping parameters. Sufficient conditions are provided to solve this problem. The L2-L∞ filters are, respectively, designed for time-delayed Markov jump linear systems with/without uncertain parameters such that the resulting filtering error dynamic system is stochastically finite-time bounded and has the finite-time interval disturbance attenuation γ for all admissible uncertainties, time delays, and unknown disturbances. By using a stochastic Lyapunov-Krasovskii functional approach, it is shown that the filter design problem can be expressed in terms of the solutions of a set of coupled linear matrix inequalities. Simulation examples are included to demonstrate the potential of the proposed results.

  19. Incidence of Interval Colorectal Cancer Among Inflammatory Bowel Disease Patients Undergoing Regular Colonoscopic Surveillance

    NARCIS (Netherlands)

    Mooiweer, Erik; van der Meulen-de Jong, Andrea E.; Ponsioen, Cyriel Y.; van der Woude, C. Janneke; van Bodegraven, Ad A.; Jansen, Jeroen M.; Mahmmod, Nofel; Kremer, Willemijn; Siersema, Peter D.; Oldenburg, Bas

    2015-01-01

    Surveillance is recommended for patients with long-term inflammatory bowel disease because they have an increased risk of colorectal cancer (CRC). To study the effectiveness of surveillance, we determined the incidence of CRC after negative findings from surveillance colonoscopies (interval CRC).

  20. Improved Criteria on Delay-Dependent Stability for Discrete-Time Neural Networks with Interval Time-Varying Delays

    Directory of Open Access Journals (Sweden)

    O. M. Kwon

    2012-01-01

    The purpose of this paper is to investigate the delay-dependent stability of discrete-time neural networks with interval time-varying delays. Based on the Lyapunov method, improved delay-dependent criteria for the stability of the networks are derived in terms of linear matrix inequalities (LMIs) by constructing a suitable Lyapunov-Krasovskii functional and utilizing the reciprocally convex approach. Also, a new activation condition which has not been considered in the literature is proposed and utilized for the derivation of stability criteria. Two numerical examples are given to illustrate the effectiveness of the proposed method.

  1. Detection of abnormal item based on time intervals for recommender systems.

    Science.gov (United States)

    Gao, Min; Yuan, Quan; Ling, Bin; Xiong, Qingyu

    2014-01-01

    With the rapid development of e-business, personalized recommendation has become a core competence for enterprises to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from "shilling" attacks. In recent years, research on shilling attacks has greatly improved. However, existing approaches suffer from serious problems of attack-model dependency and high computational cost. To solve these problems, an approach for the detection of abnormal items is proposed in this paper. First, two common features of all attack models are analyzed. A revised bottom-up discretized approach, based on time intervals and these features, is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies, based on the calculation of the chi-square distribution (χ²). We evaluated our approach on four types of items, defined according to the life cycles of these items. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling attack detection by narrowing down the suspicious users.
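
    The core statistical step can be sketched as follows (window length, rating scale, smoothing and thresholds are assumptions, not the paper's exact settings): compare the rating distribution of each time interval against the pooled remainder of the item's history with a chi-square test.

```python
import numpy as np
from scipy.stats import chi2_contingency

def interval_rating_counts(ratings, times, edges):
    """Count ratings 1..5 per time interval; ratings and times are parallel arrays, edges define intervals."""
    bins = np.digitize(times, edges) - 1
    counts = np.zeros((len(edges) - 1, 5), dtype=int)
    for b, r in zip(bins, ratings):
        if 0 <= b < counts.shape[0]:
            counts[b, r - 1] += 1
    return counts

def suspicious_intervals(counts, alpha=0.01):
    """Flag intervals whose rating distribution differs from the pooled remainder (chi-square test)."""
    flagged = []
    for i in range(counts.shape[0]):
        rest = counts.sum(axis=0) - counts[i]
        table = np.vstack([counts[i] + 1, rest + 1])      # +1 smoothing avoids all-zero columns
        _, p, _, _ = chi2_contingency(table)
        if p < alpha:
            flagged.append(i)
    return flagged

rng = np.random.default_rng(3)
times = rng.uniform(0, 100, 600)                      # days since item release (assumed)
ratings = rng.choice([1, 2, 3, 4, 5], 600, p=[0.1, 0.15, 0.3, 0.3, 0.15])
attack_times, attack_ratings = rng.uniform(40, 45, 40), np.full(40, 5)   # simulated push attack
times = np.concatenate([times, attack_times])
ratings = np.concatenate([ratings, attack_ratings]).astype(int)

edges = np.arange(0, 101, 5)                          # 5-day intervals
print("suspicious 5-day windows:", suspicious_intervals(interval_rating_counts(ratings, times, edges)))
```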

  2. Detection of Abnormal Item Based on Time Intervals for Recommender Systems

    Directory of Open Access Journals (Sweden)

    Min Gao

    2014-01-01

    With the rapid development of e-business, personalized recommendation has become a core competence for enterprises to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from "shilling" attacks. In recent years, research on shilling attacks has greatly improved. However, existing approaches suffer from serious problems of attack-model dependency and high computational cost. To solve these problems, an approach for the detection of abnormal items is proposed in this paper. First, two common features of all attack models are analyzed. A revised bottom-up discretized approach, based on time intervals and these features, is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies, based on the calculation of the chi-square distribution (χ²). We evaluated our approach on four types of items, defined according to the life cycles of these items. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling attack detection by narrowing down the suspicious users.

  3. A new criterion for global robust stability of interval neural networks with discrete time delays

    International Nuclear Information System (INIS)

    Li Chuandong; Chen Jinyu; Huang Tingwen

    2007-01-01

    This paper further studies global robust stability of a class of interval neural networks with discrete time delays. By introducing an equivalent transformation of interval matrices, a new criterion on global robust stability is established. In comparison with the results reported in the literature, the proposed approach leads to results with less restrictive conditions. Numerical examples are also worked through to illustrate our results

  4. Incidence of Interval Colorectal Cancer Among Inflammatory Bowel Disease Patients Undergoing Regular Colonoscopic Surveillance

    NARCIS (Netherlands)

    Mooiweer, Erik; van der Meulen-de Jong, Andrea E.; Ponsioen, Cyriel Y.; van der Woude, C. Janneke; van Bodegraven, Ad A.; Jansen, Jeroen M.; Mahmmod, Nofel; Kremer, Willemijn; Siersema, Peter D.; Oldenburg, Bas

    2015-01-01

    Surveillance is recommended for patients with long-term inflammatory bowel disease because they have an increased risk of colorectal cancer (CRC). To study the effectiveness of surveillance, we determined the incidence of CRC after negative findings from surveillance colonoscopies (interval CRC). We

  5. The geometric β-function in curved space-time under operator regularization

    Energy Technology Data Exchange (ETDEWEB)

    Agarwala, Susama [Mathematical Institute, Oxford University, Oxford OX2 6GG (United Kingdom)

    2015-06-15

    In this paper, I compare the generators of the renormalization group flow, or the geometric β-functions, for dimensional regularization and operator regularization. I then extend the analysis to show that the geometric β-function for a scalar field theory on a closed compact Riemannian manifold is defined on the entire manifold. I then extend the analysis to find the generator of the renormalization group flow to conformally coupled scalar-field theories on the same manifolds. The geometric β-function in this case is not defined.

  6. The geometric β-function in curved space-time under operator regularization

    International Nuclear Information System (INIS)

    Agarwala, Susama

    2015-01-01

    In this paper, I compare the generators of the renormalization group flow, or the geometric β-functions, for dimensional regularization and operator regularization. I then extend the analysis to show that the geometric β-function for a scalar field theory on a closed compact Riemannian manifold is defined on the entire manifold. I then extend the analysis to find the generator of the renormalization group flow to conformally coupled scalar-field theories on the same manifolds. The geometric β-function in this case is not defined

  7. The Perforation-Operation time Interval; An Important Mortality Indicator in Peptic Ulcer Perforation.

    Science.gov (United States)

    Surapaneni, Sushama; S, Rajkumar; Reddy A, Vijaya Bhaskar

    2013-05-01

    To find out the significance of the perforation-operation interval (POI) with respect to early prognosis in patients with peritonitis caused by peptic ulcer perforation. Case series. Place and Duration of the Study: Department of General Surgery, Konaseema Institute of Medical Sciences and RF, Amalapuram, Andhra Pradesh, India, from 2008-2011. This study included 150 patients with generalized peritonitis who were diagnosed to have Perforated Peptic Ulcers (PPUs). The diagnosis of the PPUs was established on the basis of the history, the clinical examination and the radiological findings. The perforation-operation interval was calculated from the time of onset of symptoms such as severe abdominal pain or vomiting until the time the patient was operated on. Out of the 150 patients, 134 were males and 16 were females, with a male-to-female ratio of 9:1. Their ages ranged between 25-70 years. Out of the 150 patients, 65 patients (43.3%) presented within 24 hours of the onset of severe abdominal pain (Group A), 27 patients (18%) presented between 24-48 hours of the onset of severe abdominal pain (Group B) and 58 patients (38.6%) presented after 48 hours (Group C). There was no mortality in Group A, and morbidity was higher in Group B and Group C. There were 15 deaths in Group C. The problem of peptic ulcer perforation, with its complications, can be decreased by decreasing the perforation-operation time interval, which, as per our study, appeared to be the single most important mortality and morbidity indicator in peptic ulcer perforation.

  8. Detection of surface electromyography recording time interval without muscle fatigue effect for biceps brachii muscle during maximum voluntary contraction.

    Science.gov (United States)

    Soylu, Abdullah Ruhi; Arpinar-Avsar, Pinar

    2010-08-01

    The effects of fatigue on maximum voluntary contraction (MVC) parameters were examined by using force and surface electromyography (sEMG) signals of the biceps brachii muscles (BBM) of 12 subjects. The purpose of the study was to find the sEMG time interval of the MVC recordings which is not affected by muscle fatigue. At least 10 s of force and sEMG signals of the BBM were recorded simultaneously during MVC. The subjects reached the maximum force level within 2 s by gradually increasing the force, and then contracted the BBM maximally. The time index of each sEMG and force signal was labeled with respect to the time index of the maximum force (i.e., after the time normalization, the 0 s time index of each sEMG or force signal corresponds to the maximum force point). The first 8 s of the sEMG and force signals were then divided into 0.5 s intervals. Mean force, median frequency (MF) and integrated EMG (iEMG) values were calculated for each interval. Amplitude normalization was performed by dividing the force signals by their mean values over the 0 s interval (i.e., -0.25 to 0.25 s). A similar amplitude normalization procedure was repeated for the iEMG and MF signals. Statistical analysis (Friedman test with Dunn's post hoc test) was performed on the time- and amplitude-normalized signals (MF, iEMG). Although the statistical results did not give significant information about the onset of muscle fatigue, linear regression (mean force vs. time) showed a decreasing slope (Pearson r = 0.9462), indicating that muscle fatigue starts after the 0 s time interval, as the muscles cannot maintain their peak force levels. This implies that the most reliable interval for MVC calculation which is not affected by muscle fatigue is from the onset of the EMG activity to the peak force time. The mean, SD, and range of this interval (excluding the 2 s gradual increase time) for the 12 subjects were 2353 ms, 1258 ms and 536-4186 ms, respectively. Exceeding this interval introduces estimation errors in the maximum amplitude calculations.
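
    The per-interval quantities used above can be computed as in this sketch (synthetic surrogate signal and an assumed sampling rate; not the authors' processing pipeline): iEMG as the integral of the rectified signal and median frequency as the frequency splitting the spectral power into equal halves.

```python
import numpy as np

FS = 1000                     # Hz, assumed sampling rate

def iemg(segment, fs=FS):
    """Integrated EMG: integral of the rectified signal over the segment (rectangular rule)."""
    return np.sum(np.abs(segment)) / fs

def median_frequency(segment, fs=FS):
    """Frequency below which half of the total spectral power of the segment lies."""
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    power = np.abs(np.fft.rfft(segment)) ** 2
    cum = np.cumsum(power)
    return freqs[np.searchsorted(cum, cum[-1] / 2.0)]

rng = np.random.default_rng(4)
emg = rng.standard_normal(8 * FS)          # 8 s surrogate sEMG (white-noise stand-in)

for k in range(0, 8 * FS, FS // 2):        # successive 0.5 s intervals
    seg = emg[k:k + FS // 2]
    print(f"{k / FS:4.1f}-{(k + FS // 2) / FS:4.1f} s  iEMG={iemg(seg):6.3f}  MF={median_frequency(seg):6.1f} Hz")
```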

  9. Nonparametric estimation in an "illness-death" model when all transition times are interval censored

    DEFF Research Database (Denmark)

    Frydman, Halina; Gerds, Thomas; Grøn, Randi

    2013-01-01

    We develop nonparametric maximum likelihood estimation for the parameters of an irreversible Markov chain on states {0,1,2} from the observations with interval censored times of 0 → 1, 0 → 2 and 1 → 2 transitions. The distinguishing aspect of the data is that, in addition to all transition times ...

  10. CMOS direct time interval measurement of long-lived luminescence lifetimes.

    Science.gov (United States)

    Yao, Lei; Yung, Ka Yi; Cheung, Maurice C; Chodavarapu, Vamsy P; Bright, Frank V

    2011-01-01

    We describe a Complementary Metal-Oxide Semiconductor (CMOS) Direct Time Interval Measurement (DTIM) Integrated Circuit (IC) to detect the decay (fall) time of the luminescence emission when analyte-sensitive luminophores are excited with an optical pulse. The CMOS DTIM IC includes a 14 × 14 phototransistor array, a transimpedance amplifier, a regulated gain amplifier, a fall-time detector, and a time-to-digital converter. We examined the DTIM system by measuring the emission lifetime of the oxygen-sensitive luminophore tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) ([Ru(dpp)3]2+) encapsulated in sol-gel-derived xerogel thin films. The DTIM system, fabricated using a TSMC 0.35 μm process, detects lifetimes from 4 μs to 14.4 μs but can be tuned to detect longer lifetimes. The system provides an 8-bit digital output proportional to the lifetime and consumes 4.5 mW of power from a 3.3 V DC supply. The CMOS system provides a useful platform for the development of reliable, robust, and miniaturized optical chemical sensors.
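
    The measurement can be mimicked offline with a simple fit (assumed sampling rate, lifetime and noise level; the actual IC measures the fall time directly in hardware): recover the lifetime from the logarithm of the decaying emission.

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 10e6                              # 10 MHz sampling (assumed)
tau_true = 6e-6                        # 6 microsecond luminescence lifetime (assumed)
t = np.arange(0, 50e-6, 1 / fs)        # 50 microseconds after the excitation pulse ends
signal = np.exp(-t / tau_true) + 0.01 * rng.standard_normal(t.size)

# Log-linear least-squares fit on the early, high-SNR part of the decay.
mask = signal > 0.05
slope, intercept = np.polyfit(t[mask], np.log(signal[mask]), 1)
print(f"estimated lifetime: {-1.0 / slope * 1e6:.2f} us (true {tau_true * 1e6:.1f} us)")
```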

  11. Zeta-function regularization approach to finite temperature effects in Kaluza-Klein space-times

    International Nuclear Information System (INIS)

    Bytsenko, A.A.; Vanzo, L.; Zerbini, S.

    1992-01-01

    In the framework of the heat-kernel approach to zeta-function regularization, the one-loop effective potential at finite temperature for scalar and spinor fields on Kaluza-Klein space-times of the form M^p × M_c^n, where M^p is p-dimensional Minkowski space-time, is evaluated in this paper. In particular, when the compact manifold is M_c^n = H^n/Γ, the Selberg trace formula associated with a discrete torsion-free group Γ of the n-dimensional Lobachevsky space H^n is used. An explicit representation for the thermodynamic potential valid for arbitrary temperature is found. As a result, a complete high-temperature expansion is presented and the roles of zero modes and topological contributions are discussed.

  12. Real time QRS complex detection using DFA and regular grammar.

    Science.gov (United States)

    Hamdi, Salah; Ben Abdallah, Asma; Bedoui, Mohamed Hedi

    2017-02-28

    Detection of the QRS complex, the sequence of Q, R, and S peaks, is a crucial procedure in electrocardiogram (ECG) processing and analysis. We propose a novel approach for QRS complex detection based on deterministic finite automata with the addition of some constraints. This paper confirms that regular grammar is useful for extracting QRS complexes and interpreting normalized ECG signals. A QRS is assimilated to a pair of adjacent peaks which meet certain criteria of standard deviation and duration. The proposed method was applied to several kinds of ECG signals from the standard MIT-BIH arrhythmia database. A total of 48 signals were used. For an input signal, several parameters were determined, such as QRS durations, RR distances, and peak amplitudes. The parameters σRR and σQRS were added to quantify the regularity of RR distances and QRS durations, respectively. The sensitivity rate of the suggested method was 99.74% and the specificity rate was 99.86%. Moreover, the variations of the sensitivity and specificity rates with the signal-to-noise ratio were evaluated. Regular grammar with the addition of some constraints, together with deterministic automata, proved functional for ECG signal diagnosis. Compared to statistical methods, the use of grammar provides satisfactory and competitive results, with indices comparable to or even better than those cited in the literature.
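
    A toy state-machine sketch of the grammar idea (the peak encoding, thresholds and names are assumptions, not the paper's automaton): scan a time-ordered sequence of detected peaks and accept adjacent peak pairs whose separation and amplitudes satisfy simple criteria as QRS candidates.

```python
def detect_qrs_pairs(peaks, max_separation=0.12, min_amplitude=0.4):
    """peaks: list of (time_s, amplitude) of detected local extrema, in time order.
    Two-state automaton: WAIT_FIRST -> WAIT_SECOND; accepting a pair emits a QRS candidate."""
    qrs, state, first = [], "WAIT_FIRST", None
    for t, amp in peaks:
        if state == "WAIT_FIRST":
            if abs(amp) >= min_amplitude:
                first, state = (t, amp), "WAIT_SECOND"
        elif state == "WAIT_SECOND":
            if abs(amp) >= min_amplitude and (t - first[0]) <= max_separation:
                qrs.append((first[0], t))            # accepted pair of adjacent peaks = QRS candidate
                state, first = "WAIT_FIRST", None
            else:
                # restart from this peak if it is large enough, otherwise reset completely
                first, state = ((t, amp), "WAIT_SECOND") if abs(amp) >= min_amplitude else (None, "WAIT_FIRST")
    return qrs

peaks = [(0.10, 0.9), (0.15, -0.7), (0.50, 0.1), (0.90, 0.8), (0.96, -0.6)]
print(detect_qrs_pairs(peaks))    # -> [(0.10, 0.15), (0.90, 0.96)]
```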

  13. Development of a Regularized Dynamic System Response Curve for Real-Time Flood Forecasting Correction

    Directory of Open Access Journals (Sweden)

    Yiqun Sun

    2018-04-01

    The dynamic system response curve (DSRC) is commonly applied as a real-time flood forecasting error correction method to improve the accuracy of real-time flood forecasting. It has been widely recognized that the ordinary least squares (OLS/LS) method employed by DSRC breaks down for ill-posed problems, and therefore the DSRC method may suffer a deterioration in performance caused by meaningless solutions. To address this problem, a diagnostic theoretical analysis was conducted to investigate the relationship between the numerical solution of the Fredholm equation of the first kind and the DSRC method. The analysis clearly demonstrates the origin of the problem and has implications for an improved approach. To overcome the instability, a new method using regularization techniques (Tikhonov regularization and the L-curve criterion) is proposed. Moreover, in this study, to improve the performance of hydrological models, the new method is used as an error correction method to correct a variable from a hydrological model. The proposed method incorporates information from the hydrological model structure. Based on the analysis of the hydrological model, the free water storage of the Xinanjiang rainfall-runoff (XAJ) model is corrected to improve the model's performance. A numerical example and a real case study are presented to compare the two methods. Results from the numerical example indicate that the mean Nash-Sutcliffe efficiency (NSE) of the regularized DSRC method (RDSRC) decreased from 0.99 to 0.55, while the mean NSE of DSRC decreased from 0.98 to -1.84 when the noise level was increased. The overall performance measured by four different criteria clearly demonstrates the robustness of the RDSRC method. Similar results were obtained for the real case study. The mean NSE of 35 flood events obtained by the RDSRC method was 0.92, which is significantly higher than the mean NSE of DSRC (0.7). The results demonstrate that the RDSRC method is much
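
    The two ingredients named above can be sketched on a generic ill-posed linear problem (toy data, not the DSRC formulation): Tikhonov-regularized solutions over a grid of λ, with the L-curve corner chosen as the point of maximum curvature in log-log space.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)   # Hilbert matrix: classic ill-posed case
x_true = np.sin(np.linspace(0, np.pi, n))
b = A @ x_true + 1e-8 * rng.standard_normal(n)

def tikhonov(A, b, lam):
    """Solve min ||Ax - b||^2 + lam^2 ||x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(A.shape[1]), A.T @ b)

lams = np.logspace(-12, 0, 60)
res_norm = np.array([np.linalg.norm(A @ tikhonov(A, b, l) - b) for l in lams])
sol_norm = np.array([np.linalg.norm(tikhonov(A, b, l)) for l in lams])

# L-curve: pick the corner (maximum curvature) of (log residual norm, log solution norm).
x, y = np.log(res_norm), np.log(sol_norm)
dx, dy = np.gradient(x), np.gradient(y)
ddx, ddy = np.gradient(dx), np.gradient(dy)
curvature = np.abs(dx * ddy - dy * ddx) / ((dx**2 + dy**2) ** 1.5 + 1e-30)
lam_corner = lams[np.argmax(curvature)]

x_reg = tikhonov(A, b, lam_corner)
print(f"L-curve corner lambda = {lam_corner:.2e}")
print(f"relative error with regularization   : {np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true):.3f}")
print(f"relative error without regularization: "
      f"{np.linalg.norm(np.linalg.solve(A, b) - x_true) / np.linalg.norm(x_true):.3f}")
```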

  14. Comparison of equilibrium radionuclide and contrast angiographic measurements of left ventricular peak ejection and filling rates and their time intervals

    Energy Technology Data Exchange (ETDEWEB)

    Sugrue, D.D.; Dickie, S.; Newman, H.; Myers, M.J.; Lavender, J.P.; McKenna, W.J. (Royal Postgraduate Medical School, London (UK))

    1984-10-01

    A comparison has been made of the equilibrium radionuclide and contrast angiographic estimates of normalized peak rates of ejection (PER) and filling (PFR) and their time intervals in twenty-one patients with cardiac disorders. Contrast angiographic and radionuclide measurements of left ventricular ejection fraction (LVEF), PER and PFR correlated well but time intervals correlated poorly. Mean values for radionuclide LVEF, PER and PFR were significantly lower and radionuclide time intervals were significantly longer compared to contrast angiography measurements.

  15. Existence, regularity and representation of solutions of time fractional wave equations

    Directory of Open Access Journals (Sweden)

    Valentin Keyantuo

    2017-09-01

    We study the solvability of the fractional order inhomogeneous Cauchy problem $$ \mathbb{D}_t^\alpha u(t)=Au(t)+f(t), \quad t>0,\; 1<\alpha\le 2, $$ where A is a closed linear operator in some Banach space X and $f:[0,\infty)\to X$ a given function. Operator families associated with this problem are defined and their regularity properties are investigated. In the case where A is the generator of a $\beta$-times integrated cosine family $(C_\beta(t))$, we derive explicit representations of mild and classical solutions of the above problem in terms of the integrated cosine family. We include applications to elliptic operators with Dirichlet, Neumann or Robin type boundary conditions on $L^p$-spaces and on the space of continuous functions.

  16. An integrated theory of prospective time interval estimation : The role of cognition, attention, and learning

    NARCIS (Netherlands)

    Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John

    A theory of prospective time perception is introduced and incorporated as a module in an integrated theory of cognition, thereby extending existing theories and allowing predictions about attention and learning. First, a time perception module is established by fitting existing datasets (interval

  17. Heuristic algorithms for the minmax regret flow-shop problem with interval processing times.

    Science.gov (United States)

    Ćwik, Michał; Józefczyk, Jerzy

    2018-01-01

    An uncertain version of the permutation flow-shop problem with unlimited buffers and the makespan as a criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty. Consequently, the minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First of all, a greedy procedure is used for calculating the criterion's value, as such a calculation is an NP-hard problem itself. Moreover, the lower bound is used instead of solving the internal deterministic flow-shop problem. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with previously elaborated heuristic algorithms based on the evolutionary and the middle-interval approaches. The conducted computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion and the computation time. The Wilcoxon paired-rank statistical test confirmed this conclusion.
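
    For very small instances the quantities involved can be computed by brute force, which makes the problem statement concrete (illustrative toy data; the paper's heuristics exist precisely to avoid this enumeration): permutation flow-shop makespan, worst-case regret over the extreme scenarios of the interval processing times, and the minmax-regret schedule.

```python
from itertools import permutations, product

def makespan(perm, p):
    """Permutation flow-shop makespan; p[j][m] = processing time of job j on machine m."""
    n_machines = len(p[0])
    completion = [0.0] * n_machines
    for j in perm:
        for m in range(n_machines):
            completion[m] = max(completion[m], completion[m - 1] if m else 0.0) + p[j][m]
    return completion[-1]

def max_regret(perm, intervals, jobs):
    """Worst-case regret of `perm` over all extreme scenarios (each time at its lower or upper bound)."""
    n_machines = len(intervals[0])
    flat = [iv for job in intervals for iv in job]            # flattened (lo, hi) pairs, job-major order
    worst = 0.0
    for corner in product(*flat):
        p = [list(corner[j * n_machines:(j + 1) * n_machines]) for j in range(len(intervals))]
        best = min(makespan(q, p) for q in permutations(jobs))
        worst = max(worst, makespan(perm, p) - best)
    return worst

# Two machines, three jobs, interval processing times (lo, hi) -- assumed toy data.
intervals = [[(2, 4), (3, 5)], [(1, 2), (4, 6)], [(3, 3), (2, 4)]]
jobs = range(len(intervals))
best_perm = min(permutations(jobs), key=lambda q: max_regret(q, intervals, jobs))
print("minmax-regret schedule:", best_perm)
```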

  18. Neutron generation time of the reactor 'crocus' by an interval distribution method for counts collected by two detectors

    International Nuclear Information System (INIS)

    Haldy, P.-A.; Chikouche, M.

    1975-01-01

    The distribution of time intervals between a count in one neutron detector and the subsequent event registered in a second one is considered. A 'four interval' probability generating function was derived, by means of which the expression for the distribution of the time intervals, lasting from the triggering detection in the first detector to the subsequent count in the second one, could be obtained. The experimental work was conducted in the zero-power thermal reactor Crocus, using a neutron source provided by spontaneous fission, a BF3 counter as the first detector and a 3He detector as the second instrument. (U.K.)

  19. Cardiac C-arm computed tomography using a 3D + time ROI reconstruction method with spatial and temporal regularization

    Energy Technology Data Exchange (ETDEWEB)

    Mory, Cyril, E-mail: cyril.mory@philips.com [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, F-69621 Villeurbanne Cedex (France); Philips Research Medisys, 33 rue de Verdun, 92156 Suresnes (France); Auvray, Vincent; Zhang, Bo [Philips Research Medisys, 33 rue de Verdun, 92156 Suresnes (France); Grass, Michael; Schäfer, Dirk [Philips Research, Röntgenstrasse 24–26, D-22335 Hamburg (Germany); Chen, S. James; Carroll, John D. [Department of Medicine, Division of Cardiology, University of Colorado Denver, 12605 East 16th Avenue, Aurora, Colorado 80045 (United States); Rit, Simon [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1 (France); Centre Léon Bérard, 28 rue Laënnec, F-69373 Lyon (France); Peyrin, Françoise [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, F-69621 Villeurbanne Cedex (France); X-ray Imaging Group, European Synchrotron, Radiation Facility, BP 220, F-38043 Grenoble Cedex (France); Douek, Philippe; Boussel, Loïc [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1 (France); Hospices Civils de Lyon, 28 Avenue du Doyen Jean Lépine, 69500 Bron (France)

    2014-02-15

    Purpose: Reconstruction of the beating heart in 3D + time in the catheter laboratory using only the available C-arm system would improve diagnosis, guidance, device sizing, and outcome control for intracardiac interventions, e.g., electrophysiology, valvular disease treatment, structural or congenital heart disease. To obtain such a reconstruction, the patient's electrocardiogram (ECG) must be recorded during the acquisition and used in the reconstruction. In this paper, the authors present a 4D reconstruction method aiming to reconstruct the heart from a single sweep 10 s acquisition. Methods: The authors introduce the 4D RecOnstructiOn using Spatial and TEmporal Regularization (short 4D ROOSTER) method, which reconstructs all cardiac phases at once, as a 3D + time volume. The algorithm alternates between a reconstruction step based on conjugate gradient and four regularization steps: enforcing positivity, averaging along time outside a motion mask that contains the heart and vessels, 3D spatial total variation minimization, and 1D temporal total variation minimization. Results: 4D ROOSTER recovers the different temporal representations of a moving Shepp and Logan phantom, and outperforms both ECG-gated simultaneous algebraic reconstruction technique and prior image constrained compressed sensing on a clinical case. It generates 3D + time reconstructions with sharp edges which can be used, for example, to estimate the patient's left ventricular ejection fraction. Conclusions: 4D ROOSTER can be applied for human cardiac C-arm CT, and potentially in other dynamic tomography areas. It can easily be adapted to other problems as regularization is decoupled from projection and back projection.

  20. Cardiac C-arm computed tomography using a 3D + time ROI reconstruction method with spatial and temporal regularization

    International Nuclear Information System (INIS)

    Mory, Cyril; Auvray, Vincent; Zhang, Bo; Grass, Michael; Schäfer, Dirk; Chen, S. James; Carroll, John D.; Rit, Simon; Peyrin, Françoise; Douek, Philippe; Boussel, Loïc

    2014-01-01

    Purpose: Reconstruction of the beating heart in 3D + time in the catheter laboratory using only the available C-arm system would improve diagnosis, guidance, device sizing, and outcome control for intracardiac interventions, e.g., electrophysiology, valvular disease treatment, structural or congenital heart disease. To obtain such a reconstruction, the patient's electrocardiogram (ECG) must be recorded during the acquisition and used in the reconstruction. In this paper, the authors present a 4D reconstruction method aiming to reconstruct the heart from a single sweep 10 s acquisition. Methods: The authors introduce the 4D RecOnstructiOn using Spatial and TEmporal Regularization (short 4D ROOSTER) method, which reconstructs all cardiac phases at once, as a 3D + time volume. The algorithm alternates between a reconstruction step based on conjugate gradient and four regularization steps: enforcing positivity, averaging along time outside a motion mask that contains the heart and vessels, 3D spatial total variation minimization, and 1D temporal total variation minimization. Results: 4D ROOSTER recovers the different temporal representations of a moving Shepp and Logan phantom, and outperforms both ECG-gated simultaneous algebraic reconstruction technique and prior image constrained compressed sensing on a clinical case. It generates 3D + time reconstructions with sharp edges which can be used, for example, to estimate the patient's left ventricular ejection fraction. Conclusions: 4D ROOSTER can be applied for human cardiac C-arm CT, and potentially in other dynamic tomography areas. It can easily be adapted to other problems as regularization is decoupled from projection and back projection

  1. Multivariate interval-censored survival data

    DEFF Research Database (Denmark)

    Hougaard, Philip

    2014-01-01

    Interval censoring means that an event time is only known to lie in an interval (L,R], with L the last examination time before the event, and R the first after. In the univariate case, parametric models are easily fitted, whereas for non-parametric models, the mass is placed on some intervals, de...

  2. RiTE: Providing On-Demand Data for Right-Time Data Warehousing

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach; Lehner, Wolfgang

    2008-01-01

    Data warehouses (DWs) have traditionally been loaded with data at regular time intervals, e.g., monthly, weekly, or daily, using fast bulk loading techniques. Recently, the trend is to insert all (or only some) new source data very quickly into DWs, called near-realtime DWs (right-time DWs...

  3. Theoretical implications of quantitative properties of interval timing and probability estimation in mouse and rat.

    Science.gov (United States)

    Kheifets, Aaron; Freestone, David; Gallistel, C R

    2017-07-01

    In three experiments with mice (Mus musculus) and rats (Rattus norvegicus), we used a switch paradigm to measure quantitative properties of the interval-timing mechanism. We found that: 1) Rodents adjusted the precision of their timed switches in response to changes in the interval between the short and long feed latencies (the temporal goalposts). 2) The variability in the timing of the switch response was reduced or unchanged in the face of large trial-to-trial random variability in the short and long feed latencies. 3) The adjustment in the distribution of switch latencies in response to changes in the relative frequency of short and long trials was sensitive to the asymmetry in the Kullback-Leibler divergence. The three results suggest that durations are represented with adjustable precision, that they are timed by multiple timers, and that there is a trial-by-trial (episodic) record of feed latencies in memory. © 2017 Society for the Experimental Analysis of Behavior.
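
    A small numeric illustration of the Kullback-Leibler asymmetry invoked in result 3 (the probabilities are arbitrary stand-ins for the relative frequencies of short and long trials, not values from the experiments):

        import numpy as np

        def kl(p, q):
            """Kullback-Leibler divergence D(p || q) for discrete distributions."""
            p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
            return float(np.sum(p * np.log(p / q)))

        p = [0.8, 0.2]   # e.g., 80% short trials, 20% long trials
        q = [0.5, 0.5]   # e.g., equal base rates
        print(kl(p, q), kl(q, p))   # about 0.193 vs. 0.223: the divergence is asymmetric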

  4. 29 CFR 779.18 - Regular rate.

    Science.gov (United States)

    2010-07-01

    ... employee under subsection (a) or in excess of the employee's normal working hours or regular working hours... Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR STATEMENTS OF GENERAL POLICY OR... not less than one and one-half times their regular rates of pay. Section 7(e) of the Act defines...

  5. Coordinate-invariant regularization

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1987-01-01

    A general phase-space framework for coordinate-invariant regularization is given. The development is geometric, with all regularization contained in regularized DeWitt Superstructures on field deformations. Parallel development of invariant coordinate-space regularization is obtained by regularized functional integration of the momenta. As representative examples of the general formulation, the regularized general non-linear sigma model and regularized quantum gravity are discussed. copyright 1987 Academic Press, Inc

  6. An analyzer for pulse-interval times to study high-order effects in the processing of nuclear detector signals

    International Nuclear Information System (INIS)

    Denecke, B.; Jonge, S. de

    1998-01-01

    An electronic device to measure interval time density distributions of subsequent pulses in nuclear detectors and their electronics is described. The device has a pair-pulse resolution of 10 ns and 25 ns for 3 subsequent input signals. The conversion range is 4096 channels and the lowest channel width is 10 ns. Counter dead times, single and in series were studied and compared with the statistical model. True count rates were obtained from an exponential fit through the interval-time distribution
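
    A hedged sketch of the final step mentioned above, recovering a true count rate from an interval-time distribution (the rate, dead time, and simulated data are assumptions): for Poisson-distributed pulses the interval density is rate*exp(-rate*t), so a log-linear fit to the histogram of intervals longer than the dead time returns the true rate even though the shortest intervals are lost.

        import numpy as np

        rng = np.random.default_rng(0)
        true_rate = 5.0e4                      # counts per second (assumed)
        dead_time = 1.0e-6                     # 1 microsecond dead time (assumed)
        intervals = rng.exponential(1.0 / true_rate, size=200_000)
        observed = intervals[intervals > dead_time]        # the shortest intervals are not recorded

        counts, edges = np.histogram(observed, bins=200)
        centers = 0.5 * (edges[:-1] + edges[1:])
        keep = counts > 0
        slope, _ = np.polyfit(centers[keep], np.log(counts[keep]), 1)
        print(f"estimated true rate: {-slope:.0f} per second (simulated: {true_rate:.0f})")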

  7. Motor Synchronization in Patients With Schizophrenia: Preserved Time Representation With Abnormalities in Predictive Timing

    Directory of Open Access Journals (Sweden)

    Hélène Wilquin

    2018-05-01

    Objective: Basic temporal dysfunctions have been described in patients with schizophrenia, which may impact their ability to connect and synchronize with the outer world. The present study was conducted with the aim of distinguishing between interval timing and synchronization difficulties and, more generally, disturbances in the spatial-temporal organization of voluntary actions. A new sensorimotor synchronization task was developed to test these abilities. Method: Twenty-four chronic schizophrenia patients matched with 27 controls performed a spatial-tapping task in which finger taps were to be produced in synchrony with a regular metronome to six visual targets presented around a virtual circle on a tactile screen. Isochronous (time intervals of 500 ms) and non-isochronous (alternating time intervals of 300/600 ms) auditory sequences were presented. The capacity to produce time intervals accurately versus the ability to synchronize one's own actions (taps) with external events (tones) was measured. Results: Patients with schizophrenia were able to produce the tapping patterns of both isochronous and non-isochronous auditory sequences as accurately as controls, producing inter-response intervals close to the expected intervals of 500 and 900 ms, respectively. However, the synchronization performance revealed significantly more positive asynchrony means (but similar variances) in the patient group than in the control group for both types of auditory sequences. Conclusion: The pattern of results suggests that patients with schizophrenia are able to perceive and produce both simple and complex sequences of time intervals but are impaired in the ability to synchronize their actions with external events. These findings suggest a specific deficit in predictive timing, which may be at the core of early symptoms previously described in schizophrenia.

  8. An Integrated Theory of Prospective Time Interval Estimation: The Role of Cognition, Attention, and Learning

    Science.gov (United States)

    Taatgen, Niels A.; van Rijn, Hedderik; Anderson, John

    2007-01-01

    A theory of prospective time perception is introduced and incorporated as a module in an integrated theory of cognition, thereby extending existing theories and allowing predictions about attention and learning. First, a time perception module is established by fitting existing datasets (interval estimation and bisection and impact of secondary…

  9. Interval stability for complex systems

    Science.gov (United States)

    Klinshov, Vladimir V.; Kirillov, Sergey; Kurths, Jürgen; Nekorkin, Vladimir I.

    2018-04-01

    Stability of dynamical systems against strong perturbations is an important problem of nonlinear dynamics relevant to many applications in various areas. Here, we develop a novel concept of interval stability, referring to the behavior of the perturbed system during a finite time interval. Based on this concept, we suggest new measures of stability, namely interval basin stability (IBS) and interval stability threshold (IST). IBS characterizes the likelihood that the perturbed system returns to the stable regime (attractor) in a given time. IST provides the minimal magnitude of the perturbation capable of disrupting the stable regime for a given interval of time. The suggested measures provide important information about the system's susceptibility to external perturbations, which may be useful for practical applications. Moreover, from a theoretical viewpoint the interval stability measures are shown to bridge the gap between linear and asymptotic stability. We also suggest numerical algorithms for quantification of the interval stability characteristics and demonstrate their potential for several dynamical systems of various kinds, such as power grids and neural networks.
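
    A toy estimate of the interval basin stability (IBS) defined above, for the one-dimensional bistable system dx/dt = x - x^3 with attractors at +1 and -1 (the system, perturbation magnitude, time window, and tolerance are arbitrary choices for this sketch, not taken from the paper):

        import numpy as np

        def simulate(x0, T=5.0, dt=0.01):
            """Forward-Euler integration of dx/dt = x - x**3."""
            x = x0
            for _ in range(int(T / dt)):
                x = x + dt * (x - x**3)
            return x

        def interval_basin_stability(x_star=1.0, magnitude=1.5, T=5.0, n=2000, tol=0.1, seed=1):
            """Fraction of random perturbations from which the state returns to x_star within time T."""
            rng = np.random.default_rng(seed)
            perturbations = rng.uniform(-magnitude, magnitude, size=n)
            back = [abs(simulate(x_star + p, T) - x_star) < tol for p in perturbations]
            return float(np.mean(back))

        print("IBS estimate:", interval_basin_stability())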

  10. Global Robust Stability of Switched Interval Neural Networks with Discrete and Distributed Time-Varying Delays of Neural Type

    Directory of Open Access Journals (Sweden)

    Huaiqin Wu

    2012-01-01

    By combining the theories of switched systems and interval neural networks, the mathematical model of switched interval neural networks with discrete and distributed time-varying delays of neural type is presented. A set of interval parameter-uncertain neural networks with discrete and distributed time-varying delays of neural type is used as the individual subsystems, and an arbitrary switching rule is assumed to coordinate the switching between these networks. By applying the augmented Lyapunov-Krasovskii functional approach and linear matrix inequality (LMI) techniques, a delay-dependent criterion, expressed in terms of LMIs, is derived to ensure that such switched interval neural networks are globally asymptotically robustly stable. The unknown gain matrix is determined by solving these delay-dependent LMIs. Finally, an illustrative example is given to demonstrate the validity of the theoretical results.

  11. Regular variation on measure chains

    Czech Academy of Sciences Publication Activity Database

    Řehák, Pavel; Vitovec, J.

    2010-01-01

    Roč. 72, č. 1 (2010), s. 439-448 ISSN 0362-546X R&D Projects: GA AV ČR KJB100190701 Institutional research plan: CEZ:AV0Z10190503 Keywords : regularly varying function * regularly varying sequence * measure chain * time scale * embedding theorem * representation theorem * second order dynamic equation * asymptotic properties Subject RIV: BA - General Mathematics Impact factor: 1.279, year: 2010 http://www.sciencedirect.com/science/article/pii/S0362546X09008475

  12. Two-sorted Point-Interval Temporal Logics

    DEFF Research Database (Denmark)

    Balbiani, Philippe; Goranko, Valentin; Sciavicco, Guido

    2011-01-01

    There are two natural and well-studied approaches to temporal ontology and reasoning: point-based and interval-based. Usually, interval-based temporal reasoning deals with points as particular, duration-less intervals. Here we develop an explicitly two-sorted point-interval temporal logical framework whereby time instants (points) and time periods (intervals) are considered on a par, and the perspective can shift between them within the formal discourse. We focus on fragments involving only modal operators that correspond to the inter-sort relations between points and intervals. We analyze...

  13. Comparative evaluation of nickel discharge from brackets in artificial saliva at different time intervals.

    Science.gov (United States)

    Jithesh, C; Venkataramana, V; Penumatsa, Narendravarma; Reddy, S N; Poornima, K Y; Rajasigamani, K

    2015-08-01

    To determine and compare the nickel release from three different orthodontic brackets in artificial saliva of different pH at different time intervals. Twenty-seven samples of three different orthodontic brackets were selected and grouped as 1, 2, and 3. Each group was divided into three subgroups depending on the type of orthodontic bracket, salivary pH, and time interval. The nickel release from each subgroup was analyzed using an inductively coupled plasma atomic emission spectrophotometer (Perkin Elmer, Optima 2100 DV, USA). Quantitative analysis of nickel was performed three times, and the mean value was used as the result. ANOVA (F-test) was used to test the significance of differences among the groups at the 0.05 level (P < 0.05). Recycled stainless steel brackets showed the highest release at pH 4.2 at all time intervals except 120 h. The study results show that the nickel release from recycled stainless steel brackets is the highest, whereas metal-slot ceramic brackets release significantly less nickel. Therefore, recycled stainless steel brackets should not be used for nickel-allergic patients; metal-slot ceramic brackets are advisable.

  14. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  15. A network of spiking neurons that can represent interval timing: mean field analysis.

    Science.gov (United States)

    Gavornik, Jeffrey P; Shouval, Harel Z

    2011-04-01

    Despite the vital importance of our ability to accurately process and encode temporal information, the underlying neural mechanisms are largely unknown. We have previously described a theoretical framework that explains how temporal representations, similar to those reported in the visual cortex, can form in locally recurrent cortical networks as a function of reward modulated synaptic plasticity. This framework allows networks of both linear and spiking neurons to learn the temporal interval between a stimulus and paired reward signal presented during training. Here we use a mean field approach to analyze the dynamics of non-linear stochastic spiking neurons in a network trained to encode specific time intervals. This analysis explains how recurrent excitatory feedback allows a network structure to encode temporal representations.

  16. Contrasting Perspectives of Anesthesiologists and Gastroenterologists on the Optimal Time Interval between Bowel Preparation and Endoscopic Sedation

    Directory of Open Access Journals (Sweden)

    Deepak Agrawal

    2015-01-01

    Background. The optimal time interval between the last ingestion of bowel prep and sedation for colonoscopy remains controversial, despite guidelines that sedation can be administered 2 hours after consumption of clear liquids. Objective. To determine current practice patterns among anesthesiologists and gastroenterologists regarding the optimal time interval for sedation after last ingestion of bowel prep and to understand the rationale underlying their beliefs. Design. Questionnaire survey of anesthesiologists and gastroenterologists in the USA. The questions were focused on the preferred time interval of endoscopy after a polyethylene glycol based preparation in routine cases and select conditions. Results. Responses were received from 109 anesthesiologists and 112 gastroenterologists. 96% of anesthesiologists recommended waiting longer than 2 hours until sedation, in contrast to only 26% of gastroenterologists. The main reason for waiting >2 hours was that PEG was not considered a clear liquid. Most anesthesiologists, but not gastroenterologists, waited longer in patients with history of diabetes or reflux. Conclusions. Anesthesiologists and gastroenterologists do not agree on the optimal interval for sedation after last drink of bowel prep. Most anesthesiologists prefer to wait longer than the recommended 2 hours for clear liquids. The data suggest a need for clearer guidelines on this issue.

  17. Prognostic value of cardiac time intervals measured by tissue Doppler imaging M-mode in the general population

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Mogelvang, Rasmus; Jensen, Jan Skov

    2015-01-01

    In a large prospective community-based study, cardiac function was evaluated in 1915 participants by both conventional echocardiography and TDI. The cardiac time intervals, including the isovolumic relaxation time (IVRT), isovolumic contraction time (IVCT) and ejection time (ET), were obtained by TDI M...

  18. The delayed reproduction of long time intervals defined by innocuous thermal sensation.

    Science.gov (United States)

    Khoshnejad, Mina; Martinu, Kristina; Grondin, Simon; Rainville, Pierre

    2016-04-01

    The presence of discrete events during an interval to be estimated generally causes a dilation of perceived duration (event-filling effect). Here, we investigated this phenomenon in the thermal modality using multi-seconds (19 s) innocuous cool stimuli that were either constant (continuous interval) or fluctuating to create three discrete sensory events (segmented interval). Moreover, we introduced a delay following stimulus offset, before the reproduction phase, to allow for a direct comparison with our recent study showing an underestimation of duration in a delayed reproduction task of heat pain sensations (Khoshnejad et al. in Pain 155:581-590, 2014. doi: 10.1016/j.pain.2013.12.015 ). The event-filling effect was tested by comparing the delayed reproduction of the segmented and the continuous stimuli in experimental conditions asking participants to (1) reproduce the dynamics of the sensation (i.e., changes in sensory intensity over time) or (2) reproduce only the interval duration (i.e., sensation onset-to-offset). A perceptual (control) condition required participants to report changes in sensation concurrently with the stimulus. Results of the dynamic task confirmed the underestimation of duration in the delayed reproduction task, but this effect was only found with the continuous and not with the segmented stimulus. This implies that the dilation of duration produced by segmentation might compensate for the underestimation of duration in this delayed reproduction task. However, this temporal dilation effect was only observed when participants were required to attend and reproduce the dynamics of sensation. These results suggest that the event-filling effect can be observed in the thermal sensory modality and that attention directed toward changes in sensory intensity might contribute to this effect.

  19. Determining diabetic retinopathy screening interval based on time from no retinopathy to laser therapy.

    Science.gov (United States)

    Hughes, Daniel; Nair, Sunil; Harvey, John N

    2017-12-01

    Objectives To determine the necessary screening interval for retinopathy in diabetic patients with no retinopathy based on time to laser therapy and to assess long-term visual outcome following screening. Methods In a population-based community screening programme in North Wales, 2917 patients were followed until death or for approximately 12 years. At screening, 2493 had no retinopathy; 424 had mostly minor degrees of non-proliferative retinopathy. Data on timing of first laser therapy and visual outcome following screening were obtained from local hospitals and ophthalmology units. Results Survival analysis showed that very few of the no retinopathy at screening group required laser therapy in the early years compared with the non-proliferative retinopathy group ( p retinopathy at screening group required laser therapy, and at three years 0.2% (cumulative), lower rates of treatment than have been suggested by analyses of sight-threatening retinopathy determined photographically. At follow-up (mean 7.8 ± 4.6 years), mild to moderate visual impairment in one or both eyes due to diabetic retinopathy was more common in those with retinopathy at screening (26% vs. 5%, p diabetes occurred in only 1 in 1000. Conclusions Optimum screening intervals should be determined from time to active treatment. Based on requirement for laser therapy, the screening interval for diabetic patients with no retinopathy can be extended to two to three years. Patients who attend for retinal screening and treatment who have no or non-proliferative retinopathy now have a very low risk of eventual blindness from diabetes.

  20. Spatially-Variant Tikhonov Regularization for Double-Difference Waveform Inversion

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Youzuo [Los Alamos National Laboratory; Huang, Lianjie [Los Alamos National Laboratory; Zhang, Zhigang [Los Alamos National Laboratory

    2011-01-01

    Double-difference waveform inversion is a potential tool for quantitative monitoring of geologic carbon storage. It jointly inverts time-lapse seismic data for changes in reservoir geophysical properties. Due to the ill-posedness of waveform inversion, it is a great challenge to obtain reservoir changes accurately and efficiently, particularly when using time-lapse seismic reflection data. Regularization techniques can be utilized to address the issue of ill-posedness. The regularization parameter controls the smoothness of inversion results. A constant regularization parameter is normally used in waveform inversion, and an optimal regularization parameter has to be selected. The resulting images are a trade-off among regions with different smoothness or noise levels; therefore they are over-regularized in some regions and under-regularized in others. In this paper, we employ a spatially-variant parameter in the Tikhonov regularization scheme used in double-difference waveform tomography to improve the inversion accuracy and robustness. We compare the results obtained using a spatially-variant parameter with those obtained using a constant regularization parameter and those produced without any regularization. We observe that, utilizing a spatially-variant regularization scheme, the target regions are well reconstructed while the noise is reduced in the other regions. We show that the spatially-variant regularization scheme provides the flexibility to regularize local regions based on a priori information without increasing computational costs and the computer memory requirement.
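
    A toy one-dimensional illustration of the spatially-variant Tikhonov idea (a sketch under assumed operators and weights, not the authors' double-difference inversion code): a per-sample regularization weight smooths the background strongly while leaving an assumed target zone lightly regularized.

        import numpy as np

        n = 120
        x_true = np.zeros(n)
        x_true[50:70] = 1.0                                  # localized "time-lapse change"
        A = np.tril(np.ones((n, n))) / n                     # simple linear forward operator
        rng = np.random.default_rng(0)
        b = A @ x_true + 1e-3 * rng.standard_normal(n)       # noisy data

        L = np.diff(np.eye(n), axis=0)                       # first-difference (smoothing) operator
        lam = np.full(n - 1, 5.0)                            # strong regularization everywhere...
        lam[40:80] = 0.5                                     # ...relaxed around the target region
        WL = np.diag(lam) @ L

        # Normal equations for  min ||A x - b||^2 + ||diag(lam) L x||^2
        x_hat = np.linalg.solve(A.T @ A + WL.T @ WL, A.T @ b)
        print("rms model error:", float(np.sqrt(np.mean((x_hat - x_true) ** 2))))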

  1. Selection of regularization parameter for l1-regularized damage detection

    Science.gov (United States)

    Hou, Rongrong; Xia, Yong; Bao, Yuequan; Zhou, Xiaoqing

    2018-06-01

    The l1 regularization technique has been developed for structural health monitoring and damage detection through employing the sparsity condition of structural damage. The regularization parameter, which controls the trade-off between data fidelity and solution size of the regularization problem, exerts a crucial effect on the solution. However, the l1 regularization problem has no closed-form solution, and the regularization parameter is usually selected by experience. This study proposes two strategies of selecting the regularization parameter for the l1-regularized damage detection problem. The first method utilizes the residual and solution norms of the optimization problem and ensures that they are both small. The other method is based on the discrepancy principle, which requires that the variance of the discrepancy between the calculated and measured responses is close to the variance of the measurement noise. The two methods are applied to a cantilever beam and a three-story frame. A range of the regularization parameter, rather than one single value, can be determined. When the regularization parameter in this range is selected, the damage can be accurately identified even for multiple damage scenarios. This range also indicates the sensitivity degree of the damage identification problem to the regularization parameter.
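
    A hedged sketch of the second (discrepancy-principle) strategy on a generic sparse-recovery problem, using scikit-learn's Lasso as a stand-in solver (the synthetic model, noise level, and parameter grid are assumptions, not the structural damage-detection setup of the paper): the regularization parameter is chosen so that the residual variance is closest to the known measurement-noise variance.

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n, p, sigma = 100, 60, 0.05
        X = rng.standard_normal((n, p))
        w_true = np.zeros(p)
        w_true[[3, 17, 42]] = [1.0, -0.7, 0.5]               # sparse "damage" vector
        y = X @ w_true + sigma * rng.standard_normal(n)

        best = None
        for alpha in np.logspace(-4, 0, 40):                 # candidate regularization parameters
            w = Lasso(alpha=alpha, max_iter=20000).fit(X, y).coef_
            gap = abs(np.var(y - X @ w) - sigma**2)          # discrepancy from the noise variance
            if best is None or gap < best[0]:
                best = (gap, alpha, w)

        print("selected alpha:", best[1], "non-zeros at", np.flatnonzero(best[2]))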

  2. H∞ state estimation of generalised neural networks with interval time-varying delays

    Science.gov (United States)

    Saravanakumar, R.; Syed Ali, M.; Cao, Jinde; Huang, He

    2016-12-01

    This paper focuses on studying the H∞ state estimation of generalised neural networks with interval time-varying delays. The integral terms in the time derivative of the Lyapunov-Krasovskii functional are handled by the Jensen's inequality, reciprocally convex combination approach and a new Wirtinger-based double integral inequality. A delay-dependent criterion is derived under which the estimation error system is globally asymptotically stable with H∞ performance. The proposed conditions are represented by linear matrix inequalities. Optimal H∞ norm bounds are obtained easily by solving convex problems in terms of linear matrix inequalities. The advantage of employing the proposed inequalities is illustrated by numerical examples.

  3. Constructing Proxy Variables to Measure Adult Learners' Time Management Strategies in LMS

    Science.gov (United States)

    Jo, Il-Hyun; Kim, Dongho; Yoon, Meehyun

    2015-01-01

    This study describes the process of constructing proxy variables from recorded log data within a Learning Management System (LMS), which represents adult learners' time management strategies in an online course. Based on previous research, three variables of total login time, login frequency, and regularity of login interval were selected as…
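
    A hedged sketch of how the three proxy variables named above could be computed from raw LMS log records (the session structure and the use of the standard deviation of between-login gaps as the regularity measure are assumptions for illustration, not necessarily the authors' exact operationalization):

        from datetime import datetime, timedelta
        from statistics import pstdev

        # (login, logout) timestamps for one learner; made-up example data
        sessions = [
            (datetime(2015, 3, 2, 20, 0), datetime(2015, 3, 2, 21, 10)),
            (datetime(2015, 3, 4, 19, 30), datetime(2015, 3, 4, 20, 0)),
            (datetime(2015, 3, 7, 22, 0), datetime(2015, 3, 7, 23, 5)),
        ]

        total_login_time = sum((end - start for start, end in sessions), timedelta())
        login_frequency = len(sessions)
        gaps_hours = [(sessions[i + 1][0] - sessions[i][0]).total_seconds() / 3600
                      for i in range(len(sessions) - 1)]
        login_regularity = pstdev(gaps_hours)   # smaller spread = more regular access pattern

        print(total_login_time, login_frequency, round(login_regularity, 2))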

  4. Hospital process intervals, not EMS time intervals, are the most important predictors of rapid reperfusion in EMS Patients with ST-segment elevation myocardial infarction.

    Science.gov (United States)

    Clark, Carol Lynn; Berman, Aaron D; McHugh, Ann; Roe, Edward Jedd; Boura, Judith; Swor, Robert A

    2012-01-01

    To assess the relationship of emergency medical services (EMS) intervals and internal hospital intervals to the rapid reperfusion of patients with ST-segment elevation myocardial infarction (STEMI). We performed a secondary analysis of a prospectively collected database of STEMI patients transported to a large academic community hospital between January 1, 2004, and December 31, 2009. EMS and hospital data intervals included EMS scene time, transport time, hospital arrival to myocardial infarction (MI) team activation (D2Page), page to catheterization laboratory arrival (P2Lab), and catheterization laboratory arrival to reperfusion (L2B). We used two outcomes: EMS scene arrival to reperfusion (S2B) ≤90 minutes and hospital arrival to reperfusion (D2B) ≤90 minutes. Means and proportions are reported. Pearson chi-square and multivariate regression were used for analysis. During the study period, we included 313 EMS-transported STEMI patients with 298 (95.2%) MI team activations. Of these STEMI patients, 295 (94.2%) were taken to the cardiac catheterization laboratory and 244 (78.0%) underwent percutaneous coronary intervention (PCI). For the patients who underwent PCI, 127 (52.5%) had prehospital EMS activation, 202 (82.8%) had D2B ≤90 minutes, and 72 (39%) had S2B ≤90 minutes. In a multivariate analysis, hospital processes (EMS activation [OR 7.1, 95% CI 2.7, 18.4], Page to Lab [OR 6.7, 95% CI 2.3, 19.2], and Lab arrival to Reperfusion [OR 18.5, 95% CI 6.1, 55.6]) were the most important predictors of Scene to Balloon ≤90 minutes. EMS scene and transport intervals also had a modest association with rapid reperfusion (OR 0.85, 95% CI 0.78, 0.93 and OR 0.89, 95% CI 0.83, 0.95, respectively). In a secondary analysis, hospital processes (Door to Page [OR 44.8, 95% CI 8.6, 234.4], Page to Lab [OR 5.4, 95% CI 1.9, 15.3], and Lab arrival to Reperfusion [OR 14.6, 95% CI 2.5, 84.3]), but not EMS scene and transport intervals, were the most important predictors of D2B ≤90 minutes.

  5. Regular non-twisting S-branes

    International Nuclear Information System (INIS)

    Obregon, Octavio; Quevedo, Hernando; Ryan, Michael P.

    2004-01-01

    We construct a family of time and angular dependent, regular S-brane solutions which corresponds to a simple analytical continuation of the Zipoy-Voorhees 4-dimensional vacuum spacetime. The solutions are asymptotically flat and turn out to be free of singularities without requiring a twist in space. They can be considered as the simplest non-singular generalization of the singular S0-brane solution. We analyze the properties of a representative of this family of solutions and show that it resembles to some extent the asymptotic properties of the regular Kerr S-brane. The R-symmetry corresponds, however, to the general lorentzian symmetry. Several generalizations of this regular solution are derived which include a charged S-brane and an additional dilatonic field. (author)

  6. Is the time interval between surgery and radiotherapy important in operable nonsmall cell lung cancer? A retrospective analysis of 340 cases

    International Nuclear Information System (INIS)

    Wuerschmidt, Florian; Buenemann, Henry; Ehnert, Michael; Heilmann, Hans-Peter

    1997-01-01

    Purpose: To evaluate the influence of prognostic factors in postoperative radiotherapy of NSCLC with special emphasis on the time interval between surgery and start of radiotherapy. Methods and Materials: Between January 1976 and December 1993, 340 cases were treated and retrospectively analyzed meeting the following criteria: complete follow-up; complete staging information including pathological confirmation of resection status; maximum interval between surgery (SX) and radiotherapy (RT) of 12 weeks (median 36 days, range 18 to 84 days); minimum dose of 50 Gy (R0), and maximum dose of 70 Gy (R2). Two hundred thirty patients (68%) had N2 disease; 228 patients were completely resected (R0). One hundred six (31%) had adenocarcinoma, 172 (51%) squamous cell carcinoma. Results: In univariate analysis, Karnofsky performance status (90+ > 60-80%; p = 0.019 log rank), resection status stratified for nodal disease (R+ < R0; p = 0.046), and the time interval between SX and RT were of significant importance. Patients with a long interval (37 to 84 days) had higher 5-year survival rates (26%) and a median survival time (MST: 21.9 months, 95% C.I. 17.2 to 28.6 months) than patients with a short interval (18 to 36 days: 15%; 14.9 months, 13 to 19.9 months; p = 0.013). A further subgroup analysis revealed significant higher survival rates in patients with a long interval in N0/1 disease (p = 0.011) and incompletely resected NSCLC (p = 0.012). In multivariate analysis, the time interval had a p-value of 0.009 (nodal disease: p = 0.0083; KPI: p = 0.0037; sex: p = 0.035). Conclusion: Shortening the time interval between surgery and postoperative radiotherapy to less than 6 weeks even in R+ cases is not necessary. Survival of patients with a long interval between surgery and start of radiotherapy was better in this retrospective analysis as compared to patients with a short interval

  7. Perceptual inequality between two neighboring time intervals defined by sound markers: correspondence between neurophysiological and psychological data

    Directory of Open Access Journals (Sweden)

    Takako eMitsudo

    2014-09-01

    Brain activity related to time estimation processes in humans was analyzed using a perceptual phenomenon called auditory temporal assimilation. In a typical stimulus condition, two neighboring time intervals (T1 and T2, in this order) are perceived as equal even when the physical lengths of these time intervals are considerably different. Our previous event-related potential (ERP) study demonstrated that a slow negative component (SNCt) appears in the right-frontal brain area (around the F8 electrode) after T2, which is associated with judgment of the equality/inequality of T1 and T2. In the present study, we conducted two ERP experiments to further confirm the robustness of the SNCt. The stimulus patterns consisted of two neighboring time intervals marked by three successive tone bursts. Thirteen participants only listened to the patterns in the first session, and judged the equality/inequality of T1 and T2 in the next session. Behavioral data showed typical temporal assimilation. The ERP data revealed that three components (N1; contingent negative variation, CNV; and SNCt) emerged related to the temporal judgment. The N1 appeared in the central area, and its peak latencies corresponded to the physical timing of each marker onset. The CNV component appeared in the frontal area during T2 presentation, and its amplitude increased as a function of T1. The SNCt appeared in the right-frontal area after the presentation of T1 and T2, and its magnitude was larger for the temporal patterns causing perceptual inequality. The SNCt was also correlated with the perceptual equality/inequality of the same stimulus pattern, and continued up to about 400 ms after the end of T2. These results suggest that the SNCt can be a signature of equality/inequality judgment, which derives from the comparison of the two neighboring time intervals.

  8. Diagnostic Efficiency of MR Imaging of the Knee. Relationship to time Interval between MR and Arthroscopy

    International Nuclear Information System (INIS)

    Barrera, M. C.; Recondo, J. A.; Aperribay, M.; Gervas, C.; Fernandez, E.; Alustiza, J. M.

    2003-01-01

    To evaluate the efficiency of magnetic resonance (MR) in the diagnosis of knee lesions and how the results are influenced by the time interval between MR and arthroscopy. 248 knees studied by MR were retrospectively analyzed, as well as those which also underwent arthroscopy. Arthroscopy was considered to be the gold standard, MR diagnostic capacity was evaluated for both meniscal and cruciate ligament lesions. Sensitivity, specificity and Kappa index were calculated for the set of all knees included in the study (248), for those in which the time between MR and arthroscopy was less than or equal to three months (134) and for those in which the time between both procedures was less than or equal to one month. Sensitivity, specificity and Kappa index of the MR had global values of 96.5%, 70% and 71%, respectively. When the interval between MR and arthroscopy was less than or equal to three months, sensitivity, specificity and Kappa index were 95.5%, 75% and 72%, respectively. When it was less than or equal to one month, sensitivity was 100%, specificity was 87.5% and Kappa index was 91%. MR is an excellent tool for the diagnosis of knee lesions. Higher MR values of sensitivity, specificity and Kappa index are obtained when the time interval between both procedures is kept to a minimum. (Author) 11 refs

  9. Borderline personality disorder and regularly drinking alcohol before sex.

    Science.gov (United States)

    Thompson, Ronald G; Eaton, Nicholas R; Hu, Mei-Chen; Hasin, Deborah S

    2017-07-01

    Drinking alcohol before sex increases the likelihood of engaging in unprotected intercourse, having multiple sexual partners and becoming infected with sexually transmitted infections. Borderline personality disorder (BPD), a complex psychiatric disorder characterised by pervasive instability in emotional regulation, self-image, interpersonal relationships and impulse control, is associated with substance use disorders and sexual risk behaviours. However, no study has examined the relationship between BPD and drinking alcohol before sex in the USA. This study examined the association between BPD and regularly drinking before sex in a nationally representative adult sample. Participants were 17 491 sexually active drinkers from Wave 2 of the National Epidemiologic Survey on Alcohol and Related Conditions. Logistic regression models estimated effects of BPD diagnosis, specific borderline diagnostic criteria and BPD criterion count on the likelihood of regularly (mostly or always) drinking alcohol before sex, adjusted for controls. Borderline personality disorder diagnosis doubled the odds of regularly drinking before sex [adjusted odds ratio (AOR) = 2.26; confidence interval (CI) = 1.63, 3.14]. Of nine diagnostic criteria, impulsivity in areas that are self-damaging remained a significant predictor of regularly drinking before sex (AOR = 1.82; CI = 1.42, 2.35). The odds of regularly drinking before sex increased by 20% for each endorsed criterion (AOR = 1.20; CI = 1.14, 1.27) DISCUSSION AND CONCLUSIONS: This is the first study to examine the relationship between BPD and regularly drinking alcohol before sex in the USA. Substance misuse treatment should assess regularly drinking before sex, particularly among patients with BPD, and BPD treatment should assess risk at the intersection of impulsivity, sexual behaviour and substance use. [Thompson Jr RG, Eaton NR, Hu M-C, Hasin DS Borderline personality disorder and regularly drinking alcohol

  10. Time Interval to Initiation of Contraceptive Methods Following ...

    African Journals Online (AJOL)

    2018-01-30

    The study examined factors affecting the interval between a woman's last childbirth and the initiation of contraception.

  11. Global Regularity and Time Decay for the 2D Magnetohydrodynamic Equations with Fractional Dissipation and Partial Magnetic Diffusion

    Science.gov (United States)

    Dong, Bo-Qing; Jia, Yan; Li, Jingna; Wu, Jiahong

    2018-05-01

    This paper focuses on a system of the 2D magnetohydrodynamic (MHD) equations with the kinematic dissipation given by the fractional operator (-Δ)^α and the magnetic diffusion by partial Laplacian. We are able to show that this system with any α > 0 always possesses a unique global smooth solution when the initial data is sufficiently smooth. In addition, we make a detailed study on the large-time behavior of these smooth solutions and obtain optimal large-time decay rates. Since the magnetic diffusion is only partial here, some classical tools such as the maximal regularity property for the 2D heat operator can no longer be applied. A key observation on the structure of the MHD equations allows us to get around the difficulties due to the lack of full Laplacian magnetic diffusion. The results presented here are the sharpest on the global regularity problem for the 2D MHD equations with only partial magnetic diffusion.

  12. Impact of Vestibular Lesions on Allocentric Navigation and Interval Timing: The Role of Self-Initiated Motion in Spatial-Temporal Integration

    Czech Academy of Sciences Publication Activity Database

    Dallal, N. L.; Yin, B.; Nekovářová, Tereza; Stuchlík, Aleš; Meck, W. H.

    2015-01-01

    Roč. 3, 3-4 (2015), s. 269-305 ISSN 2213-445X R&D Projects: GA MŠk(CZ) LH14053 Institutional support: RVO:67985823 Keywords : peak-interval procedure * interval timing * radial-arm maze * magnitude representation * dorsolateral striatum * self-initiated movement * hippocampus * cerebellum * time perception * allocentric navigation Subject RIV: FH - Neurology

  13. An experimental evaluation of electrical skin conductivity changes in postmortem interval and its assessment for time of death estimation.

    Science.gov (United States)

    Cantürk, İsmail; Karabiber, Fethullah; Çelik, Safa; Şahin, M Feyzi; Yağmur, Fatih; Kara, Sadık

    2016-02-01

    In forensic medicine, estimation of the time of death (ToD) is one of the most important and challenging medico-legal problems. Despite the partial accomplishments in ToD estimations to date, the error margin of ToD estimation is still too large. In this study, electrical conductivity changes were experimentally investigated in the postmortem interval in human cases. Electrical conductivity measurements give some promising clues about the postmortem interval. A living human has a natural electrical conductivity; in the postmortem interval, intracellular fluids gradually leak out of cells. These leaked fluids combine with extra-cellular fluids in tissues and since both fluids are electrolytic, intracellular fluids help increase conductivity. Thus, the level of electrical conductivity is expected to increase with increased time after death. In this study, electrical conductivity tests were applied for six hours. The electrical conductivity of the cases exponentially increased during the tested time period, indicating a positive relationship between electrical conductivity and the postmortem interval. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Association of regular walking and body mass index on metabolic syndrome among an elderly Korean population.

    Science.gov (United States)

    Kim, Soonyoung; Kim, Dong-Il

    2018-06-01

    Aging is associated with increased body fat and lower lean body mass, which leads to increased prevalence of obesity and metabolic syndrome. This study aimed to investigate the association of regular participation in walking and body mass index (BMI) with metabolic syndrome and its 5 criteria in elderly Koreans. A total of 3554 (male = 1581, female = 1973) elderly subjects (age ≥ 65 years) who participated in the Fifth Korea National Health and Nutrition Examination Survey (KNHANES V) were analyzed in this cross-sectional study. Participation in walking activity, BMI, and metabolic syndrome and its 5 criteria (waist circumference [WC], systolic blood pressure [SBP], diastolic blood pressure [DBP], fasting glucose [FG], triglyceride [TG], and high-density lipoprotein cholesterol [HDLC] levels) were measured. Subjects were categorized into four groups based on the duration and regularity of their walks and BMI. Compared with the regular walking (≥30 min of continuous walking a day, on ≥5 days a week) and normal weight (BMI within the normal range) group, the odds of metabolic syndrome were 4.36 times higher (odds ratio [OR]: 4.36, 95% confidence interval [CI]: 3.37-5.63) in the non-regular walking and overweight group, after controlling for the influence of age, sex, and smoking status. Moreover, BMI (β = 0.328, R² = 0.152) was a stronger contributing factor to metabolic syndrome than regular walking (β = -0.011). In conclusion, regular participation in walking activity and weight control may reduce the incidence of metabolic syndrome in elderly Koreans, with weight management being the greater influence of the two. Copyright © 2018. Published by Elsevier Inc.

  15. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
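
    A minimal numeric illustration of the kind of statistics the report discusses (the four interval measurements are made-up values): when each measurement is known only as an interval [lo, hi], the sample mean is itself an interval, obtained from the endpoint means.

        import numpy as np

        data = np.array([[1.0, 1.4],      # each row is one measurement, known only as [lo, hi]
                         [2.1, 2.3],
                         [0.8, 1.5],
                         [1.9, 2.6]])
        mean_lo, mean_hi = data[:, 0].mean(), data[:, 1].mean()
        print(f"sample mean lies in [{mean_lo:.3f}, {mean_hi:.3f}]")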

  16. Centered Differential Waveform Inversion with Minimum Support Regularization

    KAUST Repository

    Kazei, Vladimir

    2017-05-26

    Time-lapse full-waveform inversion has two major challenges. The first is the reconstruction of a reference model (the baseline model for most approaches). The second is inversion for the time-lapse changes in the parameters. The common-model approach uses the information contained in all available data sets to build a better reference model for time-lapse inversion. Differential (double-difference) waveform inversion reduces the artifacts introduced into estimates of time-lapse parameter changes by imperfect inversion for the baseline-reference model. We propose centered differential waveform inversion (CDWI), which combines these two approaches in order to benefit from both of their features. We apply minimum support regularization, commonly used with electromagnetic methods of geophysical exploration. We test the CDWI method on a synthetic dataset with random noise and show that, with minimum support regularization, it provides better resolution of velocity changes than total variation and Tikhonov regularizations in time-lapse full-waveform inversion.

  17. Trajectories of problem video gaming among adult regular gamers: an 18-month longitudinal study.

    Science.gov (United States)

    King, Daniel L; Delfabbro, Paul H; Griffiths, Mark D

    2013-01-01

    A three-wave, longitudinal study examined the long-term trajectory of problem gaming symptoms among adult regular video gamers. Potential changes in problem gaming status were assessed at two intervals using an online survey over an 18-month period. Participants (N=117) were recruited by an advertisement posted on the public forums of multiple Australian video game-related websites. Inclusion criteria were being of adult age and having a video gaming history of at least 1 hour of gaming every week over the past 3 months. Two groups of adult video gamers were identified: those players who did (N=37) and those who did not (N=80) identify as having a serious gaming problem at the initial survey intake. The results showed that regular gamers who self-identified as having a video gaming problem at baseline reported more severe problem gaming symptoms than normal gamers, at all time points. However, both groups experienced a significant decline in problem gaming symptoms over an 18-month period, controlling for age, video gaming activity, and psychopathological symptoms.

  18. Impulsive sounds change European seabass swimming patterns: Influence of pulse repetition interval

    International Nuclear Information System (INIS)

    Neo, Y.Y.; Ufkes, E.; Kastelein, R.A.; Winter, H.V.; Cate, C. ten; Slabbekoorn, H.

    2015-01-01

    Highlights: • We exposed impulsive sounds of different repetition intervals to European seabass. • Immediate behavioural changes mirrored previous indoor & outdoor studies. • Repetition intervals influenced the impacts differentially but not the recovery. • Sound temporal patterns may be more important than some standard metrics. - Abstract: Seismic shootings and offshore pile-driving are regularly performed, emitting significant amounts of noise that may negatively affect fish behaviour. The pulse repetition interval (PRI) of these impulsive sounds may vary considerably and influence the behavioural impact and recovery. Here, we tested the effect of four PRIs (0.5–4.0 s) on European seabass swimming patterns in an outdoor basin. At the onset of the sound exposures, the fish swam faster and dived deeper in tighter shoals. PRI affected the immediate and delayed behavioural changes but not the recovery time. Our study highlights that (1) the behavioural changes of captive European seabass were consistent with previous indoor and outdoor studies; (2) PRI could influence behavioural impact differentially, which may have management implications; (3) some acoustic metrics, e.g. SEL cum , may have limited predictive power to assess the strength of behavioural impacts of noise. Noise impact assessments need to consider the contribution of sound temporal structure

  19. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    Science.gov (United States)

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.

  20. An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files

    Directory of Open Access Journals (Sweden)

    Anthony Chan

    2008-01-01

    A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
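
    This is not the paper's hierarchical format itself, but a minimal sketch of the access pattern it targets (event list and query window are made up): with time-sorted events, any time window can be extracted by binary search in O(log n + k) time, i.e., in time roughly proportional to the number of events in the window rather than to the size of the trace.

        import bisect

        events = [(0.10, "send"), (0.35, "recv"), (0.40, "compute"),
                  (1.20, "send"), (2.05, "recv"), (7.80, "compute")]   # sorted by timestamp (s)
        times = [t for t, _ in events]

        def window(t0, t1):
            """Return all events with t0 <= timestamp <= t1."""
            lo = bisect.bisect_left(times, t0)
            hi = bisect.bisect_right(times, t1)
            return events[lo:hi]

        print(window(0.3, 1.5))   # the three events between 0.3 s and 1.5 s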

  1. Poststimulation time interval-dependent effects of motor cortex anodal tDCS on reaction-time task performance.

    Science.gov (United States)

    Molero-Chamizo, Andrés; Alameda Bailén, José R; Garrido Béjar, Tamara; García López, Macarena; Jaén Rodríguez, Inmaculada; Gutiérrez Lérida, Carolina; Pérez Panal, Silvia; González Ángel, Gloria; Lemus Corchero, Laura; Ruiz Vega, María J; Nitsche, Michael A; Rivera-Urbina, Guadalupe N

    2018-02-01

    Anodal transcranial direct current stimulation (tDCS) induces long-term potentiation-like plasticity, which is associated with long-lasting effects on different cognitive, emotional, and motor performances. Specifically, tDCS applied over the motor cortex is considered to improve reaction time in simple and complex tasks. The timing of tDCS relative to task performance could determine the efficacy of tDCS to modulate performance. The aim of this study was to compare the effects of a single session of anodal tDCS (1.5 mA, for 15 min) applied over the left primary motor cortex (M1) versus sham stimulation on performance of a go/no-go simple reaction-time task carried out at three different time points after tDCS-namely, 0, 30, or 60 min after stimulation. Performance zero min after anodal tDCS was improved during the whole course of the task. Performance 30 min after anodal tDCS was improved only in the last block of the reaction-time task. Performance 60 min after anodal tDCS was not significantly different throughout the entire task. These findings suggest that the motor cortex excitability changes induced by tDCS can improve motor responses, and these effects critically depend on the time interval between stimulation and task performance.

  2. Spiking Regularity and Coherence in Complex Hodgkin–Huxley Neuron Networks

    International Nuclear Information System (INIS)

    Zhi-Qiang, Sun; Ping, Xie; Wei, Li; Peng-Ye, Wang

    2010-01-01

    We study the effects of the strength of coupling between neurons on the spiking regularity and coherence in a complex network with randomly connected Hodgkin–Huxley neurons driven by colored noise. It is found that for the given topology realization and colored noise correlation time, there exists an optimal strength of coupling, at which the spiking regularity of the network reaches the best level. Moreover, when the temporal regularity reaches the best level, the spatial coherence of the system has already increased to a relatively high level. In addition, for the given number of neurons and noise correlation time, the values of average regularity and spatial coherence at the optimal strength of coupling are nearly independent of the topology realization. Furthermore, there exists an optimal value of colored noise correlation time at which the spiking regularity can reach its best level. These results may be helpful for understanding of the real neuron world. (cross-disciplinary physics and related areas of science and technology)

  3. Improving Delay-Range-Dependent Stability Condition for Systems with Interval Time-Varying Delay

    Directory of Open Access Journals (Sweden)

    Wei Qian

    2013-01-01

    This paper discusses delay-range-dependent stability for systems with interval time-varying delay. By defining a new Lyapunov-Krasovskii functional (LKF) and estimating its derivative through the introduction of new vectors, free matrices, and a reciprocally convex approach, new delay-range-dependent stability conditions are obtained. Two well-known examples are given to illustrate the reduced conservatism of the proposed theoretical results.

  4. Bypassing the Limits of Ll Regularization: Convex Sparse Signal Processing Using Non-Convex Regularization

    Science.gov (United States)

    Parekh, Ankit

    Sparsity has become the basis of some important signal processing methods over the last ten years. Many signal processing problems (e.g., denoising, deconvolution, non-linear component analysis) can be expressed as inverse problems. Sparsity is invoked through the formulation of an inverse problem with suitably designed regularization terms. The regularization terms alone encode sparsity into the problem formulation. Often, the ℓ1 norm is used to induce sparsity, so much so that ℓ1 regularization is considered to be `modern least-squares'. The use of ℓ1 norm, as a sparsity-inducing regularizer, leads to a convex optimization problem, which has several benefits: the absence of extraneous local minima, well developed theory of globally convergent algorithms, even for large-scale problems. Convex regularization via the ℓ1 norm, however, tends to under-estimate the non-zero values of sparse signals. In order to estimate the non-zero values more accurately, non-convex regularization is often favored over convex regularization. However, non-convex regularization generally leads to non-convex optimization, which suffers from numerous issues: convergence may be guaranteed to only a stationary point, problem specific parameters may be difficult to set, and the solution is sensitive to the initialization of the algorithm. The first part of this thesis is aimed toward combining the benefits of non-convex regularization and convex optimization to estimate sparse signals more effectively. To this end, we propose to use parameterized non-convex regularizers with designated non-convexity and provide a range for the non-convex parameter so as to ensure that the objective function is strictly convex. By ensuring convexity of the objective function (sum of data-fidelity and non-convex regularizer), we can make use of a wide variety of convex optimization algorithms to obtain the unique global minimum reliably. The second part of this thesis proposes a non-linear signal
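
    A scalar numerical illustration of the convexity-preserving idea summarized above, using the minimax-concave penalty (MCP) as the parameterized non-convex regularizer (the choice of MCP and all parameter values are assumptions for this sketch, not the thesis' own construction): the total cost 0.5*(y - x)^2 + MCP(x) remains convex as long as the non-convexity parameter gamma is at least 1.

        import numpy as np

        def mcp(x, lam, gamma):
            """Minimax-concave penalty with threshold lam and non-convexity parameter gamma."""
            ax = np.abs(x)
            return np.where(ax <= gamma * lam,
                            lam * ax - ax**2 / (2 * gamma),
                            0.5 * gamma * lam**2)

        def looks_convex(f, xs):
            """Check that second differences on a grid are non-negative."""
            fx = f(xs)
            return bool(np.all(fx[:-2] - 2 * fx[1:-1] + fx[2:] >= -1e-9))

        y, lam = 2.0, 1.0
        xs = np.linspace(-5, 5, 2001)
        for gamma in (0.5, 1.0, 2.0):
            cost = lambda x, g=gamma: 0.5 * (y - x) ** 2 + mcp(x, lam, g)
            print(f"gamma = {gamma}: total cost convex? {looks_convex(cost, xs)}")
        # Expected: False for gamma = 0.5 (too non-convex), True for gamma >= 1.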

  5. Association between prehospital time interval and short-term outcome in acute heart failure patients.

    Science.gov (United States)

    Takahashi, Masashi; Kohsaka, Shun; Miyata, Hiroaki; Yoshikawa, Tsutomu; Takagi, Atsutoshi; Harada, Kazumasa; Miyamoto, Takamichi; Sakai, Tetsuo; Nagao, Ken; Sato, Naoki; Takayama, Morimasa

    2011-09-01

    Acute heart failure (AHF) is one of the most frequently encountered cardiovascular conditions that can seriously affect the patient's prognosis. However, the importance of early triage and treatment initiation in the setting of AHF has not been recognized. The Tokyo Cardiac Care Unit Network Database prospectively collected information of emergency admissions to acute cardiac care facilities in 2005-2007 from 67 participating hospitals in the Tokyo metropolitan area. We analyzed records of 1,218 AHF patients transported to medical centers via emergency medical services (EMS). AHF was defined as rapid onset or change in the signs and symptoms of heart failure, resulting in the need for urgent therapy. Patients with acute coronary syndrome were excluded from this analysis. Logistic regression analysis was performed to calculate the risk-adjusted in-hospital mortality. A majority of the patients were elderly (76.1 ± 11.5 years old) and male (54.1%). The overall in-hospital mortality rate was 6.0%. The median time interval between symptom onset and EMS arrival (response time) was 64 minutes (interquartile range [IQR] 26-205 minutes), and that between EMS arrival and ER arrival (transportation time) was 27 minutes (IQR 9-78 minutes). The risk-adjusted mortality increased with transportation time, but did not correlate with the response time. Those who took >45 minutes to arrive at the medical centers were at a higher risk for in-hospital mortality (odds ratio 2.24, 95% confidence interval 1.17-4.31; P = .015). Transportation time correlated with risk-adjusted mortality, and steps should be taken to reduce the EMS transfer time to improve the outcome in AHF patients. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Time Interval to Initiation of Contraceptive Methods Following ...

    African Journals Online (AJOL)

    Objectives: The objectives of the study were to determine factors affecting the interval between a woman's last childbirth and the initiation of contraception. Materials and Methods: This was a retrospective study. Family planning clinic records of the Barau Dikko Teaching Hospital Kaduna from January 2000 to March 2014 ...

  7. On entire functions restricted to intervals, partition of unities, and dual Gabor frames

    DEFF Research Database (Denmark)

    Christensen, Ole; Kim, Hong Oh; Kim, Rae Young

    2014-01-01

    Partition of unities appears in many places in analysis. Typically it is generated by compactly supported functions with a certain regularity. In this paper we consider partition of unities obtained as integer-translates of entire functions restricted to finite intervals. We characterize the enti...

  8. The Impact of Computerization on Regular Employment (Japanese)

    OpenAIRE

    SUNADA Mitsuru; HIGUCHI Yoshio; ABE Masahiro

    2004-01-01

    This paper uses micro data from the Basic Survey of Japanese Business Structure and Activity to analyze the effects of companies' introduction of information and telecommunications technology on employment structures, especially regular versus non-regular employment. Firstly, examination of trends in the ratio of part-time workers recorded in the Basic Survey shows that part-time worker ratios in manufacturing firms are rising slightly, but that companies with a high proportion of part-timers...

  9. Model for the respiratory modulation of the heart beat-to-beat time interval series

    Science.gov (United States)

    Capurro, Alberto; Diambra, Luis; Malta, C. P.

    2005-09-01

    In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation with the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embeddings of the pre-meditation and control cases have a roughly circular shape, the embedding acquires a polygonal shape during meditation: triangular for the Kundalini Yoga data and quadrangular in the case of the Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of the Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.
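
    The two respiratory drives described (a smoothed square wave for the Chi data, a smoothed triangular wave with a longer rising branch for the Kundalini Yoga data) can be sketched as follows; the period, smoothing window and rise fraction are illustrative values, not the parameters fitted in the paper.

    ```python
    import numpy as np

    def respiratory_drive(shape, period=6.0, fs=100.0, duration=60.0,
                          smooth_s=0.8, rise_frac=0.7):
        """Periodic respiratory-like modulation: a square wave, or an asymmetric
        triangular wave whose rising branch lasts longer than its falling branch,
        smoothed with a simple moving average."""
        t = np.arange(0.0, duration, 1.0 / fs)
        phase = (t % period) / period
        if shape == "square":
            raw = np.where(phase < 0.5, 1.0, -1.0)
        elif shape == "triangle":
            raw = np.where(phase < rise_frac,
                           phase / rise_frac,                              # slow rise
                           1.0 - (phase - rise_frac) / (1.0 - rise_frac))  # fast fall
        else:
            raise ValueError("shape must be 'square' or 'triangle'")
        window = max(int(smooth_s * fs), 1)
        return t, np.convolve(raw, np.ones(window) / window, mode="same")

    t, chi_like = respiratory_drive("square")      # Chi-style drive
    t, yoga_like = respiratory_drive("triangle")   # Kundalini-Yoga-style drive
    ```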

  10. Fast and compact regular expression matching

    DEFF Research Database (Denmark)

    Bille, Philip; Farach-Colton, Martin

    2008-01-01

    We study 4 problems in string matching, namely, regular expression matching, approximate regular expression matching, string edit distance, and subsequence indexing, on a standard word RAM model of computation that allows logarithmic-sized words to be manipulated in constant time. We show how...... to improve the space and/or remove a dependency on the alphabet size for each problem using either an improved tabulation technique of an existing algorithm or by combining known algorithms in a new way....

  11. Time interval between infective endocarditis first symptoms and diagnosis: relationship to infective endocarditis characteristics, microorganisms and prognosis.

    Science.gov (United States)

    N'Guyen, Yohan; Duval, Xavier; Revest, Matthieu; Saada, Matthieu; Erpelding, Marie-Line; Selton-Suty, Christine; Bouchiat, Coralie; Delahaye, François; Chirouze, Catherine; Alla, François; Strady, Christophe; Hoen, Bruno

    2017-03-01

    To analyze the characteristics and outcome of infective endocarditis (IE) according to the time interval between IE first symptoms and diagnosis. Among the IE cases of a French population-based epidemiological survey, patients having early-diagnosed IE (diagnosis of IE within 1 month of first symptoms) were compared with those having late-diagnosed IE (diagnosis of IE more than 1 month after first symptoms). Among the 486 definite IE, 124 (25%) had late-diagnosed IE whereas the others had early-diagnosed IE. Early-diagnosed IE was independently associated with female gender (OR = 1.8; 95% CI [1.0-3.0]), prosthetic valve (OR = 2.6; 95% CI [1.4-5.0]) and staphylococci as causative pathogen (OR = 3.7; 95% CI [2.2-6.2]). Cardiac surgery theoretical indication rates were not different between early- and late-diagnosed IE (56.3% vs 58.9%), whereas valve surgery performance was lower in early-diagnosed IE (41% vs 53%; p = .03). In-hospital mortality rates were higher in early-diagnosed IE than in late-diagnosed IE (25.1% vs 16.1%). Cases in which the time interval between first symptoms and diagnosis was less than one month were mainly due to Staphylococcus aureus in France. Staphylococcus aureus infective endocarditis was associated with septic shock, transient ischemic attack or stroke, and higher mortality rates than infective endocarditis due to other bacteria or infective endocarditis in which the time interval between first symptoms and diagnosis was more than one month. Infective endocarditis diagnosed more than one month after first symptoms accounted for one quarter of all infective endocarditis in our study and was associated with vertebral osteomyelitis and a higher rate of cardiac surgery performed for hemodynamic indication than other infective endocarditis.

  12. A NOVEL APPROACH TO ARRHYTHMIA CLASSIFICATION USING RR INTERVAL AND TEAGER ENERGY

    Directory of Open Access Journals (Sweden)

    CHANDRAKAR KAMATH

    2012-12-01

    Full Text Available It is hypothesized that a key characteristic of the electrocardiogram (ECG) signal is its nonlinear dynamic behaviour and that the nonlinear component changes more significantly between normal and arrhythmia conditions than the linear component. The usual statistical descriptors used in RR (R to R) interval analysis do not capture the nonlinear disposition of RR interval variability. In this paper we explore a novel approach to extract features from the nonlinear component of the RR interval signal using the Teager energy operator (TEO). The key feature of Teager energy is that it models the energy of the source that generated the signal rather than the energy of the signal itself. Hence any deviations in the regular rhythmic activity of the heart get reflected in the Teager energy function. The classification, evaluated on the MIT-BIH database with the RR interval and the mean of the Teager energy computed over the RR interval as features, exhibits an average accuracy that exceeds 99.79%.
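
    The discrete Teager energy operator referred to above is Ψ[x(n)] = x(n)² − x(n−1)·x(n+1); a small sketch of computing it over an RR-interval series (the RR values and the two-feature vector below are illustrative, not the paper's exact pipeline):

    ```python
    import numpy as np

    def teager_energy(x):
        """Discrete Teager energy operator: psi[n] = x[n]**2 - x[n-1] * x[n+1]."""
        x = np.asarray(x, dtype=float)
        return x[1:-1] ** 2 - x[:-2] * x[2:]

    rr = np.array([0.82, 0.80, 0.81, 0.79, 1.10, 0.55, 0.83, 0.81])  # RR intervals in s (made up)
    # RR mean plus mean Teager energy, the kind of feature pair the paper describes
    features = np.array([rr.mean(), teager_energy(rr).mean()])
    print(features)
    ```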

  13. Effects of varied doses of psilocybin on time interval reproduction in human subjects.

    Science.gov (United States)

    Wackermann, Jirí; Wittmann, Marc; Hasler, Felix; Vollenweider, Franz X

    2008-04-11

    Action of a hallucinogenic substance, psilocybin, on internal time representation was investigated in two double-blind, placebo-controlled studies: Experiment 1 with 12 subjects and graded doses, and Experiment 2 with 9 subjects and a very low dose. The task consisted in repeated reproductions of time intervals in the range from 1.5 to 5s. The effects were assessed by parameter kappa of the 'dual klepsydra' model of internal time representation, fitted to individual response data and intra-individually normalized with respect to initial values. The estimates kappa were in the same order of magnitude as in earlier studies. In both experiments, kappa was significantly increased by psilocybin at 90 min from the drug intake, indicating a higher loss rate of the internal duration representation. These findings are tentatively linked to qualitative alterations of subjective time in altered states of consciousness.

  14. Coupling regularizes individual units in noisy populations

    International Nuclear Information System (INIS)

    Ly Cheng; Ermentrout, G. Bard

    2010-01-01

    The regularity of a noisy system can modulate in various ways. It is well known that coupling in a population can lower the variability of the entire network; the collective activity is more regular. Here, we show that diffusive (reciprocal) coupling of two simple Ornstein-Uhlenbeck (O-U) processes can regularize the individual, even when it is coupled to a noisier process. In cellular networks, the regularity of individual cells is important when a select few play a significant role. The regularizing effect of coupling surprisingly applies also to general nonlinear noisy oscillators. However, unlike with the O-U process, coupling-induced regularity is robust to different kinds of coupling. With two coupled noisy oscillators, we derive an asymptotic formula assuming weak noise and coupling for the variance of the period (i.e., spike times) that accurately captures this effect. Moreover, we find that reciprocal coupling can regularize the individual period of higher dimensional oscillators such as the Morris-Lecar and Brusselator models, even when coupled to noisier oscillators. Coupling can have a counterintuitive and beneficial effect on noisy systems. These results have implications for the role of connectivity with noisy oscillators and the modulation of variability of individual oscillators.
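
    A minimal Euler-Maruyama sketch of the basic effect for two identically noisy O-U units is given below (the stronger result of the paper, regularization even when the partner is noisier, depends on their specific analysis and is not reproduced here); parameter values are arbitrary.

    ```python
    import numpy as np

    def ou_unit_variance(k, sigma=1.0, tau=1.0, dt=1e-3, n_steps=200_000, seed=0):
        """Euler-Maruyama simulation of two diffusively coupled Ornstein-Uhlenbeck
        units with equal noise; returns the sample variance of unit 1."""
        rng = np.random.default_rng(seed)
        noise = sigma * np.sqrt(dt) * rng.standard_normal((n_steps, 2))
        x1 = x2 = 0.0
        xs = np.empty(n_steps)
        for i in range(n_steps):
            dx1 = (-x1 / tau + k * (x2 - x1)) * dt + noise[i, 0]
            dx2 = (-x2 / tau + k * (x1 - x2)) * dt + noise[i, 1]
            x1, x2 = x1 + dx1, x2 + dx2
            xs[i] = x1
        return xs.var()

    print("uncoupled variance:", ou_unit_variance(k=0.0))  # ~ sigma**2 * tau / 2
    print("coupled variance:  ", ou_unit_variance(k=2.0))  # smaller: coupling regularizes
    ```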

  15. Learning regularization parameters for general-form Tikhonov

    International Nuclear Information System (INIS)

    Chung, Julianne; Español, Malena I

    2017-01-01

    Computing regularization parameters for general-form Tikhonov regularization can be an expensive and difficult task, especially if multiple parameters or many solutions need to be computed in real time. In this work, we assume training data is available and describe an efficient learning approach for computing regularization parameters that can be used for a large set of problems. We consider an empirical Bayes risk minimization framework for finding regularization parameters that minimize average errors for the training data. We first extend methods from Chung et al (2011 SIAM J. Sci. Comput. 33 3132–52) to the general-form Tikhonov problem. Then we develop a learning approach for multi-parameter Tikhonov problems, for the case where all involved matrices are simultaneously diagonalizable. For problems where this is not the case, we describe an approach to compute near-optimal regularization parameters by using operator approximations for the original problem. Finally, we propose a new class of regularizing filters, where solutions correspond to multi-parameter Tikhonov solutions, that requires less data than previously proposed optimal error filters, avoids the generalized SVD, and allows flexibility and novelty in the choice of regularization matrices. Numerical results for 1D and 2D examples using different norms on the errors show the effectiveness of our methods. (paper)
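
    The learning idea can be sketched on a toy problem: choose the regularization parameter that minimizes the average reconstruction error over training pairs with known ground truth. The operator, regularization matrix, sizes and noise level below are illustrative assumptions, not the paper's test cases.

    ```python
    import numpy as np

    def tikhonov(A, b, L, lam):
        """General-form Tikhonov solution: argmin ||A x - b||^2 + lam**2 * ||L x||^2."""
        return np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ b)

    def learn_lambda(A, L, training_pairs, grid):
        """Pick the lambda that minimizes the average squared reconstruction error
        over training pairs (x_true, b) -- the empirical risk idea, on a plain grid."""
        avg_err = [np.mean([np.linalg.norm(tikhonov(A, b, L, lam) - x_true) ** 2
                            for x_true, b in training_pairs]) for lam in grid]
        return grid[int(np.argmin(avg_err))]

    # Toy 1-D deblurring setup (all choices here are illustrative assumptions).
    n = 64
    idx = np.arange(n)
    A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)   # Gaussian blur
    L = np.eye(n) - np.eye(n, k=1)                                   # first differences
    rng = np.random.default_rng(1)
    training_pairs = []
    for _ in range(5):
        x_true = np.cumsum(rng.standard_normal(n)) * 0.1             # smooth-ish truth
        b = A @ x_true + 0.01 * rng.standard_normal(n)
        training_pairs.append((x_true, b))
    print("learned lambda:", learn_lambda(A, L, training_pairs, np.logspace(-4, 1, 30)))
    ```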

  16. Postprandial oxidative losses of dietary leucine depend on the time interval between consecutive meals

    NARCIS (Netherlands)

    Myszkowska-Ryciak, J.; Keller, J.S.; Bujko, J.; Stankiewicz-Ciupa, J.; Koopmanschap, R.E.; Schreurs, V.V.A.M.

    2015-01-01

    Postprandial oxidative losses of egg white-bound [1-13C]-leucine were studied as 13C recovery in the breath of rats in relation to different time intervals between two meals. Male Wistar rats (n = 48; 68.3 ±5.9 g) divided into 4 groups (n = 12) were fed two meals a day (9:00

  17. Effect of a data buffer on the recorded distribution of time intervals for random events

    Energy Technology Data Exchange (ETDEWEB)

    Barton, J C [Polytechnic of North London (UK)

    1976-03-15

    The use of a data buffer enables the distribution of the time intervals between events to be studied for times less than the recording system dead-time but the usual negative exponential distribution for random events has to be modified. The theory for this effect is developed for an n-stage buffer followed by an asynchronous recorder. Results are evaluated for the values of n from 1 to 5. In the language of queueing theory the system studied is of type M/D/1/n+1, i.e. with constant service time and a finite number of places.
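
    A rough Monte-Carlo sketch of such a system (a Poisson source feeding an n-stage buffer and a recorder with constant dead time, i.e. an M/D/1/n+1 queue in the paper's terms) is given below; the rate, dead time and buffer depth are arbitrary illustrative values. Accepted events keep their arrival timestamps, so intervals shorter than the dead time appear in the recorded distribution, while losses when the system is full distort it away from the pure negative exponential.

    ```python
    import numpy as np

    def accepted_intervals(rate=1.0, dead_time=0.5, n_stages=3,
                           n_events=200_000, seed=0):
        """Poisson events feed an n-stage buffer in front of a recorder with a
        constant dead time (system capacity n_stages + 1, FIFO, constant service).
        Events arriving while the system is full are lost; accepted events keep
        their arrival timestamps, and the intervals between them are returned."""
        rng = np.random.default_rng(seed)
        arrivals = np.cumsum(rng.exponential(1.0 / rate, n_events))
        capacity = n_stages + 1
        departures = []          # scheduled departure times of events in the system
        last_departure = 0.0
        accepted = []
        for t in arrivals:
            departures = [d for d in departures if d > t]   # already recorded
            if len(departures) < capacity:
                accepted.append(t)
                last_departure = max(t, last_departure) + dead_time
                departures.append(last_departure)
        return np.diff(accepted)

    iv = accepted_intervals()
    print("fraction of recorded intervals shorter than the dead time:",
          np.mean(iv < 0.5))
    ```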

  18. Regular character of chromatin degradation in lymphoid tissues after treatment with biological alkylating agents in vivo

    International Nuclear Information System (INIS)

    Matyasova, J.; Skalka, M.; Cejkova, M.

    1979-01-01

    The chromatin changes are reevaluated occurring in lymphoid tissues of mice treated with alkylating agents of the nitrogen-mustard type in relation to recent evidence on the nucleosomal organization of chromatin and to our new data on the regular character of chromatin degradation in lymphoid tissues of irradiated mice. DNA was isolated from nuclei at various intervals (1 to 18 h) after treatment of mice and subjected to gel electrophoresis in polyacrylamide gels. Thymus chromatin from treated mice has been shown to degrade in a regular fashion and to yield discrete DNA fragments, resembling those that originate in lymphoid tissues of irradiated mice or in thymus nuclei digested with micrococcal nuclease in vitro. With increasing interval after treatment higher amounts of smaller DNA fragments appear. Chromatin in spleen cells responds to treatment in a similar way, whilst no degradation in vivo takes place in liver chromatin. Chromatin of LS/BL lymphosarcoma cells in mice treated with alkylating agents or with irradiation suffers from a similar regular degradation. The results stress the significance of the action of liberated or activated endogenous nuclease(s) in the development of chromatin damage in lymphoid cells after treatment with alkylating agents. (author)

  19. Effect of insertion method and postinsertion time interval prior to force application on the removal torque of orthodontic miniscrews.

    Science.gov (United States)

    Sharifi, Maryam; Ghassemi, Amirreza; Bayani, Shahin

    2015-01-01

    Success of orthodontic miniscrews in providing stable anchorage is dependent on their stability. The purpose of this study was to assess the effect of insertion method and postinsertion time interval on the removal torque of miniscrews as an indicator of their stability. Seventy-two miniscrews (Jeil Medical) were inserted into the femoral bones of three male German Shepherd dogs and assigned to nine groups of eight miniscrews. Three insertion methods, including hand-driven, motor-driven with 5.0-Ncm insertion torque, and motor-driven with 20.0-Ncm insertion torque, were tested. Three time intervals of 0, 2, and 6 weeks between miniscrew insertion and removal were tested as well. Removal torque values were measured in newton centimeters by a removal torque tester (IMADA). Data were analyzed by one-way analysis of variance (ANOVA) followed by the Bonferroni post hoc test at a .05 level of significance. A miniscrew survival rate of 93% was observed in this study. The highest mean value of removal torque among the three postinsertion intervals (2.4 ± 0.59 Ncm) was obtained immediately after miniscrew insertion, with a statistically significant difference from the other two time intervals. The highest removal torque values were thus obtained immediately after insertion.

  20. Evaluation of the Trail Making Test and interval timing as measures of cognition in healthy adults: comparisons by age, education, and gender.

    Science.gov (United States)

    Płotek, Włodzimierz; Łyskawa, Wojciech; Kluzik, Anna; Grześkowiak, Małgorzata; Podlewski, Roland; Żaba, Zbigniew; Drobnik, Leon

    2014-02-03

    Human cognitive functioning can be assessed using different methods of testing. Age, level of education, and gender may influence the results of cognitive tests. The well-known Trail Making Test (TMT), which is often used to measure the frontal lobe function, and the experimental test of Interval Timing (IT) were compared. The methods used in IT included reproduction of auditory and visual stimuli, with the subsequent production of the time intervals of 1-, 2-, 5-, and 7-seconds durations with no pattern. Subjects included 64 healthy adult volunteers aged 18-63 (33 women, 31 men). Comparisons were made based on age, education, and gender. TMT was performed quickly and was influenced by age, education, and gender. All reproduced visual and produced intervals were shortened and the reproduction of auditory stimuli was more complex. Age, education, and gender have more pronounced impact on the cognitive test than on the interval timing test. The reproduction of the short auditory stimuli was more accurate in comparison to other modalities used in the IT test. The interval timing, when compared to the TMT, offers an interesting possibility of testing. Further studies are necessary to confirm the initial observation.

  1. Measurement of the ecological flow of the Acaponeta river, Nayarit, comparing different time intervals

    Directory of Open Access Journals (Sweden)

    Guadalupe de la Lanza Espino

    2012-07-01

    Full Text Available The management of river water in Mexico has been diverse and unequal due to different anthropogenic activities, and it is associated with inter-annual changes in climate and runoff patterns, leading to a loss of ecosystem integrity. However, there are nowadays different methods to assess the water volume that is necessary to conserve the environment, among which are hydrological methods, such as those applied here, that are based on information on water volumes recorded over decades, which is not always available in the country. For this reason, this study compares runoff records for different time ranges: a minimum of 10 years, a medium of 20 years, and more than 50 years, to quantify the environmental flow. These time intervals provided similar results, which means that not only for the Acaponeta river, but possibly for other lotic systems as well, a 10-year interval may be used satisfactorily. In this river, the runoff water that must be kept for environmental purposes is: for 10 years 70.1%, for 20 years 78.1% and for >50 years 68.8%, with an average of 72.3% of the total water volume or of the average annual runoff.

  2. A comparison between brachial and echocardiographic systolic time intervals.

    Directory of Open Access Journals (Sweden)

    Ho-Ming Su

    Full Text Available Systolic time interval (STI) is an established noninvasive technique for the assessment of cardiac function. Brachial STIs can be automatically determined by an ankle-brachial index (ABI)-form device. The aims of this study are to evaluate whether the STIs measured from an ABI-form device can represent those measured from echocardiography and to compare the diagnostic values of brachial and echocardiographic STIs in the prediction of left ventricular ejection fraction (LVEF) <50%. A total of 849 patients were included in the study. Brachial pre-ejection period (bPEP) and brachial ejection time (bET) were measured using an ABI-form device, and pre-ejection period (PEP) and ejection time (ET) were measured from echocardiography. Agreement was assessed by correlation coefficient and Bland-Altman plot. Brachial STIs had a significant correlation with echocardiographic STIs (r = 0.644, P<0.001 for bPEP and PEP; r = 0.850, P<0.001 for bET and ET; r = 0.708, P<0.001 for bPEP/bET and PEP/ET). The disagreement between brachial and echocardiographic STIs (brachial STIs minus echocardiographic STIs) was 28.55 ms for bPEP and PEP, -4.15 ms for bET and ET, and -0.11 for bPEP/bET and PEP/ET. The areas under the curve for bPEP/bET and PEP/ET in the prediction of LVEF <50% were 0.771 and 0.765, respectively. Brachial STIs were good alternatives to STIs obtained from echocardiography and were also helpful in the prediction of LVEF <50%. Brachial STIs automatically obtained from an ABI-form device may be helpful for the evaluation of left ventricular systolic dysfunction.

  3. Time interval between stroke onset and hospital arrival in acute ischemic stroke patients in Shanghai, China.

    Science.gov (United States)

    Fang, Jing; Yan, Weihong; Jiang, Guo-Xin; Li, Wei; Cheng, Qi

    2011-02-01

    To observe the time interval between stroke onset and hospital arrival (time-to-hospital) in acute ischemic stroke patients and analyze its putatively associated factors. During the period from November 1, 2006 to August 31, 2008, patients with acute ischemic stroke admitted consecutively to the Department of Neurology, Ninth Hospital, Shanghai, were enrolled in the study. Information on the patients was registered, including the time-to-hospital, demographic data, history of stroke, season at attack, neurological symptoms at onset, etc. Characteristics of the patients were analyzed and logistic regression analyses were conducted to identify factors associated with the time-to-hospital. There were 536 patients in the study, 290 (54.1%) males and 246 (45.9%) females. The median time-to-hospital was 8 h (range 0.1 to 300 h) for all patients. Within 3 h after the onset of stroke, 162 patients (30.2%) arrived at our hospital; and within 6 h, 278 patients (51.9%). Patients with a history of stroke, unconsciousness at onset, or a high NIHSS score at admission had a significantly shorter time-to-hospital. With respect to the time interval between stroke onset and hospital arrival, awareness among patients and their relatives of the importance of seeking immediate medical help after stroke onset could significantly influence their actions. Copyright © 2010 Elsevier B.V. All rights reserved.

  4. Time interval measurement between two emissions: a systematic study; Mesure de l'intervalle de temps entre deux emissions: une systematique

    Energy Technology Data Exchange (ETDEWEB)

    Bizard, G.; Bougault, R.; Brou, R.; Colin, J.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Kerambrun, A.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lopez, O.; Louvel, M.; Mahi, M.; Meslin, C.; Steckmeyer, J.C.; Tamain, B.; Wieloch, A. [Lab. de Physique Corpusculaire, Caen Univ., 14 (France); LPC (Caen) - CRN (Strasbourg) Collaboration

    1998-04-01

    A systematic study of the evolution of fragment emission time intervals as a function of the energy deposited in the compound system was performed. Several measurements are presented: Ne at 60 MeV/u, Ar at 30 and 60 MeV/u, and two measurements for Kr at 60 MeV/u (central and semi-peripheral collisions). In all the experiments the target was Au and the mass of the compound system was around A = 200. The excitation energies per nucleon reached in the case of these heavy systems cover the range of 3 to 5.5 MeV/u. The method used to determine the emission time intervals is based on the correlation functions associated with the relative angle distributions. The gaps between the data and simulations allow the emission times to be evaluated. A rapid decrease of these time intervals was observed when the excitation energy increased. This variation starts at 500 fm/c, which corresponds to a sequential emission. This relatively long time, which indicates a weak interaction between fragments, corresponds practically to the measurement threshold. The shortest intervals (about 50 fm/c) are associated with a spontaneous multifragmentation and were observed in the case of central collisions for Ar+Au and Kr+Au at 60 MeV/u. Two interpretations are possible. The multifragmentation process might be viewed as a sequential process with very short time separations, or else one can separate two zones, keeping in mind that multifragmentation is predominant from 4.5 MeV/u excitation energy upwards. This question is still open and its study is under way at LPC. An answer could come from the study of the rupture process of an excited nucleus, notably by the determination of its life-time

  5. The effectiveness of regular leisure-time physical activities on long-term glycemic control in people with type 2 diabetes: A systematic review and meta-analysis.

    Science.gov (United States)

    Pai, Lee-Wen; Li, Tsai-Chung; Hwu, Yueh-Juen; Chang, Shu-Chuan; Chen, Li-Li; Chang, Pi-Ying

    2016-03-01

    The objective of this study was to systematically review the effectiveness of different types of regular leisure-time physical activities and to pool the effect sizes of those activities on long-term glycemic control in people with type 2 diabetes compared with routine care. This review included randomized controlled trials from 1960 to May 2014. A total of 10 Chinese and English databases were searched; following selection and critical appraisal, 18 randomized controlled trials with 915 participants were included. The standardized mean difference was reported as the summary statistic for the overall effect size in a random effects model. The results indicated that yoga was the most effective in lowering glycated haemoglobin A1c (HbA1c) levels. Meta-analysis also revealed that the decrease in HbA1c levels of the subjects who took part in regular leisure-time physical activities was 0.60% more than that of control group participants. A higher frequency of regular leisure-time physical activities was found to be more effective in reducing HbA1c levels. The results of this review provide evidence of the benefits associated with regular leisure-time physical activities compared with routine care for lowering HbA1c levels in people with type 2 diabetes. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  6. Cardiac time intervals and the association with 2D-speckle-tracking, tissue Doppler and conventional echocardiography

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Jensen, Jan Skov; Andersen, Henrik Ullits

    2016-01-01

    Cardiac time intervals (CTI) are prognostic above and beyond conventional echocardiographic measures. The explanation may be that CTI contain information about both systolic and diastolic measures; this is, however, unknown. The relationship between the CTI and systolic and diastolic function...

  7. Using hemoglobin A1C as a predicting model for time interval from pre-diabetes progressing to diabetes.

    Directory of Open Access Journals (Sweden)

    Chen-Ling Huang

    Full Text Available The early identification of subjects at high risk for diabetes is essential; thus, random rather than fasting plasma glucose is more useful. We aim to evaluate the time interval between pre-diabetes and diabetes treated with anti-diabetic drugs by using HbA1C as a diagnostic tool, and to predict it using a mathematical model. We used the Taipei Medical University Affiliated Hospital Patient Profile Database (AHPPD) from January 2007 to June 2011. The patients who progressed and were prescribed anti-diabetic drugs were selected from the AHPPD. A mathematical model was used to predict the time interval for the HbA1C value to progress from 5.7% to 6.5% (diabetes). We predicted an average overall time interval between 5.7% and 6.5% of 907 days (standard error, 103 days) for all participants. Within the 5.7% to 6.5% range we determined 1169.3 days (i.e. 3.2 years) for the low-risk group, 1080.5 days (i.e. 2.96 years) for the increased-risk group and 729.4 days (i.e. 1.99 years) for the diabetes group. This indicates that patients will take an average of 2.49 years to reach 6.5%. This prediction model is very useful to help prioritize diagnosis at an early stage, targeting individuals at risk of diabetes. Using patients' HbA1C before anti-diabetes drugs are used, we predicted that the time interval from pre-diabetes progressing to diabetes is 2.49 years, without any influence of age and gender. Additional studies are needed to support this model for long-term prediction.

  8. [Dynamic Attending Binds Time and Rhythm Perception].

    Science.gov (United States)

    Kuroda, Tsuyoshi; Ono, Fuminori; Kadota, Hiroshi

    2017-11-01

    Relations between time and rhythm perception are discussed in this review of psychophysical research relevant to the multiple-look effect and dynamic-attending theory. Discrimination of two neighboring intervals that are marked by three successive sounds is improved when the presentation of the first (standard, S) interval is repeated before that of the second (comparison, C), as SSSSC. This improvement in sensitivity, called the multiple-look effect, occurs because listeners (1) perceive regular rhythm during the repetition of the standard interval, (2) predict the timing of subsequent sounds, and (3) detect sounds that are deviated from the predicted timing. The dynamic-attending theory attributes such predictions to the entrainment of attentional rhythms. An endogenous attentional rhythm is synchronized with the periodic succession of sounds marking the repeated standard. The standard and the comparison are discriminated on the basis of whether the ending marker of the comparison appears at the peak of the entrained attentional rhythm. This theory is compatible with the findings of recent neurophysiological studies that relate temporal prediction to neural oscillations.

  9. Distance-regular graphs

    NARCIS (Netherlands)

    van Dam, Edwin R.; Koolen, Jack H.; Tanaka, Hajime

    2016-01-01

    This is a survey of distance-regular graphs. We present an introduction to distance-regular graphs for the reader who is unfamiliar with the subject, and then give an overview of some developments in the area of distance-regular graphs since the monograph 'BCN'[Brouwer, A.E., Cohen, A.M., Neumaier,

  10. Regular expressions cookbook

    CERN Document Server

    Goyvaerts, Jan

    2009-01-01

    This cookbook provides more than 100 recipes to help you crunch data and manipulate text with regular expressions. Every programmer can find uses for regular expressions, but their power doesn't come worry-free. Even seasoned users often suffer from poor performance, false positives, false negatives, or perplexing bugs. Regular Expressions Cookbook offers step-by-step instructions for some of the most common tasks involving this tool, with recipes for C#, Java, JavaScript, Perl, PHP, Python, Ruby, and VB.NET. With this book, you will: Understand the basics of regular expressions through a

  11. Perceptions of Time and Long Time Intervals

    International Nuclear Information System (INIS)

    Drottz-Sjoeberg, Britt-Marie

    2006-01-01

    There are certainly many perspectives presented in the literature on time and time perception. This contribution has focused on perceptions of the time frames related to risk and danger of radiation from a planned Swedish repository for spent nuclear fuel. Respondents from two municipalities judged SSI's reviews of the entrepreneur's plans and work of high importance, and more important the closer to our time the estimate was given. Similarly were the consequences of potential leakage from a repository perceived as more serious the closer it would be to our time. Judgements of risks related to the storage of spent nuclear fuel were moderately large on the used measurement scales. Experts are experts because they have more knowledge, and in this context they underlined e.g. the importance of reviews of the radiation situation of time periods up to 100,000 years. It was of interest to note that 55% of the respondents from the municipalities did not believe that the future repository would leak radioactivity. They were much more pessimistic with respect to world politics, i.e. a new world war. However, with respect to the seriousness of the consequences given a leakage from the repository, the public group consistently gave high risk estimates, often significantly higher than those of the expert group. The underestimations of time estimates, as seen in the tasks of pinpointing historic events, provide examples of the difficulty of making estimations involving long times. Similar results showed that thinking of 'the future' most often involved about 30 years. On average, people reported memories of about 2.5 generations back in time, and emotional relationships stretching approximately 2.5 generations into the future; 94% of the responses, with respect to how many future generations one had an emotional relationship, were given in the range of 1-5 generations. Similarly, Svenson and Nilsson found the opinion that the current generations' general responsibility for

  12. LL-regular grammars

    NARCIS (Netherlands)

    Nijholt, Antinus

    1980-01-01

    Culik II and Cogen introduced the class of LR-regular grammars, an extension of the LR(k) grammars. In this paper we consider an analogous extension of the LL(k) grammars called the LL-regular grammars. The relation of this class of grammars to other classes of grammars will be shown. Any LL-regular

  13. Improvements in GRACE Gravity Fields Using Regularization

    Science.gov (United States)

    Save, H.; Bettadpur, S.; Tapley, B. D.

    2008-12-01

    The unconstrained global gravity field models derived from GRACE are susceptible to systematic errors that show up as broad "stripes" aligned in a North-South direction on the global maps of mass flux. These errors are believed to be a consequence of both systematic and random errors in the data that are amplified by the nature of the gravity field inverse problem. These errors impede scientific exploitation of the GRACE data products, and limit the realizable spatial resolution of the GRACE global gravity fields in certain regions. We use regularization techniques to reduce these "stripe" errors in the gravity field products. The regularization criteria are designed such that there is no attenuation of the signal and that the solutions fit the observations as well as an unconstrained solution. We have used a computationally inexpensive method, normally referred to as "L-ribbon", to find the regularization parameter. This paper discusses the characteristics and statistics of a 5-year time-series of regularized gravity field solutions. The solutions show markedly reduced stripes, are of uniformly good quality over time, and leave little or no systematic observation residuals, which is a frequent consequence of signal suppression from regularization. Up to degree 14, the signal in regularized solution shows correlation greater than 0.8 with the un-regularized CSR Release-04 solutions. Signals from large-amplitude and small-spatial extent events - such as the Great Sumatra Andaman Earthquake of 2004 - are visible in the global solutions without using special post-facto error reduction techniques employed previously in the literature. Hydrological signals as small as 5 cm water-layer equivalent in the small river basins, like Indus and Nile for example, are clearly evident, in contrast to noisy estimates from RL04. The residual variability over the oceans relative to a seasonal fit is small except at higher latitudes, and is evident without the need for de-striping or

  14. Analysis of the Factors Affecting the Interval between Blood Donations Using Log-Normal Hazard Model with Gamma Correlated Frailties.

    Science.gov (United States)

    Tavakol, Najmeh; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    The time to donating blood plays a major role in a regular donor becoming a continuous one. The aim of this study was to determine the factors affecting the interval between blood donations. In a longitudinal study in 2008, 864 samples of first-time donors in the Shahrekord Blood Transfusion Center, capital city of Chaharmahal and Bakhtiari Province, Iran, were selected by systematic sampling and were followed up for five years. Among these samples, a subset of 424 donors who had at least two successful blood donations were chosen for this study, and the time intervals between their donations were measured as the response variable. Sex, body weight, age, marital status, education, residence and job were recorded as independent variables. Data analysis was performed based on a log-normal hazard model with gamma correlated frailty. In this model, the frailties are the sum of two independent components assumed to follow a gamma distribution. The analysis was done via a Bayesian approach using a Markov Chain Monte Carlo algorithm in OpenBUGS. Convergence was checked via the Gelman-Rubin criteria using the BOA program in R. Age, job and education had a significant effect on the chance to donate blood; the chance of donation was higher for older donors, clerical workers, workers, those in free jobs, students and more educated donors, and in turn the time intervals between their blood donations were shorter. Due to the significant effect of some variables in the log-normal correlated frailty model, it is necessary to plan educational and cultural programs to encourage people with longer inter-donation intervals to donate more frequently.

  15. Cardiac interbeat interval dynamics from childhood to senescence : comparison of conventional and new measures based on fractals and chaos theory

    Science.gov (United States)

    Pikkujamsa, S. M.; Makikallio, T. H.; Sourander, L. B.; Raiha, I. J.; Puukka, P.; Skytta, J.; Peng, C. K.; Goldberger, A. L.; Huikuri, H. V.

    1999-01-01

    BACKGROUND: New methods of R-R interval variability based on fractal scaling and nonlinear dynamics ("chaos theory") may give new insights into heart rate dynamics. The aims of this study were to (1) systematically characterize and quantify the effects of aging from early childhood to advanced age on 24-hour heart rate dynamics in healthy subjects; (2) compare age-related changes in conventional time- and frequency-domain measures with changes in newly derived measures based on fractal scaling and complexity (chaos) theory; and (3) further test the hypothesis that there is loss of complexity and altered fractal scaling of heart rate dynamics with advanced age. METHODS AND RESULTS: The relationship between age and cardiac interbeat (R-R) interval dynamics from childhood to senescence was studied in 114 healthy subjects (age range, 1 to 82 years) by measurement of the slope, beta, of the power-law regression line (log power-log frequency) of R-R interval variability (10^-4 to 10^-2 Hz), approximate entropy (ApEn), short-term (alpha_1) and intermediate-term (alpha_2) fractal scaling exponents obtained by detrended fluctuation analysis, and traditional time- and frequency-domain measures from 24-hour ECG recordings. Compared with young adults (60 years, n=29). CONCLUSIONS: Cardiac interbeat interval dynamics change markedly from childhood to old age in healthy subjects. Children show complexity and fractal correlation properties of R-R interval time series comparable to those of young adults, despite lower overall heart rate variability. Healthy aging is associated with R-R interval dynamics showing higher regularity and altered fractal scaling consistent with a loss of complex variability.
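
    One of the fractal measures mentioned, detrended fluctuation analysis, can be sketched generically as follows (this is a textbook-style implementation, not the study's code); for the white-noise surrogate used here the short-term exponent should come out near 0.5.

    ```python
    import numpy as np

    def dfa_exponent(rr, scales=range(4, 17)):
        """Detrended fluctuation analysis of an R-R interval series: integrate the
        mean-subtracted series, detrend it linearly in boxes of each size, and fit
        the slope of log fluctuation versus log box size (an alpha_1-style exponent
        for box sizes of 4-16 beats)."""
        y = np.cumsum(np.asarray(rr, dtype=float) - np.mean(rr))
        fluct = []
        for s in scales:
            n_boxes = len(y) // s
            t = np.arange(s)
            f2 = []
            for i in range(n_boxes):
                seg = y[i * s:(i + 1) * s]
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
                f2.append(np.mean((seg - trend) ** 2))
            fluct.append(np.sqrt(np.mean(f2)))
        slope, _ = np.polyfit(np.log(list(scales)), np.log(fluct), 1)
        return slope

    rng = np.random.default_rng(0)
    rr = 0.8 + 0.05 * rng.standard_normal(3000)   # white-noise surrogate R-R series
    print("short-term scaling exponent:", dfa_exponent(rr))
    ```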

  16. Deficits in Interval Timing Measured by the Dual-Task Paradigm among Children and Adolescents with Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Hwang, Shoou-Lian; Gau, Susan Shur-Fen; Hsu, Wen-Yau; Wu, Yu-Yu

    2010-01-01

    Background: The underlying mechanism of time perception deficit in long time intervals in attention-deficit/hyperactivity disorder (ADHD) is still unclear. This study used the time reproduction dual task to explore the role of the attentional resource in time perception deficits among children and adolescents with ADHD. Methods: Participants…

  17. Proceedings of Annual Precise Time and Time Interval (PTTI) Applications and Planning Meeting (23rd) held in Pasadena, California on December 3-5, 1991

    Science.gov (United States)

    1991-12-05

    Fragmentary excerpt from the proceedings: a contribution on time comparisons between two Western European time laboratories and VNIIFTRI, by P. Daly (University of Leeds), N. B. Koshelyaevsky (VNIIFTRI) and W. Lewandowski, concerning laboratories equipped with GPS time receivers contributing to TAI; the last GPS antenna position determined by the BIPM is installed near Moscow at VNIIFTRI.

  18. Predicting life-time and regular cannabis use during adolescence; the roles of temperament and peer substance use: the TRAILS study

    NARCIS (Netherlands)

    Creemers, H.E.; Dijkstra, J.K.; Vollebergh, W.A.M.; Ormel, J.; Verhulst, F.C.; Huizink, A.C.

    2010-01-01

    Aims The aim of the present study was to determine the mediating role of affiliation with cannabis-using peers in the pathways from various dimensions of temperament to life-time cannabis use, and to determine if these associations also contributed to the development of regular cannabis

  19. Perceptions of Time and Long Time Intervals

    Energy Technology Data Exchange (ETDEWEB)

    Drottz-Sjoeberg, Britt-Marie [Norwegian Univ. of Science and Technology, Trondheim (Norway). Dept. of Psychology

    2006-09-15

    There are certainly many perspectives presented in the literature on time and time perception. This contribution has focused on perceptions of the time frames related to risk and danger of radiation from a planned Swedish repository for spent nuclear fuel. Respondents from two municipalities judged SSI's reviews of the entrepreneur's plans and work of high importance, and more important the closer to our time the estimate was given. Similarly were the consequences of potential leakage from a repository perceived as more serious the closer it would be to our time. Judgements of risks related to the storage of spent nuclear fuel were moderately large on the used measurement scales. Experts are experts because they have more knowledge, and in this context they underlined e.g. the importance of reviews of the radiation situation of time periods up to 100,000 years. It was of interest to note that 55% of the respondents from the municipalities did not believe that the future repository would leak radioactivity. They were much more pessimistic with respect to world politics, i.e. a new world war. However, with respect to the seriousness of the consequences given a leakage from the repository, the public group consistently gave high risk estimates, often significantly higher than those of the expert group. The underestimations of time estimates, as seen in the tasks of pinpointing historic events, provide examples of the difficulty of making estimations involving long times. Similar results showed that thinking of 'the future' most often involved about 30 years. On average, people reported memories of about 2.5 generations back in time, and emotional relationships stretching approximately 2.5 generations into the future; 94% of the responses, with respect to how many future generations one had an emotional relationship, were given in the range of 1-5 generations. Similarly, Svenson and Nilsson found the opinion that the current generations

  20. Reducing errors in the GRACE gravity solutions using regularization

    Science.gov (United States)

    Save, Himanshu; Bettadpur, Srinivas; Tapley, Byron D.

    2012-09-01

    The nature of the gravity field inverse problem amplifies the noise in the GRACE data, which creeps into the mid and high degree and order harmonic coefficients of the Earth's monthly gravity fields provided by GRACE. Due to the use of imperfect background models and data noise, these errors are manifested as north-south striping in the monthly global maps of equivalent water heights. In order to reduce these errors, this study investigates the use of the L-curve method with Tikhonov regularization. L-curve is a popular aid for determining a suitable value of the regularization parameter when solving linear discrete ill-posed problems using Tikhonov regularization. However, the computational effort required to determine the L-curve is prohibitively high for a large-scale problem like GRACE. This study implements a parameter-choice method, using Lanczos bidiagonalization which is a computationally inexpensive approximation to L-curve. Lanczos bidiagonalization is implemented with orthogonal transformation in a parallel computing environment and projects a large estimation problem on a problem of the size of about 2 orders of magnitude smaller for computing the regularization parameter. Errors in the GRACE solution time series have certain characteristics that vary depending on the ground track coverage of the solutions. These errors increase with increasing degree and order. In addition, certain resonant and near-resonant harmonic coefficients have higher errors as compared with the other coefficients. Using the knowledge of these characteristics, this study designs a regularization matrix that provides a constraint on the geopotential coefficients as a function of its degree and order. This regularization matrix is then used to compute the appropriate regularization parameter for each monthly solution. A 7-year time-series of the candidate regularized solutions (Mar 2003-Feb 2010) show markedly reduced error stripes compared with the unconstrained GRACE release 4
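
    A plain-grid stand-in for the L-curve parameter choice (the paper uses Lanczos bidiagonalization precisely to avoid this cost) on a toy ill-posed problem; the operator, the degree-dependent regularization matrix and the noise level are illustrative assumptions.

    ```python
    import numpy as np

    def l_curve_lambda(A, b, R, lambdas):
        """Grid-based stand-in for the L-curve criterion: for each lambda solve the
        Tikhonov problem min ||A x - b||^2 + lambda * x^T R x, record log residual
        norm and log constrained-solution norm, and return the lambda at the point
        of maximum curvature of that curve."""
        rho, eta = [], []
        for lam in lambdas:
            x = np.linalg.solve(A.T @ A + lam * R, A.T @ b)
            rho.append(np.log(np.linalg.norm(A @ x - b)))
            eta.append(np.log(np.sqrt(x @ R @ x)))
        rho, eta = np.array(rho), np.array(eta)
        d1r, d1e = np.gradient(rho), np.gradient(eta)
        d2r, d2e = np.gradient(d1r), np.gradient(d1e)
        curvature = (d1r * d2e - d2r * d1e) / (d1r**2 + d1e**2) ** 1.5
        return lambdas[int(np.argmax(curvature))]

    # Toy ill-posed problem; R penalises coefficients more strongly with increasing
    # index, loosely mimicking a degree/order-dependent constraint.
    n = 40
    A = np.vander(np.linspace(0.0, 1.0, n), n, increasing=True)
    R = np.diag(np.arange(1, n + 1, dtype=float) ** 2)
    rng = np.random.default_rng(2)
    x_true = rng.standard_normal(n) / np.arange(1, n + 1)
    b = A @ x_true + 1e-3 * rng.standard_normal(n)
    print("L-curve lambda:", l_curve_lambda(A, b, R, np.logspace(-10, 2, 60)))
    ```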

  1. Periodicity in the autocorrelation function as a mechanism for regularly occurring zero crossings or extreme values of a Gaussian process.

    Science.gov (United States)

    Wilson, Lorna R M; Hopcraft, Keith I

    2017-12-01

    The problem of zero crossings is of great historical prevalence and promises extensive application. The challenge is to establish precisely how the autocorrelation function or power spectrum of a one-dimensional continuous random process determines the density function of the intervals between the zero crossings of that process. This paper investigates the case where periodicities are incorporated into the autocorrelation function of a smooth process. Numerical simulations, and statistics about the number of crossings in a fixed interval, reveal that in this case the zero crossings segue between a random and deterministic point process depending on the relative time scales of the periodic and nonperiodic components of the autocorrelation function. By considering the Laplace transform of the density function, we show that incorporating correlation between successive intervals is essential to obtaining accurate results for the interval variance. The same method enables prediction of the density function tail in some regions, and we suggest approaches for extending this to cover all regions. In an ever-more complex world, the potential applications for this scale of regularity in a random process are far reaching and powerful.

  2. Periodicity in the autocorrelation function as a mechanism for regularly occurring zero crossings or extreme values of a Gaussian process

    Science.gov (United States)

    Wilson, Lorna R. M.; Hopcraft, Keith I.

    2017-12-01

    The problem of zero crossings is of great historical prevalence and promises extensive application. The challenge is to establish precisely how the autocorrelation function or power spectrum of a one-dimensional continuous random process determines the density function of the intervals between the zero crossings of that process. This paper investigates the case where periodicities are incorporated into the autocorrelation function of a smooth process. Numerical simulations, and statistics about the number of crossings in a fixed interval, reveal that in this case the zero crossings segue between a random and deterministic point process depending on the relative time scales of the periodic and nonperiodic components of the autocorrelation function. By considering the Laplace transform of the density function, we show that incorporating correlation between successive intervals is essential to obtaining accurate results for the interval variance. The same method enables prediction of the density function tail in some regions, and we suggest approaches for extending this to cover all regions. In an ever-more complex world, the potential applications for this scale of regularity in a random process are far reaching and powerful.
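
    A quick way to see the effect described, although it is not the paper's analytical method, is to synthesize a Gaussian process whose autocorrelation contains a periodic factor and then look at the intervals between its zero crossings; the envelope, frequency and sampling step below are illustrative choices.

    ```python
    import numpy as np

    def gp_with_periodic_acf(n=2**18, dt=0.01, f0=1.0, seed=0):
        """Spectral synthesis of a stationary Gaussian process whose autocorrelation
        is proportional to exp(-tau**2 / 2) * cos(2*pi*f0*tau).  The overall amplitude
        scale is left arbitrary, which does not affect the zero crossings."""
        rng = np.random.default_rng(seed)
        f = np.fft.rfftfreq(n, dt)
        # power spectrum of the target autocorrelation: Gaussian bumps at +/- f0
        psd = (np.exp(-2.0 * np.pi**2 * (f - f0) ** 2) +
               np.exp(-2.0 * np.pi**2 * (f + f0) ** 2))
        coeff = np.sqrt(psd) * (rng.standard_normal(f.size) +
                                1j * rng.standard_normal(f.size))
        return np.fft.irfft(coeff, n)

    dt = 0.01
    x = gp_with_periodic_acf(dt=dt)
    crossings = np.flatnonzero(np.diff(np.signbit(x)))   # indices where the sign changes
    intervals = np.diff(crossings) * dt                  # zero-crossing intervals
    print("mean interval:", intervals.mean(), " std:", intervals.std())
    ```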

  3. Association between regular physical exercise and depressive symptoms mediated through social support and resilience in Japanese company workers: a cross-sectional study.

    Science.gov (United States)

    Yoshikawa, Eisho; Nishi, Daisuke; Matsuoka, Yutaka J

    2016-07-12

    Regular physical exercise has been reported to reduce depressive symptoms. Several lines of evidence suggest that physical exercise may prevent depression by promoting social support or resilience, which is the ability to adapt to challenging life conditions. The aim of this study was to compare depressive symptoms, social support, and resilience between Japanese company workers who engaged in regular physical exercise and workers who did not exercise regularly. We also investigated whether regular physical exercise has an indirect association with depressive symptoms through social support and resilience. Participants were 715 Japanese employees at six worksites. Depressive symptoms were assessed with the Center for Epidemiologic Studies Depression (CES-D) scale, social support with the short version of the Social Support Questionnaire (SSQ), and resilience with the 14-item Resilience Scale (RS-14). A self-report questionnaire, which was extracted from the Japanese version of the Health-Promoting Lifestyle Profile, was used to assess whether participants engage in regular physical exercise, defined as more than 20 min, three or more times per week. The group differences in CES-D, SSQ, and RS-14 scores were investigated by using analysis of covariance (ANCOVA). Mediation analysis was conducted by using Preacher and Hayes' bootstrap script to assess whether regular physical exercise is associated with depressive symptoms indirectly through resilience and social support. The SSQ Number score (F = 4.82, p = 0.03), SSQ Satisfaction score (F = 6.68, p = 0.01), and RS-14 score (F = 6.01, p = 0.01) were significantly higher in the group with regular physical exercise (n = 83) than in the group without regular physical exercise (n = 632) after adjusting for age, education, marital status, and job status. The difference in CES-D score was not significant (F = 2.90, p = 0.09). Bootstrapping revealed significant negative indirect

  4. The effects of interval- vs. continuous exercise on excess post-exercise oxygen consumption and substrate oxidation rates in subjects with type 2 diabetes

    DEFF Research Database (Denmark)

    Karstoft, Kristian; Wallis, Gareth A.; Pedersen, Bente K.

    2016-01-01

    Background For unknown reasons, interval training often reduces body weight more than energy-expenditure matched continuous training. We compared the acute effects of time-duration and oxygen-consumption matched interval- vs. continuous exercise on excess post-exercise oxygen consumption (EPOC...... (MMTT, 450 kcal) was consumed by the subjects 45 min after completion of the intervention with blood samples taken regularly. Results Exercise interventions were successfully matched for total oxygen consumption (CW = 1641 ± 133 mL/min; IW = 1634 ± 126 mL/min, P > 0.05). EPOC was higher after IW (8......, free fatty acids and glycerol concentrations, and glycerol kinetics were increased comparably during and after IW and CW compared to CON. Conclusions Interval exercise results in greater EPOC than oxygen-consumption matched continuous exercise during a post-exercise MMTT in subjects with T2D, whereas...

  5. 76 FR 3629 - Regular Meeting

    Science.gov (United States)

    2011-01-20

    ... Meeting SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). Date and Time: The meeting of the Board will be held at the offices of the Farm... meeting of the Board will be open to the [[Page 3630

  6. Rapid determination of long-lived artificial alpha radionuclides using time interval analysis

    International Nuclear Information System (INIS)

    Uezu, Yasuhiro; Koarashi, Jun; Sanada, Yukihisa; Hashimoto, Tetsuo

    2003-01-01

    It is important to monitor long-lived alpha radionuclides such as plutonium (238Pu, 239+240Pu) in the working areas and environment of nuclear fuel cycle facilities, because the cancer risk from alpha radiation is known to be higher than that from gamma radiation. Such monitoring therefore requires high sensitivity, high resolution and rapid determination in order to measure very low-level concentrations of plutonium isotopes. In highly sensitive monitoring, natural radionuclides, including radon (222Rn or 220Rn) and their progenies, should be eliminated as far as possible. For this purpose, a discrimination method between Pu and the progenies of 222Rn or 220Rn using time interval analysis (TIA) was designed and developed; it subtracts short-lived radionuclides by calculating the time interval distributions of successive alpha and beta decay events on the millisecond or microsecond scale. In this system, alpha rays from 214Po, 216Po and 212Po are extractable. The TIA measuring system is composed of a silicon surface barrier detector (SSD), an amplifier, an analog-to-digital converter (ADC), a multi-channel analyzer (MCA), a high-resolution timer (TIMER), a multi-parameter collector and a personal computer. From the ADC, incident alpha and beta pulses are sent to the MCA and the TIMER simultaneously, and the pulses from both are combined by the multi-parameter collector. After measurement, natural radionuclides are subtracted. Airborne particles were collected on a membrane filter for 60 minutes at 100 L/min, and small Pu particles were added to its surface. Alpha and beta rays were measured and natural radionuclides were subtracted within 5 times 145 ms by TIA. As a result, Pu hidden in the natural background could be recognized clearly. The lower limit of determination of 239Pu was calculated as 6x10^-9 Bq/cm^3. This level satisfies the derived air concentration (DAC) of 239Pu (8x10^-9 Bq/cm^3
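
    A toy sketch of the idea behind TIA (not the published measuring system): count successive-event intervals shorter than a correlation window, here 145 ms as for the 220Rn-216Po alpha pair, and compare with the number of chance coincidences expected from an uncorrelated Poisson background; all rates and counts are invented for illustration.

    ```python
    import numpy as np

    def correlated_pairs(event_times, window=0.145):
        """Crude time-interval analysis: successive events closer than `window` are
        flagged as candidate correlated (parent-daughter) decay pairs; the expected
        number of chance coincidences for a Poisson background is also returned."""
        times = np.sort(np.asarray(event_times, dtype=float))
        dt = np.diff(times)
        flagged = int(np.count_nonzero(dt < window))
        rate = dt.size / (times[-1] - times[0])            # mean event rate
        chance = dt.size * (1.0 - np.exp(-rate * window))  # P(exponential gap < window)
        return flagged, chance

    rng = np.random.default_rng(3)
    background = np.cumsum(rng.exponential(10.0, 2000))            # uncorrelated events
    parents = np.sort(rng.uniform(0.0, background.max(), 200))     # e.g. 220Rn alphas
    daughters = parents + rng.exponential(0.145 / np.log(2), 200)  # 216Po, ~145 ms half-life
    events = np.concatenate([background, parents, daughters])
    print("flagged pairs vs expected chance coincidences:", correlated_pairs(events))
    ```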

  7. Time interval between cover crop termination and planting influences corn seedling disease, plant growth, and yield

    Science.gov (United States)

    Experiments were established in controlled and field environment to evaluate the effect of time intervals between cereal rye cover crop termination and corn planting on corn seedling disease, corn growth, and grain yield in 2014 and 2015. Rye termination dates ranged from 25 days before planting (DB...

  8. Contour Propagation With Riemannian Elasticity Regularization

    DEFF Research Database (Denmark)

    Bjerre, Troels; Hansen, Mads Fogtmann; Sapru, W.

    2011-01-01

    Purpose/Objective(s): Adaptive techniques allow for correction of spatial changes during the time course of the fractionated radiotherapy. Spatial changes include tumor shrinkage and weight loss, causing tissue deformation and residual positional errors even after translational and rotational image...... the planning CT onto the rescans and correcting to reflect actual anatomical changes. For deformable registration, a free-form, multi-level, B-spline deformation model with Riemannian elasticity, penalizing non-rigid local deformations, and volumetric changes, was used. Regularization parameters was defined...... on the original delineation and tissue deformation in the time course between scans form a better starting point than rigid propagation. There was no significant difference of locally and globally defined regularization. The method used in the present study suggests that deformed contours need to be reviewed...

  9. On solving wave equations on fixed bounded intervals involving Robin boundary conditions with time-dependent coefficients

    Science.gov (United States)

    van Horssen, Wim T.; Wang, Yandong; Cao, Guohua

    2018-06-01

    In this paper, it is shown how characteristic coordinates, or equivalently how the well-known formula of d'Alembert, can be used to solve initial-boundary value problems for wave equations on fixed, bounded intervals involving Robin type of boundary conditions with time-dependent coefficients. A Robin boundary condition is a condition that specifies a linear combination of the dependent variable and its first order space-derivative on a boundary of the interval. Analytical methods, such as the method of separation of variables (SOV) or the Laplace transform method, are not applicable to those types of problems. The obtained analytical results by applying the proposed method, are in complete agreement with those obtained by using the numerical, finite difference method. For problems with time-independent coefficients in the Robin boundary condition(s), the results of the proposed method also completely agree with those as for instance obtained by the method of separation of variables, or by the finite difference method.
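
    In illustrative notation (not the paper's), the setup combines a wave equation on a bounded interval, a Robin condition with time-dependent coefficients, and the d'Alembert representation that the characteristic-coordinate approach propagates:

    ```latex
    % Illustrative notation only, not the paper's: wave equation, time-dependent
    % Robin condition at x = 0, and the d'Alembert form carried along characteristics.
    \begin{align*}
      &u_{tt}(x,t) = c^{2}\,u_{xx}(x,t), && 0 < x < L,\ t > 0,\\
      &\alpha(t)\,u(0,t) + \beta(t)\,u_{x}(0,t) = 0, && \text{(Robin condition, time-dependent coefficients)}\\
      &u(x,t) = F(x - ct) + G(x + ct). && \text{(d'Alembert / characteristic representation)}
    \end{align*}
    ```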

  10. Interval selection with machine-dependent intervals

    OpenAIRE

    Bohmova K.; Disser Y.; Mihalak M.; Widmayer P.

    2013-01-01

    We study an offline interval scheduling problem where every job has exactly one associated interval on every machine. To schedule a set of jobs, exactly one of the intervals associated with each job must be selected, and the intervals selected on the same machine must not intersect. We show that deciding whether all jobs can be scheduled is NP-complete already in various simple cases. In particular, by showing the NP-completeness for the case when all the intervals associated with the same job...
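    To make the problem statement concrete, a brute-force feasibility checker (exponential in the number of jobs, so purely a toy illustration of the decision problem, not of the paper's hardness results; the data layout jobs[j][m] = (start, end) is an assumption made here):

        from itertools import product

        def schedulable(jobs):
            # jobs[j][m] = (start, end): the interval job j occupies if placed on machine m.
            # Return True if one interval per job can be selected so that the intervals
            # chosen on the same machine are pairwise non-intersecting.
            n_machines = len(jobs[0])
            for choice in product(range(n_machines), repeat=len(jobs)):
                per_machine = {}
                for j, m in enumerate(choice):
                    per_machine.setdefault(m, []).append(jobs[j][m])
                if all(not any(a[1] > b[0] for a, b in zip(sorted(iv), sorted(iv)[1:]))
                       for iv in per_machine.values()):
                    return True
            return False

        # two jobs, two machines: placing job 0 on machine 0 and job 1 on machine 1 works
        jobs = [[(0, 2), (1, 3)],
                [(1, 4), (2, 5)]]
        print(schedulable(jobs))   # True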

  11. Interval Running Training Improves Cognitive Flexibility and Aerobic Power of Young Healthy Adults.

    Science.gov (United States)

    Venckunas, Tomas; Snieckus, Audrius; Trinkunas, Eugenijus; Baranauskiene, Neringa; Solianik, Rima; Juodsnukis, Antanas; Streckis, Vytautas; Kamandulis, Sigitas

    2016-08-01

    Venckunas, T, Snieckus, A, Trinkunas, E, Baranauskiene, N, Solianik, R, Juodsnukis, A, Streckis, V, and Kamandulis, S. Interval running training improves cognitive flexibility and aerobic power of young healthy adults. J Strength Cond Res 30(8): 2114-2121, 2016-The benefits of regular physical exercise may well extend beyond the reduction of chronic disease risk and augmentation of working capacity, to many other aspects of human well-being, including improved cognitive functioning. Although the effects of moderate intensity continuous training on cognitive performance are relatively well studied, the benefits of interval training have not been investigated in this respect so far. The aim of the current study was to assess whether 7 weeks of interval running training is effective at improving both aerobic fitness and cognitive performance. For this purpose, 8 young dinghy sailors (6 boys and 2 girls) completed the interval running program; 200 m and 2,000 m running performance, cycling maximal oxygen uptake, and cognitive function were measured before and after the intervention. The control group consisted of healthy age-matched subjects (8 boys and 2 girls) who continued their active lifestyle and were tested in the same way as the experimental group, but did not complete any regular training. In the experimental group, 200 m and 2,000 m running performance and cycling maximal oxygen uptake increased together with improved results on cognitive flexibility tasks. No changes in the results of short-term and working memory tasks were observed in the experimental group, and no changes in any of the measured indices were evident in the controls. In conclusion, 7 weeks of interval running training improved running performance and cycling aerobic power, and were sufficient to improve the ability to adjust behavior to changing demands in young active individuals.

  12. [Affect regularity of medicinal species and heating time on flavonoids contents in Epimedium cut crude drug].

    Science.gov (United States)

    Sun, E; Chen, Ling-ling; Jia, Xiao-bin; Qian, Qian; Cui, Li

    2012-09-01

    To study the effect of medicinal species and heating time on the flavonoid contents of Epimedium cut crude drug. With the processing temperature set at 170 degrees C, 39 batches of Epimedium cut crude drug of different species were heated for 0, 5 and 10 minutes. The contents of epimedin A, B, C, icariin and baohuoside I in the different species of Epimedium were determined by HPLC. Analysis of variance was used to study the effect of medicinal species and heating time on the content changes of the five major flavonoids. The contents of epimedin A, B and C were significantly affected by medicinal species (P < 0.05) and by heating time (P < 0.05), but not by the interaction of heating time and species (P > 0.05). Medicinal species and heating time are thus two important factors influencing the flavonoid contents of Epimedium. The contents of epimedin A and C are abundant in Epimedium pubescens, and the contents of epimedin B and baohuoside I are higher in Epimedium brevicornu. After heating, the contents of epimedin A, B and C decreased, and those of icariin and baohuoside I increased. This study provides scientific evidence for variety certification, optimization of processing technology, exploration of the processing mechanism and rational clinical administration.

  13. An iterative method for Tikhonov regularization with a general linear regularization operator

    NARCIS (Netherlands)

    Hochstenbach, M.E.; Reichel, L.

    2010-01-01

    Tikhonov regularization is one of the most popular approaches to solve discrete ill-posed problems with error-contaminated data. A regularization operator and a suitable value of a regularization parameter have to be chosen. This paper describes an iterative method, based on Golub-Kahan bidiagonalization, for solving such problems with a general linear regularization operator.
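    For reference, the underlying Tikhonov problem min_x ||Ax - b||^2 + lambda^2 ||Lx||^2 with a general regularization operator L can, for small dense problems, be solved directly by stacking the two terms into one least-squares system (a minimal sketch of the problem the paper addresses, not of its Golub-Kahan-based iterative method; the toy matrices below are illustrative):

        import numpy as np

        def tikhonov(A, b, L, lam):
            # Solve min_x ||A x - b||^2 + lam^2 ||L x||^2 by ordinary least squares
            # on the stacked system [A; lam*L] x = [b; 0] (fine for small dense
            # problems; large ill-posed problems call for iterative methods).
            A_aug = np.vstack([A, lam * L])
            b_aug = np.concatenate([b, np.zeros(L.shape[0])])
            return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

        # toy ill-conditioned problem, regularized with a first-difference operator
        n = 20
        A = np.vander(np.linspace(0, 1, n), n)            # notoriously ill-conditioned
        x_true = np.sin(np.linspace(0, 3, n))
        b = A @ x_true + 1e-6 * np.random.randn(n)
        L = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)      # discrete first derivative
        print(np.linalg.norm(tikhonov(A, b, L, lam=1e-3) - x_true))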

  14. High-intensity interval training improves insulin sensitivity in older individuals.

    Science.gov (United States)

    Søgaard, D; Lund, M T; Scheuer, C M; Dehlbaek, M S; Dideriksen, S G; Abildskov, C V; Christensen, K K; Dohlmann, T L; Larsen, S; Vigelsø, A H; Dela, F; Helge, J W

    2018-04-01

    Metabolic health may deteriorate with age as a result of altered body composition and decreased physical activity. Endurance exercise is known to counter these changes, delaying or even preventing onset of metabolic diseases. High-intensity interval training (HIIT) is a time-efficient alternative to regular endurance exercise, and the aim of this study was to investigate the metabolic benefit of HIIT in older subjects. Twenty-two sedentary male (n = 11) and female (n = 11) subjects aged 63 ± 1 years performed HIIT training three times/week for 6 weeks on a bicycle ergometer. Each HIIT session consisted of five 1-minute intervals interspersed with 1½-minute rest. Prior to the first and after the last HIIT session whole-body insulin sensitivity, measured by a hyperinsulinaemic-euglycaemic clamp, plasma lipid levels, HbA1c, glycaemic parameters, body composition and maximal oxygen uptake were assessed. Muscle biopsies were obtained wherefrom content of glycogen and proteins involved in muscle glucose handling were determined. Insulin sensitivity (P = .011) and maximal oxygen uptake (P < .05) increased, and body fat (P < .05) decreased after 6 weeks of HIIT. HbA1c decreased only in males (P = .001). Muscle glycogen content increased in both genders (P = .001) and, in line with this, GLUT4 (P < .05), glycogen synthase (P = .001) and hexokinase II (P < .05) content all increased. Six weeks of HIIT significantly improves metabolic health in older males and females by reducing age-related risk factors for cardiometabolic disease. © 2017 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.

  15. Regular Expression Pocket Reference

    CERN Document Server

    Stubblebine, Tony

    2007-01-01

    This handy little book offers programmers a complete overview of the syntax and semantics of regular expressions that are at the heart of every text-processing application. Ideal as a quick reference, Regular Expression Pocket Reference covers the regular expression APIs for Perl 5.8, Ruby (including some upcoming 1.9 features), Java, PHP, .NET and C#, Python, vi, JavaScript, and the PCRE regular expression libraries. This concise and easy-to-use reference puts a very powerful tool for manipulating text and data right at your fingertips. Composed of a mixture of symbols and text, regular exp
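    As a flavour of the syntax the reference covers, a couple of lines of Python (one of the APIs listed above; the example strings are made up for illustration):

        import re

        # capture the parts of an ISO date
        m = re.search(r"(\d{4})-(\d{2})-(\d{2})", "backup taken on 2007-01-15, verified")
        if m:
            print(m.groups())                          # ('2007', '01', '15')

        # substitution with backreferences: turn "last, first" into "first last"
        print(re.sub(r"(\w+),\s*(\w+)", r"\2 \1", "Stubblebine, Tony"))   # Tony Stubblebine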

  16. Water Residence Time estimation by 1D deconvolution in the form of a l2 -regularized inverse problem with smoothness, positivity and causality constraints

    Science.gov (United States)

    Meresescu, Alina G.; Kowalski, Matthieu; Schmidt, Frédéric; Landais, François

    2018-06-01

    The Water Residence Time distribution is the equivalent of the impulse response of a linear system allowing the propagation of water through a medium, e.g. the propagation of rain water from the top of the mountain towards the aquifers. We consider the output aquifer levels as the convolution between the input rain levels and the Water Residence Time, starting with an initial aquifer base level. The estimation of Water Residence Time is important for a better understanding of hydro-bio-geochemical processes and mixing properties of wetlands used as filters in ecological applications, as well as protecting fresh water sources for wells from pollutants. Common methods of estimating the Water Residence Time focus on cross-correlation, parameter fitting and non-parametric deconvolution methods. Here we propose a 1D full-deconvolution, regularized, non-parametric inverse problem algorithm that enforces smoothness and uses constraints of causality and positivity to estimate the Water Residence Time curve. Compared to Bayesian non-parametric deconvolution approaches, it has a fast runtime per test case; compared to the popular and fast cross-correlation method, it produces a more precise Water Residence Time curve even in the case of noisy measurements. The algorithm needs only one regularization parameter to balance between smoothness of the Water Residence Time and accuracy of the reconstruction. We propose an approach on how to automatically find a suitable value of the regularization parameter from the input data only. Tests on real data illustrate the potential of this method to analyze hydrological datasets.
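    A minimal sketch of this type of estimation (the function and variable names and the synthetic data are invented here; the paper's actual algorithm and parameter choices may differ): build a causal convolution matrix from the rain series, add an l2 smoothness penalty through a second-difference operator, and enforce positivity with a non-negative least-squares solver.

        import numpy as np
        from scipy.optimize import nnls

        def estimate_wrt(rain, aquifer, lam, n_kernel):
            # Estimate a non-negative, smooth, causal impulse response h (the Water
            # Residence Time curve) such that aquifer ~ conv(rain, h):
            #   min_h ||C h - aquifer||^2 + lam^2 ||D h||^2   subject to h >= 0,
            # with C the causal convolution matrix and D a second-difference operator.
            n = len(aquifer)
            C = np.zeros((n, n_kernel))
            for k in range(n_kernel):              # column k holds the rain delayed by k
                C[k:, k] = rain[:n - k]
            D = (np.eye(n_kernel - 2, n_kernel)
                 - 2 * np.eye(n_kernel - 2, n_kernel, k=1)
                 + np.eye(n_kernel - 2, n_kernel, k=2))
            A = np.vstack([C, lam * D])
            b = np.concatenate([aquifer, np.zeros(n_kernel - 2)])
            return nnls(A, b)[0]

        # synthetic check: recover a known exponential residence-time curve
        rng = np.random.default_rng(0)
        rain = rng.random(300)
        h_true = np.exp(-np.arange(40) / 8.0)
        aquifer = np.convolve(rain, h_true)[:300] + 0.01 * rng.standard_normal(300)
        h_est = estimate_wrt(rain, aquifer, lam=1.0, n_kernel=40)
        print(np.round(h_est[:5], 2), np.round(h_true[:5], 2))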

  17. 33 CFR 150.503 - What are the time interval requirements for maintenance on survival craft falls?

    Science.gov (United States)

    2010-07-01

    33 CFR 150.503 (Navigation and Navigable Waters): What are the time interval requirements for maintenance on survival craft falls? (a) Each fall used in a launching device for survival craft or rescue...

  18. Emergence of synchronization and regularity in firing patterns in time-varying neural hypernetworks

    Science.gov (United States)

    Rakshit, Sarbendu; Bera, Bidesh K.; Ghosh, Dibakar; Sinha, Sudeshna

    2018-05-01

    We study synchronization of dynamical systems coupled in time-varying network architectures, composed of two or more network topologies, corresponding to different interaction schemes. As a representative example of this class of time-varying hypernetworks, we consider coupled Hindmarsh-Rose neurons, involving two distinct types of networks, mimicking interactions that occur through the electrical gap junctions and the chemical synapses. Specifically, we consider the connections corresponding to the electrical gap junctions to form a small-world network, while the chemical synaptic interactions form a unidirectional random network. Further, all the connections in the hypernetwork are allowed to change in time, modeling a more realistic neurobiological scenario. We model this time variation by rewiring the links stochastically with a characteristic rewiring frequency f . We find that the coupling strength necessary to achieve complete neuronal synchrony is lower when the links are switched rapidly. Further, the average time required to reach the synchronized state decreases as synaptic coupling strength and/or rewiring frequency increases. To quantify the local stability of complete synchronous state we use the Master Stability Function approach, and for global stability we employ the concept of basin stability. The analytically derived necessary condition for synchrony is in excellent agreement with numerical results. Further we investigate the resilience of the synchronous states with respect to increasing network size, and we find that synchrony can be maintained up to larger network sizes by increasing either synaptic strength or rewiring frequency. Last, we find that time-varying links not only promote complete synchronization, but also have the capacity to change the local dynamics of each single neuron. Specifically, in a window of rewiring frequency and synaptic coupling strength, we observe that the spiking behavior becomes more regular.

  19. Individual Case Analysis of Postmortem Interval Time on Brain Tissue Preservation.

    Directory of Open Access Journals (Sweden)

    Jeffrey A Blair

    Full Text Available At autopsy, the time that has elapsed since the time of death is routinely documented and noted as the postmortem interval (PMI. The PMI of human tissue samples is a parameter often reported in research studies and comparable PMI is preferred when comparing different populations, i.e., disease versus control patients. In theory, a short PMI may alleviate non-experimental protein denaturation, enzyme activity, and other chemical changes such as the pH, which could affect protein and nucleic acid integrity. Previous studies have compared PMI en masse by looking at many different individual cases each with one unique PMI, which may be affected by individual variance. To overcome this obstacle, in this study human hippocampal segments from the same individuals were sampled at different time points after autopsy creating a series of PMIs for each case. Frozen and fixed tissue was then examined by Western blot, RT-PCR, and immunohistochemistry to evaluate the effect of extended PMI on proteins, nucleic acids, and tissue morphology. In our results, immunostaining profiles for most proteins remained unchanged even after PMI of over 50 h, yet by Western blot distinctive degradation patterns were observed in different protein species. Finally, RNA integrity was lower after extended PMI; however, RNA preservation was variable among cases suggesting antemortem factors may play a larger role than PMI in protein and nucleic acid integrity.

  20. Prevalence and Correlates of Having a Regular Physician among Women Presenting for Induced Abortion.

    Science.gov (United States)

    Chor, Julie; Hebert, Luciana E; Hasselbacher, Lee A; Whitaker, Amy K

    2016-01-01

    To determine the prevalence and correlates of having a regular physician among women presenting for induced abortion. We conducted a retrospective review of women presenting to an urban, university-based family planning clinic for abortion between January 2008 and September 2011. We conducted bivariate analyses, comparing women with and without a regular physician, and multivariable regression modeling, to identify factors associated with not having a regular physician. Of 834 women, 521 (62.5%) had a regular physician and 313 (37.5%) did not. Women with a prior pregnancy, live birth, or spontaneous abortion were more likely than women without these experiences to have a regular physician. Women with a prior induced abortion were not more likely than women who had never had a prior induced abortion to have a regular physician. Compared with women younger than 18 years, women aged 18 to 26 years were less likely to have a physician (adjusted odds ratio [aOR], 0.25; 95% confidence interval [CI], 0.10-0.62). Women with a prior live birth had increased odds of having a regular physician compared with women without a prior pregnancy (aOR, 1.89; 95% CI, 1.13-3.16). Women without medical/fetal indications and who had not been victims of sexual assault (self-indicated) were less likely to report having a regular physician compared with women with medical/fetal indications (aOR, 0.55; 95% CI, 0.17-0.82). The abortion visit is a point of contact with a large number of women without a regular physician and therefore provides an opportunity to integrate women into health care. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  1. Dependency of magnetocardiographically determined fetal cardiac time intervals on gestational age, gender and postnatal biometrics in healthy pregnancies

    Directory of Open Access Journals (Sweden)

    Geue Daniel

    2004-04-01

    Full Text Available Abstract Background Magnetocardiography enables the precise determination of fetal cardiac time intervals (CTI) as early as the second trimester of pregnancy. It has been shown that fetal CTI change in the course of gestation. The aim of this work was to investigate the dependency of fetal CTI on gestational age, gender and postnatal biometric data in a substantial sample of subjects during normal pregnancy. Methods A total of 230 fetal magnetocardiograms were obtained in 47 healthy fetuses between the 15th and 42nd week of gestation. In each recording, after subtraction of the maternal cardiac artifact and the identification of fetal beats, fetal PQRST courses were signal averaged. On the basis of therein detected wave onsets and ends, the following CTI were determined: P wave, PR interval, PQ interval, QRS complex, ST segment, T wave, QT and QTc interval. Using regression analysis, the dependency of the CTI was examined with respect to gestational age, gender and postnatal biometric data. Results Atrioventricular conduction and ventricular depolarization times could be determined dependably, whereas the T wave was often difficult to detect. Linear and nonlinear regression analysis established a strong dependency on age for the P wave and QRS complex (r2 = 0.67 and r2 = 0.66, respectively) and a weaker dependency for the other intervals (r2 = 0.21 and r2 = 0.13); a gender difference in QRS complex duration was apparent in the later weeks of gestation. Conclusion We conclude that (1) from approximately the 18th week to term, fetal CTI which quantify depolarization times can be reliably determined using magnetocardiography, (2) the P wave and QRS complex duration show a high dependency on age which to a large part reflects fetal growth, and (3) fetal gender plays a role in QRS complex duration in the third trimester. Fetal development is thus in part reflected in the CTI and may be useful in the identification of intrauterine growth retardation.

  2. Strong Bisimilarity and Regularity of Basic Parallel Processes is PSPACE-Hard

    DEFF Research Database (Denmark)

    Srba, Jirí

    2002-01-01

    We show that the problem of checking whether two processes definable in the syntax of Basic Parallel Processes (BPP) are strongly bisimilar is PSPACE-hard. We also demonstrate that there is a polynomial time reduction from the strong bisimilarity checking problem of regular BPP to the strong...... regularity (finiteness) checking of BPP. This implies that strong regularity of BPP is also PSPACE-hard....

  3. Synchronization of Markovian jumping stochastic complex networks with distributed time delays and probabilistic interval discrete time-varying delays

    International Nuclear Information System (INIS)

    Li Hongjie; Yue Dong

    2010-01-01

    The paper investigates the synchronization stability problem for a class of complex dynamical networks with Markovian jumping parameters and mixed time delays. The complex networks consist of m modes and the networks switch from one mode to another according to a Markovian chain with known transition probability. The mixed time delays are composed of discrete and distributed delays; the discrete time delay is assumed to be random and its probability distribution is known a priori. In terms of the probability distribution of the delays, a new type of system model with probability-distribution-dependent parameter matrices is proposed. Based on stochastic analysis techniques and the properties of the Kronecker product, delay-dependent synchronization stability criteria in the mean square are derived in the form of linear matrix inequalities which can be readily solved by using the LMI toolbox in MATLAB. The solvability of the derived conditions depends not only on the size of the delay, but also on the probability of the delay taking values in some intervals. Finally, a numerical example is given to illustrate the feasibility and effectiveness of the proposed method.

  4. Regular use of aspirin and pancreatic cancer risk

    Directory of Open Access Journals (Sweden)

    Mahoney Martin C

    2002-09-01

    Full Text Available Abstract Background Regular use of aspirin and other non-steroidal anti-inflammatory drugs (NSAIDs has been consistently associated with reduced risk of colorectal cancer and adenoma, and there is some evidence for a protective effect for other types of cancer. As experimental studies reveal a possible role for NSAIDs is reducing the risk of pancreatic cancer, epidemiological studies examining similar associations in human populations become more important. Methods In this hospital-based case-control study, 194 patients with pancreatic cancer were compared to 582 age and sex-matched patients with non-neoplastic conditions to examine the association between aspirin use and risk of pancreatic cancer. All participants received medical services at the Roswell Park Cancer Institute in Buffalo, NY and completed a comprehensive epidemiologic questionnaire that included information on demographics, lifestyle factors and medical history as well as frequency and duration of aspirin use. Patients using at least one tablet per week for at least six months were classified as regular aspirin users. Unconditional logistic regression was used to compute crude and adjusted odds ratios (ORs with 95% confidence intervals (CIs. Results Pancreatic cancer risk in aspirin users was not changed relative to non-users (adjusted OR = 1.00; 95% CI 0.72–1.39. No significant change in risk was found in relation to greater frequency or prolonged duration of use, in the total sample or in either gender. Conclusions These data suggest that regular aspirin use may not be associated with lower risk of pancreatic cancer.
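    To illustrate the kind of computation behind the reported ORs and CIs (a sketch on made-up, toy-sized data, not the study's dataset; statsmodels is used here purely as an example of fitting an unconditional logistic regression):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # hypothetical case-control rows: 1 = pancreatic cancer case, 0 = control
        df = pd.DataFrame({
            "case":            [1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0],
            "regular_aspirin": [0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1],
            "age":             [62, 70, 65, 58, 74, 69, 61, 66, 72, 59, 68, 63],
        })

        X = sm.add_constant(df[["regular_aspirin", "age"]])
        fit = sm.Logit(df["case"], X).fit(disp=0)

        odds_ratios = np.exp(fit.params)            # adjusted ORs
        conf_int = np.exp(fit.conf_int())           # 95% CIs on the OR scale
        print(odds_ratios["regular_aspirin"], conf_int.loc["regular_aspirin"].values)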

  5. New applications of Boson's coherent states of double modes at regular product

    International Nuclear Information System (INIS)

    Zhang Yongde; Ren Yong

    1987-05-01

    This paper presents a series of new applications of boson's coherent states of double modes by means of the technique of regular products. They include non-coupled double-oscillator solutions with two time-dependent extra sources; coupled double-oscillator solutions with two time-dependent extra sources; some applications to regular momentum theory; and an explicit expression for the time-reversal operator. (author). 7 refs

  6. Self-calibration for lab-μCT using space-time regularized projection-based DVC and model reduction

    Science.gov (United States)

    Jailin, C.; Buljac, A.; Bouterf, A.; Poncelet, M.; Hild, F.; Roux, S.

    2018-02-01

    An online calibration procedure for x-ray lab-CT is developed using projection-based digital volume correlation. An initial reconstruction of the sample is positioned in the 3D space for every angle so that its projection matches the initial one. This procedure allows a space-time displacement field to be estimated for the scanned sample, which is regularized with (i) rigid body motions in space and (ii) modal time shape functions computed using model reduction techniques (i.e. proper generalized decomposition). The result is an accurate identification of the position of the sample adapted for each angle, which may deviate from the desired perfect rotation required for standard reconstructions. An application of this procedure to a 4D in situ mechanical test is shown. The proposed correction leads to a much improved tomographic reconstruction quality.

  7. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    Science.gov (United States)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to model formally a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending of what is true in a system.

  8. The geometry of continuum regularization

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1987-03-01

    This lecture is primarily an introduction to coordinate-invariant regularization, a recent advance in the continuum regularization program. In this context, the program is seen as fundamentally geometric, with all regularization contained in regularized DeWitt superstructures on field deformations

  9. Regular expression containment

    DEFF Research Database (Denmark)

    Henglein, Fritz; Nielsen, Lasse

    2011-01-01

    We present a new sound and complete axiomatization of regular expression containment. It consists of the conventional axiomatization of concatenation, alternation, empty set and (the singleton set containing) the empty string as an idempotent semiring, the fixed-point rule E* = 1 + E × E* for Kleene-star, and a general coinduction rule as the only additional rule. Our axiomatization gives rise to a natural computational interpretation of regular expressions as simple types that represent parse trees, and of containment proofs as coercions. This gives the axiomatization a Curry-Howard-style constructive interpretation: Containment proofs do not only certify a language-theoretic containment, but, under our computational interpretation, constructively transform a membership proof of a string in one regular expression into a membership proof of the same string in another regular expression. We...

  10. Supersymmetric dimensional regularization

    International Nuclear Information System (INIS)

    Siegel, W.; Townsend, P.K.; van Nieuwenhuizen, P.

    1980-01-01

    There is a simple modification of dimension regularization which preserves supersymmetry: dimensional reduction to real D < 4, followed by analytic continuation to complex D. In terms of component fields, this means fixing the ranges of all indices on the fields (and therefore the numbers of Fermi and Bose components). For superfields, it means continuing in the dimensionality of x-space while fixing the dimensionality of theta-space. This regularization procedure allows the simple manipulation of spinor derivatives in supergraph calculations. The resulting rules are: (1) First do all algebra exactly as in D = 4; (2) Then do the momentum integrals as in ordinary dimensional regularization. This regularization procedure needs extra rules before one can say that it is consistent. Such extra rules needed for superconformal anomalies are discussed. Problems associated with renormalizability and higher order loops are also discussed

  11. Rule-based learning of regular past tense in children with specific language impairment.

    Science.gov (United States)

    Smith-Lock, Karen M

    2015-01-01

    The treatment of children with specific language impairment was used as a means to investigate whether a single- or dual-mechanism theory best conceptualizes the acquisition of English past tense. The dual-mechanism theory proposes that regular English past-tense forms are produced via a rule-based process whereas past-tense forms of irregular verbs are stored in the lexicon. Single-mechanism theories propose that both regular and irregular past-tense verbs are stored in the lexicon. Five 5-year-olds with specific language impairment received treatment for regular past tense. The children were tested on regular past-tense production and third-person singular "s" twice before treatment and once after treatment, at eight-week intervals. Treatment consisted of one-hour play-based sessions, once weekly, for eight weeks. Crucially, treatment focused on different lexical items from those in the test. Each child demonstrated significant improvement on the untreated past-tense test items after treatment, but no improvement on the untreated third-person singular "s". Generalization to untreated past-tense verbs could not be attributed to a frequency effect or to phonological similarity of trained and tested items. It is argued that the results are consistent with a dual-mechanism theory of past-tense inflection.

  12. Association between regular physical exercise and depressive symptoms mediated through social support and resilience in Japanese company workers: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Eisho Yoshikawa

    2016-07-01

    Full Text Available Abstract Background Regular physical exercise has been reported to reduce depressive symptoms. Several lines of evidence suggest that physical exercise may prevent depression by promoting social support or resilience, which is the ability to adapt to challenging life conditions. The aim of this study was to compare depressive symptoms, social support, and resilience between Japanese company workers who engaged in regular physical exercise and workers who did not exercise regularly. We also investigated whether regular physical exercise has an indirect association with depressive symptoms through social support and resilience. Methods Participants were 715 Japanese employees at six worksites. Depressive symptoms were assessed with the Center for Epidemiologic Studies Depression (CES-D) scale, social support with the short version of the Social Support Questionnaire (SSQ), and resilience with the 14-item Resilience Scale (RS-14). A self-report questionnaire, which was extracted from the Japanese version of the Health-Promoting Lifestyle Profile, was used to assess whether participants engage in regular physical exercise, defined as more than 20 min, three or more times per week. The group differences in CES-D, SSQ, and RS-14 scores were investigated by using analysis of covariance (ANCOVA). Mediation analysis was conducted by using Preacher and Hayes’ bootstrap script to assess whether regular physical exercise is associated with depressive symptoms indirectly through resilience and social support. Results The SSQ Number score (F = 4.82, p = 0.03), SSQ Satisfaction score (F = 6.68, p = 0.01), and RS-14 score (F = 6.01, p = 0.01) were significantly higher in the group with regular physical exercise (n = 83) than in the group without regular physical exercise (n = 632) after adjusting for age, education, marital status, and job status. The difference in CES-D score was not significant (F = 2.90, p = 0

  13. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements.

    Science.gov (United States)

    Leander, Jacob; Lundh, Torbjörn; Jirstrand, Mats

    2014-05-01

    In this paper we consider the problem of estimating parameters in ordinary differential equations given discrete time experimental data. The impact of going from an ordinary to a stochastic differential equation setting is investigated as a tool to overcome the problem of local minima in the objective function. Using two different models, it is demonstrated that by allowing noise in the underlying model itself, the objective functions to be minimized in the parameter estimation procedures are regularized in the sense that the number of local minima is reduced and better convergence is achieved. The advantage of using stochastic differential equations is that the actual states in the model are predicted from data and this will allow the prediction to stay close to data even when the parameters in the model are incorrect. The extended Kalman filter is used as a state estimator and sensitivity equations are provided to give an accurate calculation of the gradient of the objective function. The method is illustrated using in silico data from the FitzHugh-Nagumo model for excitable media and the Lotka-Volterra predator-prey system. The proposed method performs well on the models considered, and is able to regularize the objective function in both models. This leads to parameter estimation problems with fewer local minima which can be solved by efficient gradient-based methods. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Timing matters: The processing of pitch relations

    Directory of Open Access Journals (Sweden)

    Annekathrin eWeise

    2014-06-01

    Full Text Available The human central auditory system can automatically extract abstract regularities from a variant auditory input. To this end, temporarily separated events need to be related. This study tested whether the timing between events, falling either within or outside the temporal window of integration (~350 ms, impacts the extraction of abstract feature relations. We utilized tone pairs for which tones within but not across pairs revealed a constant pitch relation (e.g. pitch of 2nd tone of a pair higher than pitch of 1st tone, while absolute pitch values varied across pairs. We measured the Mismatch Negativity (MMN; the brain’s error signal to auditory regularity violations to 2nd tones that rarely violated the pitch relation (e.g. pitch of 2nd tone lower. A Short condition in which tone duration (90 ms and stimulus onset asynchrony between the tones of a pair were short (110 ms was compared to two conditions, where this onset asynchrony was long (510 ms. In the Long Gap condition the tone durations were identical to Short (90 ms, but the silent interval was prolonged by 400 ms. In Long Tone the duration of the first tone was prolonged by 400 ms, while the silent interval was comparable to Short (20 ms. Results show a frontocentral MMN of comparable amplitude in all conditions. Thus, abstract pitch relations can be extracted even when the within-pair timing exceeds the integration period. Source analyses indicate MMN generators in the supratemporal cortex. Interestingly, they were located more anterior in Long Gap than in Short and Long Tone. Moreover, frontal generator activity was found for Long Gap and Long Tone. Thus, the way in which the system automatically registers irregular abstract pitch relations depends on the timing of the events to be linked. Pending that the current MMN data mirror established abstract rule representations coding the regular pitch relation, neural processes building these templates vary with timing.

  15. Timing matters: the processing of pitch relations

    Science.gov (United States)

    Weise, Annekathrin; Grimm, Sabine; Trujillo-Barreto, Nelson J.; Schröger, Erich

    2014-01-01

    The human central auditory system can automatically extract abstract regularities from a variant auditory input. To this end, temporarily separated events need to be related. This study tested whether the timing between events, falling either within or outside the temporal window of integration (~350 ms), impacts the extraction of abstract feature relations. We utilized tone pairs for which tones within but not across pairs revealed a constant pitch relation (e.g., pitch of second tone of a pair higher than pitch of first tone, while absolute pitch values varied across pairs). We measured the mismatch negativity (MMN; the brain’s error signal to auditory regularity violations) to second tones that rarely violated the pitch relation (e.g., pitch of second tone lower). A Short condition in which tone duration (90 ms) and stimulus onset asynchrony between the tones of a pair were short (110 ms) was compared to two conditions, where this onset asynchrony was long (510 ms). In the Long Gap condition, the tone durations were identical to Short (90 ms), but the silent interval was prolonged by 400 ms. In Long Tone, the duration of the first tone was prolonged by 400 ms, while the silent interval was comparable to Short (20 ms). Results show a frontocentral MMN of comparable amplitude in all conditions. Thus, abstract pitch relations can be extracted even when the within-pair timing exceeds the integration period. Source analyses indicate MMN generators in the supratemporal cortex. Interestingly, they were located more anterior in Long Gap than in Short and Long Tone. Moreover, frontal generator activity was found for Long Gap and Long Tone. Thus, the way in which the system automatically registers irregular abstract pitch relations depends on the timing of the events to be linked. Pending that the current MMN data mirror established abstract rule representations coding the regular pitch relation, neural processes building these templates vary with timing. PMID:24966823

  16. Regularization by External Variables

    DEFF Research Database (Denmark)

    Bossolini, Elena; Edwards, R.; Glendinning, P. A.

    2016-01-01

    Regularization was a big topic at the 2016 CRM Intensive Research Program on Advances in Nonsmooth Dynamics. There are many open questions concerning well known kinds of regularization (e.g., by smoothing or hysteresis). Here, we propose a framework for an alternative and important kind of regularization...

  17. Regular Single Valued Neutrosophic Hypergraphs

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam Malik

    2016-12-01

    Full Text Available In this paper, we define the regular and totally regular single valued neutrosophic hypergraphs, and discuss the order and size along with properties of regular and totally regular single valued neutrosophic hypergraphs. We also extend work on completeness of single valued neutrosophic hypergraphs.

  18. Use of regularized algebraic methods in tomographic reconstruction

    International Nuclear Information System (INIS)

    Koulibaly, P.M.; Darcourt, J.; Blanc-Ferraud, L.; Migneco, O.; Barlaud, M.

    1997-01-01

    The algebraic methods are used in emission tomography to facilitate the compensation of attenuation and of Compton scattering. We have tested on a phantom the use of a regularization (a priori introduction of information), as well as the taking into account of spatial resolution variation with the depth (SRVD). Hence, we have compared the performances of the two methods by back-projection filtering (BPF) and of the two algebraic methods (AM) in terms of FWHM (by means of a point source), of the reduction of background noise (σ/m) on the homogeneous part of Jaszczak's phantom and of reconstruction speed (time unit = BPF). The BPF methods make use of a grade filter (maximal resolution, no noise treatment), single or associated with a Hann's low-pass (f c = 0.4), as well as of an attenuation correction. The AM which embody attenuation and scattering corrections are, on one side, the OS EM (Ordered Subsets, partitioning and rearranging of the projection matrix; Expectation Maximization) without regularization or SRVD correction, and, on the other side, the OS MAP EM (Maximum a posteriori), regularized and embodying the SRVD correction. A table is given containing for each used method (grade, Hann, OS EM and OS MAP EM) the values of FWHM, σ/m and time, respectively. One can observe that the OS MAP EM algebraic method allows ameliorating both the resolution, by taking into account the SRVD in the reconstruction process and noise treatment by regularization. In addition, due to the OS technique the reconstruction times are acceptable

  19. Regular Benzodiazepine and Z-Substance Use and Risk of Dementia: An Analysis of German Claims Data.

    Science.gov (United States)

    Gomm, Willy; von Holt, Klaus; Thomé, Friederike; Broich, Karl; Maier, Wolfgang; Weckbecker, Klaus; Fink, Anne; Doblhammer, Gabriele; Haenisch, Britta

    2016-09-06

    While acute detrimental effects of benzodiazepine (BDZ), and BDZ and related z-substance (BDZR) use on cognition and memory are known, the association of BDZR use and risk of dementia in the elderly is controversially discussed. Previous studies on cohort or claims data mostly show an increased risk for dementia with the use of BDZs or BDZRs. For Germany, analyses on large population-based data sets are missing. To evaluate the association between regular BDZR use and incident any dementia in a large German claims data set. Using longitudinal German public health insurance data from 2004 to 2011 we analyzed the association between regular BDZR use (versus no BDZR use) and incident dementia in a case-control design. We examined patient samples aged≥60 years that were free of dementia at baseline. To address potential protopathic bias we introduced a lag time between BDZR prescription and dementia diagnosis. Odds ratios were calculated applying conditional logistic regression, adjusted for potential confounding factors such as comorbidities and polypharmacy. The regular use of BDZRs was associated with a significant increased risk of incident dementia for patients aged≥60 years (adjusted odds ratio [OR] 1.21, 95% confidence interval [CI] 1.13-1.29). The association was slightly stronger for long-acting substances than for short-acting ones. A trend for increased risk for dementia with higher exposure was observed. The restricted use of BDZRs may contribute to dementia prevention in the elderly.

  20. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

    The objectives of this study are the development of a methodology for assessing the optimal Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, modeling and results produced by domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system demonstrates the feasibility of this method.

  1. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul Nationl Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objectives of this study are the development of a methodology for assessing the optimal Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using PSA methods that can supplement the current deterministic methods, and the improvement of the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, modeling and results produced by domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and the application of the new methodology to an example system demonstrates the feasibility of this method.

  2. Regularized forecasting of chaotic dynamical systems

    International Nuclear Information System (INIS)

    Bollt, Erik M.

    2017-01-01

    While local models of dynamical systems have been highly successful in terms of using extensive data sets observing even a chaotic dynamical system to produce useful forecasts, there is a typical problem as follows. Specifically, with the k-nearest neighbors (kNN) method, local observations occur due to recurrences in a chaotic system, and this allows for local models to be built by regression to low dimensional polynomial approximations of the underlying system estimating a Taylor series. This has been a popular approach, particularly in the context of scalar data observations which have been represented by time-delay embedding methods. However, such local models can generally allow for spatial discontinuities of forecasts when considered globally, meaning jumps in predictions because the collected near neighbors vary from point to point. The source of these discontinuities is generally that the set of near neighbors varies discontinuously with respect to the position of the sample point, and so therefore does the model built from the near neighbors. It is possible to utilize local information inferred from near neighbors as usual but at the same time to impose a degree of regularity on a global scale. We present here a new global perspective extending the general local modeling concept. In so doing, we proceed to show how this perspective allows us to impose prior presumed regularity into the model, by invoking Tikhonov regularity theory, since this classic perspective of optimization in ill-posed problems naturally balances fitting an objective with some prior assumed form of the result, such as continuity or derivative regularity for example. This all reduces to matrix manipulations which we demonstrate on a simple data set, with the implication that it may find much broader context.
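    A minimal sketch of the general idea of regularized local forecasting (nearest-neighbour local linear models with a ridge/Tikhonov penalty; this is a generic illustration, not the paper's specific global formulation, and the names and the logistic-map test series are invented here):

        import numpy as np

        def knn_ridge_forecast(X, y, query, k=10, lam=1e-2):
            # Local model: take the k nearest delay vectors to `query`, fit an affine
            # map to their successors with a ridge (Tikhonov) penalty, apply it to `query`.
            idx = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
            Xk = np.hstack([X[idx], np.ones((k, 1))])
            w = np.linalg.solve(Xk.T @ Xk + lam * np.eye(Xk.shape[1]), Xk.T @ y[idx])
            return np.append(query, 1.0) @ w

        # delay-embed a chaotic logistic-map series and forecast one step ahead
        r, n, d = 3.9, 2000, 3
        s = np.empty(n); s[0] = 0.4
        for t in range(n - 1):
            s[t + 1] = r * s[t] * (1 - s[t])
        X = np.array([s[t:t + d] for t in range(n - d - 1)])   # delay vectors
        y = s[d:n - 1]                                         # their successors
        print(knn_ridge_forecast(X, y, query=s[-d:]), r * s[-1] * (1 - s[-1]))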

  3. The Temporal Dynamics of Regularity Extraction in Non-Human Primates

    Science.gov (United States)

    Minier, Laure; Fagot, Joël; Rey, Arnaud

    2016-01-01

    Extracting the regularities of our environment is one of our core cognitive abilities. To study the fine-grained dynamics of the extraction of embedded regularities, a method combining the advantages of the artificial language paradigm (Saffran, Aslin, & Newport, 1996) and the serial response time task (Nissen & Bullemer,…

  4. Multiple-step fault estimation for interval type-II T-S fuzzy system of hypersonic vehicle with time-varying elevator faults

    Directory of Open Access Journals (Sweden)

    Jin Wang

    2017-03-01

    Full Text Available This article proposes a multiple-step fault estimation algorithm for hypersonic flight vehicles that uses an interval type-II Takagi–Sugeno fuzzy model. First, an interval type-II Takagi–Sugeno fuzzy model is developed to approximate the nonlinear dynamic system and handle the parameter uncertainties of the hypersonic vehicle. Then, a multiple-step time-varying additive fault estimation algorithm is designed to estimate the time-varying additive elevator fault of hypersonic flight vehicles. Finally, simulations are conducted for both the modeling and the fault estimation; the validity and effectiveness of the method are verified by a series of comparisons of numerical simulation results.

  5. Assessment of time interval between tramadol intake and seizure and second drug-induced attack

    Directory of Open Access Journals (Sweden)

    Bahareh Abbasi

    2015-11-01

    Full Text Available Background: Tramadol is a synthetic drug which is prescribed for moderate and severe pain. Tramadol overdose can induce severe complications such as consciousness impairment and convulsions. This study was done to determine the incidence of convulsions after tramadol use, up to one week after hospital discharge. Methods: This prospective study was done in tramadol overdose patients without a history of uncontrolled epilepsy or head injury. All cases admitted to Loghman and Rasol Akram Hospitals, Tehran, Iran, from 1 April 2011 to 1 April 2012 were included and observed for at least 12 hours. The time interval between tramadol intake and the first seizure was recorded. Then, patients with a second drug-induced seizure were identified and the lag time between the first and second seizure was analyzed. The patients were transferred to the intensive care unit (ICU) if clinical worsening was observed. One week after hospital discharge, telephone follow-up was conducted. Results: A total of 150 patients with a history of tramadol induced seizures (141 men, 9 women, age: 23.23±5.94 years) were enrolled in this study. Convulsion was seen in 104 patients (69.3%). In 8 out of 104 patients (7.6%), two or more convulsions were seen. The time intervals between tramadol use and the onset of the first and second seizure were 0.93±0.17 and 2.5±0.75 hours, respectively. Tramadol induced seizures are more likely to occur in males and patients with a history of drug abuse. Finally, one hundred forty nine patients (99.3%) were discharged in good condition and only one patient died from tramadol overdose. Conclusion: The results of the study showed that tramadol induced seizures most frequently occurred within the first 4 hours of tramadol intake. The chance of experiencing a second seizure exists in the susceptible population. Thus, 4 hours after drug intake is the best time for patients to be discharged from hospital.

  6. Delay-Dependent Stability Criterion for Bidirectional Associative Memory Neural Networks with Interval Time-Varying Delays

    Science.gov (United States)

    Park, Ju H.; Kwon, O. M.

    In the letter, the global asymptotic stability of bidirectional associative memory (BAM) neural networks with delays is investigated. The delay is assumed to be time-varying and belongs to a given interval. A novel stability criterion for the stability is presented based on the Lyapunov method. The criterion is represented in terms of linear matrix inequality (LMI), which can be solved easily by various optimization algorithms. Two numerical examples are illustrated to show the effectiveness of our new result.
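    The criterion itself is specific to the BAM model in the letter, but the flavour of an LMI feasibility check is easy to show generically (a sketch using cvxpy in place of the MATLAB-style LMI solvers mentioned in an earlier record; the Lyapunov-type LMI and the matrix below are illustrative, not the BAM criterion):

        import cvxpy as cp
        import numpy as np

        # generic illustration: find P > 0 with A'P + PA < 0 (a Lyapunov LMI),
        # the same kind of feasibility problem delay-dependent criteria reduce to
        A = np.array([[-2.0, 1.0],
                      [0.0, -3.0]])
        n, eps = A.shape[0], 1e-6

        P = cp.Variable((n, n), symmetric=True)
        constraints = [P >> eps * np.eye(n),
                       A.T @ P + P @ A << -eps * np.eye(n)]
        prob = cp.Problem(cp.Minimize(0), constraints)
        prob.solve()
        print(prob.status, np.round(P.value, 3))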

  7. On a correspondence between regular and non-regular operator monotone functions

    DEFF Research Database (Denmark)

    Gibilisco, P.; Hansen, Frank; Isola, T.

    2009-01-01

    We prove the existence of a bijection between the regular and the non-regular operator monotone functions satisfying a certain functional equation. As an application we give a new proof of the operator monotonicity of certain functions related to the Wigner-Yanase-Dyson skew information....

  8. Optimal Data Interval for Estimating Advertising Response

    OpenAIRE

    Gerard J. Tellis; Philip Hans Franses

    2006-01-01

    The abundance of highly disaggregate data (e.g., at five-second intervals) raises the question of the optimal data interval to estimate advertising carryover. The literature assumes that (1) the optimal data interval is the interpurchase time, (2) too disaggregate data causes a disaggregation bias, and (3) recovery of true parameters requires assumption of the underlying advertising process. In contrast, we show that (1) the optimal data interval is what we call , (2) too disaggregate data do...

  9. The Influence of Pretreatment Characteristics and Radiotherapy Parameters on Time Interval to Development of Radiation-Associated Meningioma

    International Nuclear Information System (INIS)

    Paulino, Arnold C.; Ahmed, Irfan M.; Mai, Wei Y.; Teh, Bin S.

    2009-01-01

    Purpose: To identify pretreatment characteristics and radiotherapy parameters which may influence time interval to development of radiation-associated meningioma (RAM). Methods and Materials: A Medline/PUBMED search of articles dealing with RAM yielded 66 studies between 1981 and 2006. Factors analyzed included patient age and gender, type of initial tumor treated, radiotherapy (RT) dose and volume, and time interval from RT to development of RAM. Results: A total of 143 patients with a median age at RT of 12 years form the basis of this report. The most common initial tumors or conditions treated with RT were medulloblastoma (n = 27), pituitary adenoma (n = 20), acute lymphoblastic leukemia (n = 20), low-grade astrocytoma (n = 19), and tinea capitis (n = 14). In the 116 patients whose RT fields were known, 55 (47.4%) had a portion of the brain treated, whereas 32 (27.6%) and 29 (25.0%) had craniospinal and whole-brain fields. The median time from RT to develop a RAM or latent time (LT) was 19 years (range, 1-63 years). Male gender (p = 0.001), initial diagnosis of leukemia (p = 0.001), and use of whole brain or craniospinal field (p ≤ 0.0001) were associated with a shorter LT, whereas patients who received lower doses of RT had a longer LT (p < 0.0001). Conclusions: The latent time to develop a RAM was related to gender, initial tumor type, radiotherapy volume, and radiotherapy dose.

  10. Effect of the time interval between fusion and activation on epigenetic reprogramming and development of bovine somatic cell nuclear transfer embryos.

    Science.gov (United States)

    Liu, Jun; Wang, Yongsheng; Su, Jianmin; Wang, Lijun; Li, Ruizhe; Li, Qian; Wu, Yongyan; Hua, Song; Quan, Fusheng; Guo, Zekun; Zhang, Yong

    2013-04-01

    Previous studies have shown that the time interval between fusion and activation (FA interval) play an important role in nuclear remodeling and in vitro development of somatic cell nuclear transfer (SCNT) embryos. However, the effects of FA interval on the epigenetic reprogramming and in vivo developmental competence of SCNT embryos remain unknown. In the present study, the effects of different FA intervals (0 h, 2 h, and 4 h) on the epigenetic reprogramming and developmental competence of bovine SCNT embryos were assessed. The results demonstrated that H3 lysine 9 (H3K9ac) levels decreased rapidly after fusion in all three groups. H3K9ac was practically undetectable 2 h after fusion in the 2-h and 4-h FA interval groups. However, H3K9ac was still evidently detectable in the 0-h FA interval group. The H3K9ac levels increased 10 h after fusion in all three groups, but were higher in the 2-h and 4-h FA interval groups than that in the 0-h FA interval group. The methylation levels of the satellite I region in day-7 blastocysts derived from the 2-h or 4-h FA interval groups was similar to that of in vitro fertilization blastocysts and is significantly lower than that of the 0-h FA interval group. SCNT embryos derived from 2-h FA interval group showed higher developmental competence than those from the 0-h and 4-h FA interval groups in terms of cleavage rate, blastocyst formation rate, apoptosis index, and pregnancy and calving rates. Hence, the FA interval is an important factor influencing the epigenetic reprogramming and developmental competence of bovine SCNT embryos.

  11. The neural substrates of impaired finger tapping regularity after stroke.

    Science.gov (United States)

    Calautti, Cinzia; Jones, P Simon; Guincestre, Jean-Yves; Naccarato, Marcello; Sharma, Nikhil; Day, Diana J; Carpenter, T Adrian; Warburton, Elizabeth A; Baron, Jean-Claude

    2010-03-01

    Not only finger tapping speed, but also tapping regularity can be impaired after stroke, contributing to reduced dexterity. The neural substrates of impaired tapping regularity after stroke are unknown. Previous work suggests damage to the dorsal premotor cortex (PMd) and prefrontal cortex (PFCx) affects externally-cued hand movement. We tested the hypothesis that these two areas are involved in impaired post-stroke tapping regularity. In 19 right-handed patients (15 men/4 women; age 45-80 years; purely subcortical in 16) partially to fully recovered from hemiparetic stroke, tri-axial accelerometric quantitative assessment of tapping regularity and BOLD fMRI were obtained during fixed-rate auditory-cued index-thumb tapping, in a single session 10-230 days after stroke. A strong random-effect correlation between tapping regularity index and fMRI signal was found in contralesional PMd such that the worse the regularity the stronger the activation. A significant correlation in the opposite direction was also present within contralesional PFCx. Both correlations were maintained if maximal index tapping speed, degree of paresis and time since stroke were added as potential confounds. Thus, the contralesional PMd and PFCx appear to be involved in the impaired ability of stroke patients to fingertap in pace with external cues. The findings for PMd are consistent with repetitive TMS investigations in stroke suggesting a role for this area in affected-hand movement timing. The inverse relationship with tapping regularity observed for the PFCx and the PMd suggests these two anatomically-connected areas negatively co-operate. These findings have implications for understanding the disruption and reorganization of the motor systems after stroke. Copyright (c) 2009 Elsevier Inc. All rights reserved.

  12. Stochastic analytic regularization

    International Nuclear Information System (INIS)

    Alfaro, J.

    1984-07-01

    Stochastic regularization is reexamined, pointing out a restriction on its use due to a new type of divergence which is not present in the unregulated theory. Furthermore, we introduce a new form of stochastic regularization which permits the use of a minimal subtraction scheme to define the renormalized Green functions. (author)

  13. Adaptive Changes After 2 Weeks of 10-s Sprint Interval Training With Various Recovery Times

    Directory of Open Access Journals (Sweden)

    Robert A. Olek

    2018-04-01

    Full Text Available Purpose: The aim of this study was to compare the effect of applying two different rest recovery times in a 10-s sprint interval training session on aerobic and anaerobic capacities as well as skeletal muscle enzyme activities. Methods: Fourteen physically active but not highly trained male subjects (mean maximal oxygen uptake 50.5 ± 1.0 mlO2·kg−1·min−1) participated in the study. The training protocol involved a series of 10-s sprints separated by either 1 min (SIT10:1) or 4 min (SIT10:4) of recovery. The number of sprints progressed from four to six over six sessions separated by 1–2 days of rest. Pre- and post-intervention anthropometric measurements, assessment of aerobic and anaerobic capacity, and muscle biopsy were performed. In the muscle samples, maximal activities of citrate synthase (CS), 3-hydroxyacyl-CoA dehydrogenase (HADH), carnitine palmitoyl-transferase (CPT), malate dehydrogenase (MDH) and its mitochondrial form (mMDH), as well as lactate dehydrogenase (LDH), were determined. Analysis of variance was performed to determine changes between conditions. Results: Maximal oxygen uptake improved significantly in both training groups, by 13.6% in SIT10:1 and 11.9% in SIT10:4, with no difference between groups. Wingate anaerobic test results indicated a main effect of time for total work, peak power output and mean power output, which increased significantly and similarly in both groups. Significant differences between training groups were observed for end power output, which increased by 10.8% in SIT10:1 but remained unchanged in SIT10:4. Both training protocols induced a similar increase in CS activity (main effect of time, p < 0.05), but not in the other enzymes. Conclusion: Sprint interval training protocols induce metabolic adaptation over a short period of time, and the reduced recovery between bouts may attenuate fatigue during maximal exercise.

  14. Semisupervised Support Vector Machines With Tangent Space Intrinsic Manifold Regularization.

    Science.gov (United States)

    Sun, Shiliang; Xie, Xijiong

    2016-09-01

    Semisupervised learning has been an active research topic in machine learning and data mining. One main reason is that labeling examples is expensive and time-consuming, while there are large numbers of unlabeled examples available in many practical problems. So far, Laplacian regularization has been widely used in semisupervised learning. In this paper, we propose a new regularization method called tangent space intrinsic manifold regularization. It is intrinsic to data manifold and favors linear functions on the manifold. Fundamental elements involved in the formulation of the regularization are local tangent space representations, which are estimated by local principal component analysis, and the connections that relate adjacent tangent spaces. Simultaneously, we explore its application to semisupervised classification and propose two new learning algorithms called tangent space intrinsic manifold regularized support vector machines (TiSVMs) and tangent space intrinsic manifold regularized twin SVMs (TiTSVMs). They effectively integrate the tangent space intrinsic manifold regularization consideration. The optimization of TiSVMs can be solved by a standard quadratic programming, while the optimization of TiTSVMs can be solved by a pair of standard quadratic programmings. The experimental results of semisupervised classification problems show the effectiveness of the proposed semisupervised learning algorithms.

  15. Time interval between maternal electrocardiogram and venous Doppler waves in normal pregnancy and preeclampsia: a pilot study.

    Science.gov (United States)

    Tomsin, K; Mesens, T; Molenberghs, G; Peeters, L; Gyselaers, W

    2012-12-01

    To evaluate the time interval between maternal electrocardiogram (ECG) and venous Doppler waves at different stages of uncomplicated pregnancy (UP) and in preeclampsia (PE). Cross-sectional pilot study in 40 uncomplicated singleton pregnancies, categorized into four groups of ten according to gestational age: 10-14 weeks (UP1), 18-23 weeks (UP2), 28-33 weeks (UP3) and ≥ 37 weeks (UP4) of gestation. A fifth group of ten women with PE was also included. A Doppler flow examination at the level of the renal interlobar veins (RIV) and hepatic veins (HV) was performed according to a standard protocol, in association with a maternal ECG. The time interval between the ECG P-wave and the corresponding A-deflection of the venous Doppler waves was measured (PA), expressed relative to the duration of the cardiac cycle (RR), and labeled PA/RR. In the hepatic veins, the PA/RR is longer in UP4 than in UP1 (0.48 ± 0.15 versus 0.29 ± 0.09, p ≤ 0.001). When all UP groups were compared, the PA/RR increased gradually with gestational age. In PE, the HV PA/RR is shorter than in UP3 (0.25 ± 0.09 versus 0.42 ± 0.14). The longer time intervals at advanced gestational stages are consistent with known features of maternal cardiovascular adaptation, whereas the shorter values in preeclampsia are consistent with maternal cardiovascular maladaptation mechanisms. Our pilot study invites more research into the relevance of the time interval between maternal ECG and venous Doppler waves as a new parameter for studying the gestational cardiovascular (patho)physiology of the maternal venous compartment by duplex sonography. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Solution path for manifold regularized semisupervised classification.

    Science.gov (United States)

    Wang, Gang; Wang, Fei; Chen, Tao; Yeung, Dit-Yan; Lochovsky, Frederick H

    2012-04-01

    Traditional learning algorithms use only labeled data for training. However, labeled examples are often difficult or time consuming to obtain since they require substantial human labeling efforts. On the other hand, unlabeled data are often relatively easy to collect. Semisupervised learning addresses this problem by using large quantities of unlabeled data with labeled data to build better learning algorithms. In this paper, we use the manifold regularization approach to formulate the semisupervised learning problem where a regularization framework which balances a tradeoff between loss and penalty is established. We investigate different implementations of the loss function and identify the methods which have the least computational expense. The regularization hyperparameter, which determines the balance between loss and penalty, is crucial to model selection. Accordingly, we derive an algorithm that can fit the entire path of solutions for every value of the hyperparameter. Its computational complexity after preprocessing is quadratic only in the number of labeled examples rather than the total number of labeled and unlabeled examples.
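
    As a rough illustration of the manifold (Laplacian) regularization idea described above, and not of the paper's solution-path algorithm, the following Python sketch fits a linear Laplacian-regularized least-squares predictor: the criterion combines a squared loss on the labeled points, an ordinary ridge penalty, and a graph-Laplacian penalty built from labeled and unlabeled points together. The function name, the k-nearest-neighbour graph construction and the hyperparameter values are illustrative assumptions.

        import numpy as np

        def laplacian_rls(X_lab, y_lab, X_unlab, gamma_a=1e-2, gamma_i=1e-1, k=5):
            """Manifold-regularized (Laplacian) least squares, linear sketch.

            Minimizes  sum_i (y_i - w.x_i)^2 + gamma_a ||w||^2 + gamma_i f^T L f
            with f = X w, where L is the graph Laplacian of a k-NN graph built
            on labeled + unlabeled data."""
            X = np.vstack([X_lab, X_unlab])
            n, d = X.shape
            l = X_lab.shape[0]

            # k-NN adjacency (symmetrized 0/1 weights) and graph Laplacian
            D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            np.fill_diagonal(D2, np.inf)
            W = np.zeros((n, n))
            for i in range(n):
                W[i, np.argsort(D2[i])[:k]] = 1.0
            W = np.maximum(W, W.T)
            L = np.diag(W.sum(1)) - W

            # Closed-form solution of the regularized least-squares problem
            J = np.zeros((n, n)); J[:l, :l] = np.eye(l)   # selects labeled points
            y = np.concatenate([y_lab, np.zeros(n - l)])
            A = X.T @ J @ X + gamma_a * np.eye(d) + gamma_i * X.T @ L @ X
            return np.linalg.solve(A, X.T @ (J @ y))

    In this simplified form the regularization hyperparameters are fixed; the paper's contribution is precisely an algorithm that traces the solution over all values of the hyperparameter instead of refitting at each value.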

  17. Socioeconomic position and the primary care interval

    DEFF Research Database (Denmark)

    Vedsted, Anders

    2018-01-01

    Introduction. Diagnostic delays affect cancer survival negatively. Thus, the time interval from symptomatic presentation to a GP until referral to secondary care (i.e. the primary care interval (PCI)) should be as short as possible. Lower socioeconomic position seems associated with poorer cancer ... to the easiness to interpret the symptoms of the underlying cancer. Methods. We conducted a population-based cohort study using survey data on time intervals linked at an individual level to routinely collected data on demographics from Danish registries. Using logistic regression we estimated the odds ... Patients younger than 45 years of age and older than 54 years of age had a longer primary care interval than patients aged 45-54 years. No other associations for SEP characteristics were observed. The findings may imply that GPs are referring patients regardless of SEP, although some room for improvement prevails ...

  18. Initial Systolic Time Interval (ISTI) as a Predictor of Intradialytic Hypotension (IDH)

    International Nuclear Information System (INIS)

    Biesheuvel, J D; Verdaasdonk, R M; Meijer, JH; Vervloet, M G

    2013-01-01

    In haemodialysis treatment the clearance and volume control by the kidneys of a patient are partially replaced by intermittent haemodialysis. Because this artificial process is performed on a limited time scale, unphysiological imbalances in the fluid compartments of the body occur, that can lead to intradialytic hypotensions (IDH). An IDH endangers the efficacy of the haemodialysis session and is associated with dismal clinical endpoints, including mortality. A diagnostic method that predicts the occurrence of these drops in blood pressure could facilitate timely measures for the prevention of IDH. The present study investigates whether the Initial Systolic Time Interval (ISTI) can provide such a diagnostic method. The ISTI is defined as the time difference between the R-peak in the electrocardiogram (ECG) and the C-wave in the impedance cardiogram (ICG) and is considered to be a non-invasive assessment of the time delay between the electrical and mechanical activity of the heart. This time delay has previously been found to depend on autonomic nervous function as well as preload of the heart. Therefore, it can be expected that ISTI may predict an imminent IDH caused by a low circulating blood volume. This ongoing observational clinical study investigates the relationship between changes in ISTI and subsequent drops in blood pressure during haemodialysis. A registration of a complicated dialysis showed a significant correlation between a drop in blood pressure, a decrease in relative blood volume and a substantial increase in ISTI. An uncomplicated dialysis, in which also a considerable amount of fluid was removed, showed no correlations. Both, blood pressure and ISTI remained stable. In conclusion, the preliminary results of the present study show a substantial response of ISTI to haemodynamic instability, indicating an application in optimization and individualisation of the dialysis process.

  19. Decoupling of modeling and measuring interval in groundwater time series analysis based on response characteristics

    NARCIS (Netherlands)

    Berendrecht, W.L.; Heemink, A.W.; Geer, F.C. van; Gehrels, J.C.

    2003-01-01

    A state-space representation of the transfer function-noise (TFN) model allows the choice of a modeling (input) interval that is smaller than the measuring interval of the output variable. Since in geohydrological applications the interval of the available input series (precipitation excess) is

  20. Explicit isospectral flows associated to the AKNS operator on the unit interval. II

    Science.gov (United States)

    Amour, Laurent

    2012-10-01

    Explicit flows associated to any tangent vector fields on any isospectral manifold for the AKNS operator acting in L2 × L2 on the unit interval are written down. The manifolds are of infinite dimension (and infinite codimension). The flows are called isospectral and also are Hamiltonian flows. It is proven that they may be explicitly expressed in terms of regularized determinants of infinite matrix-valued functions with entries depending only on the spectral data at the starting point of the flow. The tangent vector fields are decomposed as ∑ξkTk where ξ ∈ ℓ2 and the Tk ∈ L2 × L2 form a particular basis of the tangent vector spaces of the infinite dimensional manifold. The paper here is a continuation of Amour ["Explicit isospectral flows for the AKNS operator on the unit interval," Inverse Probl. 25, 095008 (2009)], 10.1088/0266-5611/25/9/095008 where, except for a finite number, all the components of the sequence ξ are zero in order to obtain an explicit expression for the isospectral flows. The regularized determinants induce counter-terms allowing for the consideration of finite quantities when the sequences ξ run all over ℓ2.

  1. Effective field theory dimensional regularization

    International Nuclear Information System (INIS)

    Lehmann, Dirk; Prezeau, Gary

    2002-01-01

    A Lorentz-covariant regularization scheme for effective field theories with an arbitrary number of propagating heavy and light particles is given. This regularization scheme leaves the low-energy analytic structure of Green's functions intact and preserves all the symmetries of the underlying Lagrangian. The power divergences of regularized loop integrals are controlled by the low-energy kinematic variables. Simple diagrammatic rules are derived for the regularization of arbitrary one-loop graphs and the generalization to higher loops is discussed.

  2. Effective field theory dimensional regularization

    Science.gov (United States)

    Lehmann, Dirk; Prézeau, Gary

    2002-01-01

    A Lorentz-covariant regularization scheme for effective field theories with an arbitrary number of propagating heavy and light particles is given. This regularization scheme leaves the low-energy analytic structure of Green's functions intact and preserves all the symmetries of the underlying Lagrangian. The power divergences of regularized loop integrals are controlled by the low-energy kinematic variables. Simple diagrammatic rules are derived for the regularization of arbitrary one-loop graphs and the generalization to higher loops is discussed.

  3. Affective Responses to Repeated Sessions of High-Intensity Interval Training.

    Science.gov (United States)

    Saanijoki, Tiina; Nummenmaa, Lauri; Eskelinen, Jari-Joonas; Savolainen, Anna M; Vahlberg, Tero; Kalliokoski, Kari K; Hannukainen, Jarna C

    2015-12-01

    Vigorous exercise feels unpleasant, and negative emotions may discourage adherence to regular exercise. We quantified the subjective affective responses to short-term high-intensity interval training (HIT) in comparison with moderate-intensity continuous training (MIT). Twenty-six healthy middle-aged (mean age, 47 ± 5 yr; mean VO2peak, 34.2 ± 4.1 mL·kg⁻¹·min⁻¹) sedentary men were randomized into HIT (n = 13, 4-6 × 30 s of all-out cycling efforts at approximately 180% of peak workload with 4-min recovery) or MIT (n = 13, 40- to 60-min continuous cycling at 60% of peak workload) groups, performing six sessions within two weeks. Perceived exertion, stress, and affective state were recorded before, during, and after each session. Perceived exertion and arousal were higher, and affective state more negative, during the HIT sessions than during the MIT sessions. Peak oxygen consumption increased in both groups after training. Short-term HIT and MIT are equally effective in improving aerobic fitness, but HIT increases the experience of negative emotions and exertion in sedentary middle-aged men. This may limit adherence to this time-effective training mode, even though displeasure lessens over time and suggests similar mental adaptations to both MIT and HIT.

  4. Effect of the sub-threshold periodic current forcing on the regularity and the synchronization of neuronal spiking activity

    International Nuclear Information System (INIS)

    Ozer, Mahmut; Uzuntarla, Muhammet; Agaoglu, Sukriye Nihal

    2006-01-01

    We first investigate the amplitude effect of the subthreshold periodic forcing on the regularity of the spiking events by using the coefficient of variation of interspike intervals. We show that the resonance effect in the coefficient of variation, which is dependent on the driving frequency for larger membrane patch sizes, disappears when the amplitude of the subthreshold forcing is decreased. Then, we demonstrate that the timings of the spiking events of a noisy and periodically driven neuron concentrate on a specific phase of the stimulus. We also show that increasing the intensity of the noise causes the phase probability density of the spiking events to get smaller values, and eliminates differences in the phase locking behavior of the neuron for different patch sizes
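
    The regularity measure used in this abstract, the coefficient of variation (CV) of interspike intervals, is straightforward to compute from recorded spike times; a minimal sketch with made-up spike times is shown below.

        import numpy as np

        def isi_cv(spike_times):
            """Coefficient of variation of interspike intervals (ISI).

            CV = std(ISI) / mean(ISI); CV -> 0 for a perfectly regular train
            and CV ~ 1 for a Poisson-like (irregular) train."""
            isi = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
            return isi.std() / isi.mean()

        # Example: a jittered 10 Hz spike train (hypothetical data)
        rng = np.random.default_rng(0)
        spikes = np.arange(0, 2.0, 0.1) + rng.normal(0, 0.005, 20)
        print(round(isi_cv(spikes), 3))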

  5. Conversion rate of laparoscopic cholecystectomy after endoscopic retrograde cholangiography in the treatment of choledocholithiasis - Does the time interval matter?

    NARCIS (Netherlands)

    de Vries, A.; Donkervoort, S. C.; van Geloven, A. A. W.; Pierik, E. G. J. M.

    2005-01-01

    Background: Preceding endoscopic retrograde cholangiography (ERC) in patients with choledochocystolithiasis impedes laparoscopic cholecystectomy (LC) and increases risk of conversion. We studied the influence of time interval between ERC and LC on the course of LC. Methods: All patients treated for

  6. Hierarchical regular small-world networks

    International Nuclear Information System (INIS)

    Boettcher, Stefan; Goncalves, Bruno; Guclu, Hasan

    2008-01-01

    Two new networks are introduced that resemble small-world properties. These networks are recursively constructed but retain a fixed, regular degree. They possess a unique one-dimensional lattice backbone overlaid by a hierarchical sequence of long-distance links, mixing real-space and small-world features. Both networks, one 3-regular and the other 4-regular, lead to distinct behaviors, as revealed by renormalization group studies. The 3-regular network is planar, has a diameter growing as √N with system size N, and leads to super-diffusion with an exact, anomalous exponent dw = 1.306..., but possesses only a trivial fixed point Tc = 0 for the Ising ferromagnet. In turn, the 4-regular network is non-planar, has a diameter growing as ∼2√(log₂N²), exhibits 'ballistic' diffusion (dw = 1), and a non-trivial ferromagnetic transition, Tc > 0. It suggests that the 3-regular network is still quite 'geometric', while the 4-regular network qualifies as a true small world with mean-field properties. As an engineering application we discuss synchronization of processors on these networks. (fast track communication)

  7. General inverse problems for regular variation

    DEFF Research Database (Denmark)

    Damek, Ewa; Mikosch, Thomas Valentin; Rosinski, Jan

    2014-01-01

    Regular variation of distributional tails is known to be preserved by various linear transformations of some random structures. An inverse problem for regular variation aims at understanding whether the regular variation of a transformed random object is caused by regular variation of components ...

  8. A prospective study of concussions among National Hockey League players during regular season games: the NHL-NHLPA Concussion Program.

    Science.gov (United States)

    Benson, Brian W; Meeuwisse, Willem H; Rizos, John; Kang, Jian; Burke, Charles J

    2011-05-17

    In 1997, the National Hockey League (NHL) and NHL Players' Association (NHLPA) launched a concussion program to improve the understanding of this injury. We explored initial postconcussion signs, symptoms, physical examination findings and time loss (i.e., time between the injury and medical clearance by the physician to return to competitive play), experienced by male professional ice-hockey players, and assessed the utility of initial postconcussion clinical manifestations in predicting time loss among hockey players. We conducted a prospective case series of concussions over seven NHL regular seasons (1997-2004) using an inclusive cohort of players. The primary outcome was concussion and the secondary outcome was time loss. NHL team physicians documented post-concussion clinical manifestations and recorded the date when a player was medically cleared to return to play. Team physicians reported 559 concussions during regular season games. The estimated incidence was 1.8 concussions per 1000 player-hours. The most common postconcussion symptom was headache (71%). On average, time loss (in days) increased 2.25 times (95% confidence interval [CI] 1.41-3.62) for every subsequent (i.e., recurrent) concussion sustained during the study period. Controlling for age and position, significant predictors of time loss were postconcussion headache (p < 0.001), low energy or fatigue (p = 0.01), amnesia (p = 0.02) and abnormal neurologic examination (p = 0.01). Using a previously suggested time loss cut-point of 10 days, headache (odds ratio [OR] 2.17, 95% CI 1.33-3.54) and low energy or fatigue (OR 1.72, 95% CI 1.04-2.85) were significant predictors of time loss of more than 10 days. Postconcussion headache, low energy or fatigue, amnesia and abnormal neurologic examination were significant predictors of time loss among professional hockey players.

  9. Concurrent variable-interval variable-ratio schedules in a dynamic choice environment.

    Science.gov (United States)

    Bell, Matthew C; Baum, William M

    2017-11-01

    Most studies of operant choice have focused on presenting subjects with a fixed pair of schedules across many experimental sessions. Using these methods, studies of concurrent variable-interval variable-ratio schedules helped to evaluate theories of choice. More recently, a growing literature has focused on dynamic choice behavior. Those dynamic choice studies have analyzed behavior on a number of different time scales using concurrent variable-interval schedules. Following the dynamic choice approach, the present experiment examined performance on concurrent variable-interval variable-ratio schedules in a rapidly changing environment. Our objectives were to compare performance on concurrent variable-interval variable-ratio schedules with extant data on concurrent variable-interval variable-interval schedules using a dynamic choice procedure and to extend earlier work on concurrent variable-interval variable-ratio schedules. We analyzed performances at different time scales, finding strong similarities between concurrent variable-interval variable-interval and concurrent variable-interval variable-ratio performance within dynamic choice procedures. Time-based measures revealed almost identical performance in the two procedures compared with response-based measures, supporting the view that choice is best understood as time allocation. Performance at the smaller time scale of visits accorded with the tendency seen in earlier research toward developing a pattern of strong preference for and long visits to the richer alternative paired with brief "samples" at the leaner alternative ("fix and sample"). © 2017 Society for the Experimental Analysis of Behavior.

  10. The relationship between lifestyle regularity and subjective sleep quality

    Science.gov (United States)

    Monk, Timothy H.; Reynolds, Charles F 3rd; Buysse, Daniel J.; DeGrazia, Jean M.; Kupfer, David J.

    2003-01-01

    In previous work we have developed a diary instrument, the Social Rhythm Metric (SRM), which allows the assessment of lifestyle regularity, and a questionnaire instrument, the Pittsburgh Sleep Quality Index (PSQI), which allows the assessment of subjective sleep quality. The aim of the present study was to explore the relationship between lifestyle regularity and subjective sleep quality. Lifestyle regularity was assessed by both standard (SRM-17) and shortened (SRM-5) metrics; subjective sleep quality was assessed by the PSQI. We hypothesized that high lifestyle regularity would be conducive to better sleep. Both instruments were given to a sample of 100 healthy subjects who were studied as part of a variety of different experiments spanning a 9-yr time frame. Ages ranged from 19 to 49 yr (mean age: 31.2 yr, s.d.: 7.8 yr); there were 48 women and 52 men. SRM scores were derived from a two-week diary. The hypothesis was confirmed. There was a significant negative correlation (rho = -0.4): subjects with higher levels of lifestyle regularity reported fewer sleep problems. This relationship was also supported by a categorical analysis, where the proportion of "poor sleepers" was doubled in the "irregular types" group as compared with the "non-irregular types" group. Thus, there appears to be an association between lifestyle regularity and good sleep, though the direction of causality remains to be tested.

  11. Low social rhythm regularity predicts first onset of bipolar spectrum disorders among at-risk individuals with reward hypersensitivity.

    Science.gov (United States)

    Alloy, Lauren B; Boland, Elaine M; Ng, Tommy H; Whitehouse, Wayne G; Abramson, Lyn Y

    2015-11-01

    The social zeitgeber model (Ehlers, Frank, & Kupfer, 1988) suggests that irregular daily schedules or social rhythms provide vulnerability to bipolar spectrum disorders. This study tested whether social rhythm regularity prospectively predicted first lifetime onset of bipolar spectrum disorders in adolescents already at risk for bipolar disorder based on exhibiting reward hypersensitivity. Adolescents (ages 14-19 years) previously screened to have high (n = 138) or moderate (n = 95) reward sensitivity, but no lifetime history of bipolar spectrum disorder, completed measures of depressive and manic symptoms, family history of bipolar disorder, and the Social Rhythm Metric. They were followed prospectively with semistructured diagnostic interviews every 6 months for an average of 31.7 (SD = 20.1) months. Hierarchical logistic regression indicated that low social rhythm regularity at baseline predicted greater likelihood of first onset of bipolar spectrum disorder over follow-up among high-reward-sensitivity adolescents but not moderate-reward-sensitivity adolescents, controlling for follow-up time, gender, age, family history of bipolar disorder, and initial manic and depressive symptoms (β = -.150, Wald = 4.365, p = .037, odds ratio = .861, 95% confidence interval [.748, .991]). Consistent with the social zeitgeber theory, low social rhythm regularity provides vulnerability to first onset of bipolar spectrum disorder among at-risk adolescents. It may be possible to identify adolescents at risk for developing a bipolar spectrum disorder based on exhibiting both reward hypersensitivity and social rhythm irregularity before onset occurs. (c) 2015 APA, all rights reserved).

  12. Experiment Design Regularization-Based Hardware/Software Codesign for Real-Time Enhanced Imaging in Uncertain Remote Sensing Environment

    Directory of Open Access Journals (Sweden)

    Castillo Atoche A

    2010-01-01

    Full Text Available A new aggregated Hardware/Software (HW/SW) codesign approach to optimization of the digital signal processing techniques for enhanced imaging with real-world uncertain remote sensing (RS) data based on the concept of descriptive experiment design regularization (DEDR) is addressed. We consider the applications of the developed approach to typical single-look synthetic aperture radar (SAR) imaging systems operating in the real-world uncertain RS scenarios. The software design is aimed at the algorithmic-level decrease of the computational load of the large-scale SAR image enhancement tasks. The innovative algorithmic idea is to incorporate into the DEDR-optimized fixed-point iterative reconstruction/enhancement procedure the convex convergence enforcement regularization via constructing the proper multilevel projections onto convex sets (POCS) in the solution domain. The hardware design is performed via systolic array computing based on a Xilinx Field Programmable Gate Array (FPGA) XC4VSX35-10ff668 and is aimed at implementing the unified DEDR-POCS image enhancement/reconstruction procedures in a computationally efficient multi-level parallel fashion that meets the (near) real-time image processing requirements. Finally, we comment on the simulation results indicative of the significantly increased performance efficiency both in resolution enhancement and in computational complexity reduction metrics gained with the proposed aggregated HW/SW co-design approach.

  13. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, The National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', defined by industry as the valve opening at greater than or equal to 1.5 times the cold set pressure. The actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging sufficiently to estimate risk when basing proof test intervals on proof test results?
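
    The paper's actual aging model and fitted parameters are not reproduced here, but the Monte Carlo idea can be sketched as follows: assume the lift/set pressure ratio drifts with time in service plus random scatter, and estimate the probability that a valve is 'stuck shut' (lift at or above 1.5 times the cold set pressure) at the end of a given proof-test interval. All numerical values below are hypothetical.

        import numpy as np

        def p_stuck_shut(interval_years, drift=0.02, sigma=0.05, n=100_000, seed=1):
            """Monte Carlo estimate of P(lift pressure >= 1.5 x set pressure)
            at the end of a proof-test interval, under an assumed aging model:
            lift/set ratio = 1 + drift * t + Gaussian scatter.  The drift and
            scatter values are illustrative, not the paper's fitted model."""
            rng = np.random.default_rng(seed)
            ratio = 1.0 + drift * interval_years + rng.normal(0.0, sigma, n)
            return np.mean(ratio >= 1.5)

        for t in (2, 5, 10):                     # candidate proof-test intervals (years)
            print(t, "yr:", p_stuck_shut(t))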

  14. Phase-modified CTQW unable to distinguish strongly regular graphs efficiently

    International Nuclear Information System (INIS)

    Mahasinghe, A; Wijerathna, J K; Izaac, J A; Wang, J B

    2015-01-01

    Various quantum walk-based algorithms have been developed, aiming to distinguish non-isomorphic graphs with polynomial scaling, within both the discrete-time quantum walk (DTQW) and continuous-time quantum walk (CTQW) frameworks. Whilst both the single-particle DTQW and CTQW have failed to distinguish non-isomorphic strongly regular graph families (prompting the move to multi-particle graph isomorphism (GI) algorithms), the single-particle DTQW has been successfully modified by the introduction of a phase factor to distinguish a wide range of graphs in polynomial time. In this paper, we prove that an analogous phase modification to the single particle CTQW does not have the same distinguishing power as its discrete-time counterpart, in particular it cannot distinguish strongly regular graphs with the same family parameters with the same efficiency. (paper)

  15. Continuum-regularized quantum gravity

    International Nuclear Information System (INIS)

    Chan Huesum; Halpern, M.B.

    1987-01-01

    The recent continuum regularization of d-dimensional Euclidean gravity is generalized to arbitrary power-law measure and studied in some detail as a representative example of coordinate-invariant regularization. The weak-coupling expansion of the theory illustrates a generic geometrization of regularized Schwinger-Dyson rules, generalizing previous rules in flat space and flat superspace. The rules are applied in a non-trivial explicit check of Einstein invariance at one loop: the cosmological counterterm is computed and its contribution is included in a verification that the graviton mass is zero. (orig.)

  16. Predicting restoration of kidney function during CRRT-free intervals

    Directory of Open Access Journals (Sweden)

    Heise Daniel

    2012-01-01

    Full Text Available Abstract Background Renal failure is common in critically ill patients and frequently requires continuous renal replacement therapy (CRRT). CRRT is discontinued at regular intervals for routine changes of the disposable equipment or for replacing clogged filter membrane assemblies. The present study was conducted to determine if the necessity to continue CRRT could be predicted during the CRRT-free period. Materials and methods In the period from 2003 to 2006, 605 patients were treated with CRRT in our ICU. A total of 222 patients with 448 CRRT-free intervals had complete data sets and were used for analysis. Of the total CRRT-free periods, 225 served as an evaluation group. Twenty-nine parameters with an assumed influence on kidney function were analyzed with regard to their potential to predict the restoration of kidney function during the CRRT-free interval. Using univariate analysis and logistic regression, a prospective index was developed and validated in the remaining 223 CRRT-free periods to establish its prognostic strength. Results Only three parameters showed an independent influence on the restoration of kidney function during CRRT-free intervals: the number of previous CRRT cycles (medians in the two outcome groups: 1 vs. 2), the "Sequential Organ Failure Assessment" score (means in the two outcome groups: 8.3 vs. 9.2) and urinary output after the cessation of CRRT (medians in the two outcome groups: 66 ml/h vs. 10 ml/h). The prognostic index, which was calculated from these three variables, showed a satisfactory potential to predict kidney function during the CRRT-free intervals; receiver operating characteristic (ROC) analysis revealed an area under the curve of 0.798. Conclusion Restoration of kidney function during CRRT-free periods can be predicted with an index calculated from three variables. Prospective trials in other hospitals must clarify whether our results are generally transferable to other patient populations.

  17. Multichannel interval timer

    International Nuclear Information System (INIS)

    Turko, B.T.

    1983-10-01

    A CAMAC based modular multichannel interval timer is described. The timer comprises twelve high resolution time digitizers with a common start enabling twelve independent stop inputs. Ten time ranges from 2.5 μs to 1.3 ms can be preset. Time can be read out in twelve 24-bit words either via the CAMAC Crate Controller or an external FIFO register. The LSB time calibration is 78.125 ps. An additional word reads out the operational status of the twelve stop channels. The system consists of two modules. The analog module contains a reference clock and 13 analog time stretchers. The digital module contains counters, logic and interface circuits. The timer has excellent differential linearity, thermal stability and crosstalk-free performance.

  18. Online co-regularized algorithms

    NARCIS (Netherlands)

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.

    2012-01-01

    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks.

  19. The effect of chorionicity and twin-to-twin delivery time interval on short-term outcome of the second twin

    DEFF Research Database (Denmark)

    Hjortø, Sofie; Nickelsen, Carsten; Petersen, Janne

    2013-01-01

    Abstract Objectives: To investigate the effect of chorionicity and twin-to-twin delivery time interval on short-term outcome in the second twin. Additionally, to investigate predictors of adverse outcome in both twins. Methods: Data included vaginally delivered twins (≥ 36 weeks) from Copenhagen ...

  20. Geometric continuum regularization of quantum field theory

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1989-01-01

    An overview of the continuum regularization program is given. The program is traced from its roots in stochastic quantization, with emphasis on the examples of regularized gauge theory, the regularized general nonlinear sigma model and regularized quantum gravity. In its coordinate-invariant form, the regularization is seen as entirely geometric: only the supermetric on field deformations is regularized, and the prescription provides universal nonperturbative invariant continuum regularization across all quantum field theory. 54 refs

  1. INTERVAL OBSERVER FOR A BIOLOGICAL REACTOR MODEL

    Directory of Open Access Journals (Sweden)

    T. A. Kharkovskaia

    2014-05-01

    Full Text Available The method of interval observer design for nonlinear systems with parametric uncertainties is considered. The interval observer synthesis problem for systems with varying parameters is as follows: given an uncertainty constraint on the state values of the system, limiting the initial conditions of the system and the set of admissible values for the vector of unknown parameters and inputs, an interval estimate of the system state variables that contains the actual state at a given time must remain valid over the whole considered time segment. Conditions for the design of interval observers for the considered class of systems are given: boundedness of the input and state, the existence of a majorizing function defining the uncertainty vector for the system, Lipschitz continuity or finiteness of this function, and the existence of an observer gain with a suitable Lyapunov matrix. The main condition for the design of such a device is cooperativity of the interval estimation error dynamics. The problem of selecting an individual observer gain matrix is considered. In order to ensure the property of cooperativity for the interval estimation error dynamics, a static transformation of coordinates is proposed. The proposed algorithm is demonstrated by computer modeling of a biological reactor. Possible applications of these interval estimation systems are the spheres of robust control, where the presence of various types of uncertainties in the system dynamics is assumed, biotechnology and environmental systems and processes, mechatronics and robotics, etc.
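
    A minimal illustration of the interval estimation idea, stripped down to a scalar linear system with a bounded disturbance so that cooperativity of the error dynamics is automatic and no observer gain design is needed, is sketched below; the dynamics and bounds are hypothetical and do not correspond to the bioreactor model of the paper.

        import numpy as np

        def interval_predictor(a, u, d_bar, x0_bounds, dt=0.01, T=10.0):
            """Minimal interval estimator for the scalar system
                dx/dt = a*x + u(t) + d(t),  |d(t)| <= d_bar,
            with uncertain initial state x(0) in [x_lo0, x_hi0].
            For a scalar system the error dynamics are trivially cooperative,
            so the bounds are propagated by Euler integration.  This is an
            illustrative sketch, not the observer gain design of the paper."""
            steps = int(T / dt)
            x_lo, x_hi = x0_bounds
            lo, hi = [x_lo], [x_hi]
            for k in range(steps):
                t = k * dt
                x_lo += dt * (a * x_lo + u(t) - d_bar)
                x_hi += dt * (a * x_hi + u(t) + d_bar)
                lo.append(x_lo); hi.append(x_hi)
            return np.array(lo), np.array(hi)

        # Example: stable dilution-like dynamics with a bounded disturbance
        lo, hi = interval_predictor(a=-0.5, u=lambda t: 1.0, d_bar=0.1,
                                    x0_bounds=(0.0, 0.5))
        print(lo[-1], hi[-1])   # the true state is guaranteed to lie in [lo, hi]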

  2. Development of Salivary Cortisol Circadian Rhythm and Reference Intervals in Full-Term Infants.

    Science.gov (United States)

    Ivars, Katrin; Nelson, Nina; Theodorsson, Annette; Theodorsson, Elvar; Ström, Jakob O; Mörelius, Evalotte

    2015-01-01

    Cortisol concentrations in plasma display a circadian rhythm in adults and children older than one year. Earlier studies report divergent results regarding when cortisol circadian rhythm is established. The present study aims to investigate at what age infants develop a circadian rhythm, as well as the possible influences of behavioral regularity and daily life trauma on when the rhythm is established. Furthermore, we determine age-related reference intervals for cortisol concentrations in saliva during the first year of life. 130 healthy full-term infants were included in a prospective, longitudinal study with saliva sampling on two consecutive days, in the morning (07:30-09:30), noon (10:00-12:00) and evening (19:30-21:30), each month from birth until the infant was twelve months old. Information about development of behavioral regularity and potential exposure to trauma was obtained from the parents through the Baby Behavior Questionnaire and the Life Incidence of Traumatic Events checklist. A significant group-level circadian rhythm of salivary cortisol secretion was established at one month, and remained throughout the first year of life, although there was considerable individual variability. No correlation was found between development of cortisol circadian rhythm and the results from either the Baby Behavior Questionnaire or the Life Incidence of Traumatic Events checklist. The study presents salivary cortisol reference intervals for infants during the first twelve months of life. Cortisol circadian rhythm in infants is already established by one month of age, earlier than previous studies have shown. The current study also provides first year age-related reference intervals for salivary cortisol levels in healthy, full-term infants.

  3. The patterning of retinal horizontal cells: normalizing the regularity index enhances the detection of genomic linkage

    Directory of Open Access Journals (Sweden)

    Patrick W. Keeley

    2014-10-01

    Full Text Available Retinal neurons are often arranged as non-random distributions called mosaics, as their somata minimize proximity to neighboring cells of the same type. The horizontal cells serve as an example of such a mosaic, but little is known about the developmental mechanisms that underlie their patterning. To identify genes involved in this process, we have used three different spatial statistics to assess the patterning of the horizontal cell mosaic across a panel of genetically distinct recombinant inbred strains. To avoid the confounding effect of cell density, which varies two-fold across these different strains, we computed the real/random regularity ratio, expressing the regularity of a mosaic relative to a randomly distributed simulation of similarly sized cells. To test whether this latter statistic better reflects the variation in biological processes that contribute to horizontal cell spacing, we subsequently compared the genetic linkage for each of these two traits, the regularity index and the real/random regularity ratio, each computed from the distribution of nearest neighbor (NN) distances and from the Voronoi domain (VD) areas. Finally, we compared each of these analyses with another index of patterning, the packing factor. Variation in the regularity indexes, as well as their real/random regularity ratios, and the packing factor, mapped quantitative trait loci (QTL) to the distal ends of Chromosomes 1 and 14. For the NN and VD analyses, we found that the degree of linkage was greater when using the real/random regularity ratio rather than the respective regularity index. Using informatic resources, we narrow the list of prospective genes positioned at these two intervals to a small collection of six genes that warrant further investigation to determine their potential role in shaping the patterning of the horizontal cell mosaic.
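
    A simplified Python sketch of two of the spatial statistics named in the abstract is given below: the nearest-neighbour regularity index (mean NN distance divided by its standard deviation) and a real/random regularity ratio that normalizes the observed index by the mean index of random point patterns with the same number of cells. The random simulations here ignore soma size, so this is only an approximation of the procedure described.

        import numpy as np

        def regularity_index(points):
            """Nearest-neighbour regularity index: mean(NN dist) / std(NN dist)."""
            pts = np.asarray(points, dtype=float)
            d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
            np.fill_diagonal(d, np.inf)
            nn = d.min(axis=1)
            return nn.mean() / nn.std()

        def real_random_ratio(points, field=(0.0, 1.0), n_sim=200, seed=0):
            """Observed regularity index divided by the mean index of random
            (Poisson) simulations with the same number of points; a density-
            normalized statistic in the spirit of the abstract, without the
            soma-size exclusion used in the full analysis."""
            rng = np.random.default_rng(seed)
            n = len(points)
            sims = [regularity_index(rng.uniform(*field, size=(n, 2)))
                    for _ in range(n_sim)]
            return regularity_index(points) / np.mean(sims)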

  4. A delay-dependent approach to robust control for neutral uncertain neural networks with mixed interval time-varying delays

    International Nuclear Information System (INIS)

    Lu, Chien-Yu

    2011-01-01

    This paper considers the problem of delay-dependent global robust stabilization for discrete, distributed and neutral interval time-varying delayed neural networks described by nonlinear delay differential equations of the neutral type. The parameter uncertainties are norm bounded. The activation functions are assumed to be bounded and globally Lipschitz continuous. Using a Lyapunov functional approach and linear matrix inequality (LMI) techniques, the stability criteria for the uncertain neutral neural networks with interval time-varying delays are established in the form of LMIs, which can be readily verified using the standard numerical software. An important feature of the result reported is that all the stability conditions are dependent on the upper and lower bounds of the delays. Another feature of the results lies in that it involves fewer free weighting matrix strategy, and upper bounds of the inner product between two vectors are not introduced to reduce the conservatism of the criteria. Two illustrative examples are provided to demonstrate the effectiveness and the reduced conservatism of the proposed method

  5. High-intensity interval training: Modulating interval duration in overweight/obese men.

    Science.gov (United States)

    Smith-Ryan, Abbie E; Melvin, Malia N; Wingfield, Hailee L

    2015-05-01

    High-intensity interval training (HIIT) is a time-efficient strategy shown to induce various cardiovascular and metabolic adaptations. Little is known about the optimal tolerable combination of intensity and volume necessary for adaptations, especially in clinical populations. In a randomized controlled pilot design, we evaluated the effects of two types of interval training protocols, varying in intensity and interval duration, on clinical outcomes in overweight/obese men. Twenty-five men [body mass index (BMI) > 25 kg · m(2)] completed baseline body composition measures: fat mass (FM), lean mass (LM) and percent body fat (%BF) and fasting blood glucose, lipids and insulin (IN). A graded exercise cycling test was completed for peak oxygen consumption (VO2peak) and power output (PO). Participants were randomly assigned to high-intensity short interval (1MIN-HIIT), high-intensity interval (2MIN-HIIT) or control groups. 1MIN-HIIT and 2MIN-HIIT completed 3 weeks of cycling interval training, 3 days/week, consisting of either 10 × 1 min bouts at 90% PO with 1 min rests (1MIN-HIIT) or 5 × 2 min bouts with 1 min rests at undulating intensities (80%-100%) (2MIN-HIIT). There were no significant training effects on FM (Δ1.06 ± 1.25 kg) or %BF (Δ1.13% ± 1.88%), compared to CON. Increases in LM were not significant but increased by 1.7 kg and 2.1 kg for 1MIN and 2MIN-HIIT groups, respectively. Increases in VO2peak were also not significant for 1MIN (3.4 ml·kg(-1) · min(-1)) or 2MIN groups (2.7 ml · kg(-1) · min(-1)). IN sensitivity (HOMA-IR) improved for both training groups (Δ-2.78 ± 3.48 units; p < 0.05) compared to CON. HIIT may be an effective short-term strategy to improve cardiorespiratory fitness and IN sensitivity in overweight males.

  6. Regularity dimension of sequences and its application to phylogenetic tree reconstruction

    International Nuclear Information System (INIS)

    Pham, Tuan D.

    2012-01-01

    The concept of dimension is a central development of chaos theory for studying nonlinear dynamical systems. Different types of dimensions have been derived to interpret different geometrical or physical observations. Approximate entropy and its modified methods have been introduced for studying regularity and complexity of time-series data in physiology and biology. Here, the concept of power laws and entropy measure are adopted to develop the regularity dimension of sequences to model a mathematical relationship between the frequency with which information about signal regularity changes in various scales. The proposed regularity dimension is applied to reconstruct phylogenetic trees using mitochondrial DNA (mtDNA) sequences for the family Hominidae, which can be validated according to the hypothesized evolutionary relationships between organisms.
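
    The regularity dimension itself is specific to this paper, but the entropy measure it builds on, approximate entropy, can be sketched compactly; the implementation below follows the standard Pincus formulation with embedding dimension m and tolerance r, and the example signals are purely illustrative.

        import numpy as np

        def approximate_entropy(x, m=2, r=None):
            """Approximate entropy ApEn(m, r) of a 1-D series (Pincus, 1991).
            Lower values indicate more regular, self-similar signals.  Shown as
            background only; the paper's regularity dimension combines such
            measures with a power-law scaling and is not reproduced here."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            if r is None:
                r = 0.2 * x.std()

            def phi(m):
                # Embed the series and, for each template, count the fraction of
                # templates within Chebyshev distance r (self-matches included).
                emb = np.array([x[i:i + m] for i in range(n - m + 1)])
                dist = np.abs(emb[:, None, :] - emb[None, :, :]).max(-1)
                c = (dist <= r).mean(axis=1)
                return np.log(c).mean()

            return phi(m) - phi(m + 1)

        # Regular vs. irregular example (illustrative)
        t = np.linspace(0, 10, 500)
        print(approximate_entropy(np.sin(2 * np.pi * t)))                       # low
        print(approximate_entropy(np.random.default_rng(0).normal(size=500)))   # higher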

  7. Using Tikhonov Regularization for Spatial Projections from CSR Regularized Spherical Harmonic GRACE Solutions

    Science.gov (United States)

    Save, H.; Bettadpur, S. V.

    2013-12-01

    It has been demonstrated before that using Tikhonov regularization produces spherical harmonic solutions from GRACE that have very little residual stripes while capturing all the signal observed by GRACE within the noise level. This paper demonstrates a two-step process and uses Tikhonov regularization to remove the residual stripes in the CSR regularized spherical harmonic coefficients when computing the spatial projections. We discuss methods to produce mass anomaly grids that have no stripe features while satisfying the necessary condition of capturing all observed signal within the GRACE noise level.
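
    As background to the approach, a generic Tikhonov-regularized least-squares solution can be written directly from the SVD of the design matrix; the sketch below shows only that damped-inverse filtering step and none of the GRACE-specific processing (spherical-harmonic design matrices, noise and signal covariances, or the two-step spatial projection).

        import numpy as np

        def tikhonov_solve(A, b, alpha):
            """Tikhonov-regularized least squares:
                argmin_x ||A x - b||^2 + alpha^2 ||x||^2,
            computed from the SVD as x = sum_i s_i/(s_i^2 + alpha^2) (u_i.b) v_i."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            filt = s / (s ** 2 + alpha ** 2)      # damped inverse singular values
            return Vt.T @ (filt * (U.T @ b))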

  8. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yunji; Jing, Bing-Yi; Gao, Xin

    2015-01-01

    In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing transitional loss functions are sensitive to the noisy and outlying labels of training samples, because the transitional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with transitional loss functions.

  9. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan

    2015-02-12

    In this paper we investigate the usage of regularized correntropy framework for learning of classifiers from noisy labels. The class label predictors learned by minimizing transitional loss functions are sensitive to the noisy and outlying labels of training samples, because the transitional loss functions are equally applied to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with transitional loss functions.
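
    A common way to optimize correntropy-based objectives is half-quadratic (iteratively reweighted) minimization; the sketch below applies that idea to a regularized linear regression with a Gaussian correntropy loss, which down-weights outlying labels. It is meant only to illustrate the criterion, not to reproduce the classifier formulation or experiments of the paper; the kernel width, penalty weight and iteration count are arbitrary choices.

        import numpy as np

        def mcc_regression(X, y, sigma=1.0, lam=1e-2, n_iter=30):
            """Linear predictor learned under a maximum-correntropy criterion with
            an L2 penalty on the parameters, via half-quadratic / iteratively
            reweighted least squares.  Each iteration solves a weighted ridge
            problem with weights w_i = exp(-e_i^2 / (2 sigma^2)), which shrink
            the influence of samples with noisy or outlying labels."""
            n, d = X.shape
            w = np.ones(n)
            beta = np.zeros(d)
            for _ in range(n_iter):
                Xw = X * w[:, None]                     # W X
                beta = np.linalg.solve(X.T @ Xw + lam * np.eye(d), Xw.T @ y)
                e = y - X @ beta
                w = np.exp(-e ** 2 / (2 * sigma ** 2))  # correntropy weights
            return beta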

  10. Regularized lattice Bhatnagar-Gross-Krook model for two- and three-dimensional cavity flow simulations.

    Science.gov (United States)

    Montessori, A; Falcucci, G; Prestininzi, P; La Rocca, M; Succi, S

    2014-05-01

    We investigate the accuracy and performance of the regularized version of the single-relaxation-time lattice Boltzmann equation for the case of two- and three-dimensional lid-driven cavities. The regularized version is shown to provide a significant gain in stability over the standard single-relaxation time, at a moderate computational overhead.

  11. Efficient multidimensional regularization for Volterra series estimation

    Science.gov (United States)

    Birpoutsoukis, Georgios; Csurcsia, Péter Zoltán; Schoukens, Johan

    2018-05-01

    This paper presents an efficient nonparametric time domain nonlinear system identification method. It is shown how truncated Volterra series models can be efficiently estimated without the need of long, transient-free measurements. The method is a novel extension of the regularization methods that have been developed for impulse response estimates of linear time invariant systems. To avoid the excessive memory needs in case of long measurements or large number of estimated parameters, a practical gradient-based estimation method is also provided, leading to the same numerical results as the proposed Volterra estimation method. Moreover, the transient effects in the simulated output are removed by a special regularization method based on the novel ideas of transient removal for Linear Time-Varying (LTV) systems. Combining the proposed methodologies, the nonparametric Volterra models of the cascaded water tanks benchmark are presented in this paper. The results for different scenarios varying from a simple Finite Impulse Response (FIR) model to a 3rd degree Volterra series with and without transient removal are compared and studied. It is clear that the obtained models capture the system dynamics when tested on a validation dataset, and their performance is comparable with the white-box (physical) models.
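
    The sketch below estimates a truncated second-degree Volterra model by regularized least squares. For brevity it uses a plain identity (ridge) penalty, whereas the paper advocates structured smoothness/decay priors on each kernel together with a transient-removal regularizer; the memory length and regularization weight are illustrative assumptions.

        import numpy as np
        from itertools import combinations_with_replacement

        def volterra_ridge(u, y, memory=10, lam=1e-2):
            """Truncated 2nd-degree Volterra model (constant + linear + quadratic
            kernels over a finite memory) estimated with plain ridge (Tikhonov)
            regularization, as a simplified stand-in for the structured
            multidimensional regularization described in the abstract."""
            u = np.asarray(u, float); y = np.asarray(y, float)
            N = len(u)
            rows = []
            for t in range(memory, N):
                past = u[t - memory + 1:t + 1][::-1]      # u(t), u(t-1), ...
                quad = [past[i] * past[j]
                        for i, j in combinations_with_replacement(range(memory), 2)]
                rows.append(np.concatenate(([1.0], past, quad)))
            Phi = np.array(rows)
            theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]),
                                    Phi.T @ y[memory:])
            return theta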

  12. Extension of a chaos control method to unstable trajectories on infinite- or finite-time intervals: Experimental verification

    International Nuclear Information System (INIS)

    Yagasaki, Kazuyuki

    2007-01-01

    In experiments for single and coupled pendula, we demonstrate the effectiveness of a new control method based on dynamical systems theory for stabilizing unstable aperiodic trajectories defined on infinite- or finite-time intervals. The basic idea of the method is similar to that of the OGY method, which is a well-known, chaos control method. Extended concepts of the stable and unstable manifolds of hyperbolic trajectories are used here

  13. Novel global robust stability criteria for interval neural networks with multiple time-varying delays

    International Nuclear Information System (INIS)

    Xu Shengyuan; Lam, James; Ho, Daniel W.C.

    2005-01-01

    This Letter is concerned with the problem of robust stability analysis for interval neural networks with multiple time-varying delays and parameter uncertainties. The parameter uncertainties are assumed to be bounded in given compact sets and the activation functions are supposed to be bounded and globally Lipschitz continuous. A sufficient condition is obtained by means of Lyapunov functionals, which guarantees the existence, uniqueness and global asymptotic stability of the delayed neural network for all admissible uncertainties. This condition is in terms of a linear matrix inequality (LMI), which can be easily checked by using recently developed algorithms in solving LMIs. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed method

  14. Dynamics of coherent states in regular and chaotic regimes of the non-integrable Dicke model

    Science.gov (United States)

    Lerma-Hernández, S.; Chávez-Carlos, J.; Bastarrachea-Magnani, M. A.; López-del-Carpio, B.; Hirsch, J. G.

    2018-04-01

    The quantum dynamics of initial coherent states is studied in the Dicke model and correlated with the dynamics, regular or chaotic, of their classical limit. Analytical expressions for the survival probability, i.e. the probability of finding the system in its initial state at time t, are provided in the regular regions of the model. The results for regular regimes are compared with those of the chaotic ones. It is found that initial coherent states in regular regions have a much longer equilibration time than those located in chaotic regions. The properties of the distributions for the initial coherent states in the Hamiltonian eigenbasis are also studied. It is found that for regular states the components with non-negligible contribution are organized in sequences of energy levels distributed according to Gaussian functions. In the case of chaotic coherent states, the energy components do not have a simple structure and the number of participating energy levels is larger than in the regular cases.

  15. The Effect of Three Months Regular Aerobic Exercise on Premenstrual Syndrome

    Directory of Open Access Journals (Sweden)

    Zinat Ghanbari

    2008-12-01

    Full Text Available Objective: To determine the effects of three months of regular aerobic exercise on PMS symptoms. Correlations with age, education, marital status and severity of PMS symptoms were also studied. Materials and Methods: A quasi-experimental study was conducted on 91 volunteer women with regular menstrual cycles and no history of gynecological, endocrinological or psychological disorders. The study was done during March 2005 - March 2007, in Tehran University of Medical Sciences. A Modified Menstrual Distress Questionnaire (MMDQ) was used in this study. Participants were divided into two groups: Non-exercised, who also had no past experience of regular exercise (n = 48), and Exercised (n = 43). The exercise duration was one hour, carried out three times per week for three months. Emotional, behavioral, electrolyte, autonomic, neurovegetative and skin symptoms of PMS were compared between the two groups. The P value was considered significant at < 0.05. Results: A significant difference was observed for electrolytic, neurovegetative and cognitive symptoms before and after the exercise. Also, the severity of skin and neurovegetative symptoms differed in the experimental groups with and without a past history of doing regular exercise. There was no correlation between age, education, marital status and severity of PMS symptoms. Conclusion: Three months of regular aerobic exercise effectively reduces the severity of PMS symptoms.

  16. Influence Of Inspection Intervals On Mechanical System Reliability

    International Nuclear Information System (INIS)

    Zilberman, B.

    1998-01-01

    In this paper a methodology for reliability analysis of mechanical systems with latent failures is described. Reliability analysis of such systems must include appropriate usage of check intervals for latent failure detection. The methodology suggests that, based on system logic, the analyst decides at the beginning whether a system can fail actively or latently and propagates this approach through all system levels. All inspections are assumed to be perfect (all failures are detected and repaired, and no new failures are introduced as a result of the maintenance). Additional assumptions are that the mission time is much smaller than the check intervals and that all components have constant failure rates. Analytical expressions for the reliability calculations are provided, based on fault tree and Markov modeling techniques (for two and three redundant systems with inspection intervals). The proposed methodology yields more accurate results than are obtained by not using check intervals or by using half check interval times. The conventional analysis, which assumes that at the beginning of each mission the system is as new, gives an optimistic prediction of system reliability. Some examples of reliability calculations of mechanical systems with latent failures and establishing optimum check intervals are provided.
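
    For a single component with a constant failure rate and perfect periodic inspections, the textbook relation between the check interval and the time-averaged unavailability illustrates why the interval matters in such models; a small sketch with hypothetical numbers is given below.

        import math

        def mean_unavailability(lam, tau):
            """Time-averaged probability that a latent (dormant) failure is present,
            for a component with constant failure rate lam inspected (and perfectly
            repaired) every tau hours:
                q_avg = 1 - (1 - exp(-lam*tau)) / (lam*tau)  ~  lam*tau/2  for small lam*tau.
            A textbook single-component relation, used here only to show how the
            check interval enters the system models described in the abstract."""
            return 1.0 - (1.0 - math.exp(-lam * tau)) / (lam * tau)

        for tau in (730, 4380, 8760):        # roughly monthly, semi-annual, annual (hours)
            print(tau, "h:", round(mean_unavailability(1e-5, tau), 5))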

  17. Fibonacci-regularization method for solving Cauchy integral equations of the first kind

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Fariborzi Araghi

    2017-09-01

    Full Text Available In this paper, a novel scheme is proposed to solve the first kind Cauchy integral equation over a finite interval. For this purpose, the regularization method is considered. Then, the collocation method with Fibonacci base functions is applied to solve the obtained second kind singular integral equation. Also, the error estimate of the proposed scheme is discussed. Finally, some sample Cauchy integral equations stemming from the theory of airfoils in fluid mechanics are presented and solved to illustrate the importance and applicability of the given algorithm. The tables in the examples show the efficiency of the method.

  18. Modes on the Move: Interval Cycles and the Emergence of Major-Minor Tonality

    Directory of Open Access Journals (Sweden)

    Matthew Woolhouse

    2011-01-01

    Full Text Available The issue of the emergence of major-minor tonality is addressed by recourse to a novel pitch grouping process, referred to as interval cycle proximity (ICP). An interval cycle is the minimum number of (additive) iterations of an interval that are required for octave-related pitches to be re-stated, a property conjectured to be responsible for tonal attraction. It is hypothesised that the actuation of ICP in cognition, possibly in the latter part of the sixteenth century, led to a hierarchy of tonal attraction which favoured certain pitches over others, ostensibly the tonics of the modern major and minor system. An ICP model is described that calculates the level of tonal attraction between adjacent musical elements. The predictions of the model are shown to be consistent with music-theoretic accounts of common practice period tonality, including Piston’s Table of Usual Root Progressions. The development of tonality is illustrated with the historical quotations of commentators from the sixteenth to the eighteenth centuries, and can be characterised as follows. At the beginning of the seventeenth century multiple ‘finals’ were possible, each associated with a different interval configuration (mode). By the end of the seventeenth century, however, only two interval configurations were in regular use: those pertaining to the modern major-minor key system. The implications of this development are discussed with respect to interval cycles and their hypothesised effect within music
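
    The basic interval-cycle quantity defined above is easy to compute: working with pitch classes modulo the octave, the cycle length of an interval of i semitones is the smallest n > 0 with n*i ≡ 0 (mod 12), i.e. 12/gcd(i, 12). The sketch below shows only this counting step, not the full ICP tonal-attraction model described in the abstract.

```python
from math import gcd

# Length of the interval cycle for an interval of `interval` semitones:
# the minimum number of additive iterations before an octave-related pitch recurs.
def interval_cycle(interval, octave=12):
    return octave // gcd(interval, octave)

for i in range(1, 12):
    cycle = [(k * i) % 12 for k in range(interval_cycle(i))]
    print(f"interval {i:2d}: cycle length {interval_cycle(i):2d}  pitch classes {cycle}")
```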

  19. A two-way regularization method for MEG source reconstruction

    KAUST Repository

    Tian, Tian Siva; Huang, Jianhua Z.; Shen, Haipeng; Li, Zhimin

    2012-01-01

    The MEG inverse problem refers to the reconstruction of the neural activity of the brain from magnetoencephalography (MEG) measurements. We propose a two-way regularization (TWR) method to solve the MEG inverse problem under the assumptions that only a small number of locations in space are responsible for the measured signals (focality), and each source time course is smooth in time (smoothness). The focality and smoothness of the reconstructed signals are ensured respectively by imposing a sparsity-inducing penalty and a roughness penalty in the data fitting criterion. A two-stage algorithm is developed for fast computation, where a raw estimate of the source time course is obtained in the first stage and then refined in the second stage by the two-way regularization. The proposed method is shown to be effective on both synthetic and real-world examples. © Institute of Mathematical Statistics, 2012.

  20. A two-way regularization method for MEG source reconstruction

    KAUST Repository

    Tian, Tian Siva

    2012-09-01

    The MEG inverse problem refers to the reconstruction of the neural activity of the brain from magnetoencephalography (MEG) measurements. We propose a two-way regularization (TWR) method to solve the MEG inverse problem under the assumptions that only a small number of locations in space are responsible for the measured signals (focality), and each source time course is smooth in time (smoothness). The focality and smoothness of the reconstructed signals are ensured respectively by imposing a sparsity-inducing penalty and a roughness penalty in the data fitting criterion. A two-stage algorithm is developed for fast computation, where a raw estimate of the source time course is obtained in the first stage and then refined in the second stage by the two-way regularization. The proposed method is shown to be effective on both synthetic and real-world examples. © Institute of Mathematical Statistics, 2012.
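
    The two penalties described in this record (a sparsity-inducing penalty for focality and a roughness penalty for temporal smoothness) can be illustrated with a much simpler solver than the authors' two-stage algorithm. The sketch below uses plain proximal gradient descent with an entrywise l1 penalty and a second-difference roughness penalty; the synthetic forward matrix, penalty weights and data are all assumptions for illustration, not the paper's setup.

```python
import numpy as np

def second_difference(n):
    # Rows encode the discrete second difference used as a roughness operator.
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def twr_estimate(Y, G, lam_sparse=1.0, lam_smooth=1.0, n_iter=500):
    """Y: sensors x time, G: sensors x sources. Minimizes
    0.5*||Y - G S||^2 + 0.5*lam_smooth*||S D^T||^2 + lam_sparse*||S||_1."""
    n_src, n_t = G.shape[1], Y.shape[1]
    D = second_difference(n_t)
    S = np.zeros((n_src, n_t))
    # Lipschitz constant of the smooth part's gradient (data fit + roughness).
    L = np.linalg.norm(G, 2) ** 2 + lam_smooth * np.linalg.norm(D, 2) ** 2
    for _ in range(n_iter):
        grad = G.T @ (G @ S - Y) + lam_smooth * S @ (D.T @ D)
        Z = S - grad / L
        # Soft-thresholding step enforces (entrywise) sparsity, i.e. focality.
        S = np.sign(Z) * np.maximum(np.abs(Z) - lam_sparse / L, 0.0)
    return S

# Synthetic example: 30 sensors, 80 candidate sources, 2 truly active sources.
rng = np.random.default_rng(0)
G = rng.normal(size=(30, 80))
t = np.linspace(0, 1, 100)
S_true = np.zeros((80, 100))
S_true[10] = np.sin(2 * np.pi * 3 * t)
S_true[40] = np.cos(2 * np.pi * 5 * t)
Y = G @ S_true + 0.05 * rng.normal(size=(30, 100))
S_hat = twr_estimate(Y, G)
print("rows with non-negligible energy:", np.sum(np.linalg.norm(S_hat, axis=1) > 1e-2))
```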

  1. Haemostatic reference intervals in pregnancy

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Jørgensen, Maja; Klajnbard, Anna

    2010-01-01

    Haemostatic reference intervals are generally based on samples from non-pregnant women. Thus, they may not be relevant to pregnant women, a problem that may hinder accurate diagnosis and treatment of haemostatic disorders during pregnancy. In this study, we establish gestational age-specific reference intervals. Plasma samples were obtained at gestational weeks 13-20, 21-28, 29-34, 35-42, at active labor, and on postpartum days 1 and 2. Reference intervals for each gestational period, using only the uncomplicated pregnancies, were calculated in all 391 women for activated partial thromboplastin time (aPTT), fibrinogen, fibrin D-dimer, antithrombin and free protein S. Most parameters remained largely unchanged during pregnancy, delivery, and postpartum and were within non-pregnant reference intervals. However, levels of fibrinogen, D-dimer, and coagulation factors VII, VIII, and IX increased markedly. Protein S activity decreased substantially, while free protein S decreased slightly and total...

  2. The relationship between peripheral intravenous catheter indwell time and the incidence of phlebitis.

    Science.gov (United States)

    Powell, Jessica; Tarnow, Karen Gahan; Perucca, Roxanne

    2008-01-01

    The purpose of this study was to determine any relationship between peripheral IV catheter indwell time and phlebitis in hospitalized adults. A retrospective review of quarterly quality assurance data (monitoring indwell time, phlebitis rating, and site and tubing labels) was performed. Of 1,161 sites, only 679 had documented indwell times available for analysis. The average indwell time was 1.9 days, and the overall phlebitis rate was 3.7%. Analysis of variance revealed a significant association between phlebitis and indwell time. However, asymptomatic peripheral IVs may not need to be removed at regular intervals, because there were healthy, asymptomatic sites with indwell times of up to 10 days.

  3. Acculturation and cancer screening among Asian Americans: role of health insurance and having a regular physician.

    Science.gov (United States)

    Lee, Sunmin; Chen, Lu; Jung, Mary Y; Baezconde-Garbanati, Lourdes; Juon, Hee-Soon

    2014-04-01

    Cancer is the leading cause of death among Asian Americans, but screening rates are significantly lower in Asians than in non-Hispanic Whites. This study examined associations between acculturation and three types of cancer screening (colorectal, cervical, and breast), focusing on the role of health insurance and having a regular physician. A cross-sectional study of 851 Chinese, Korean, and Vietnamese Americans was conducted in Maryland. Acculturation was measured using an abridged version of the Suinn-Lew Asian Self-Identity Acculturation Scale, acculturation clusters, language preference, length of residency in the US, and age at arrival. Age, health insurance, regular physician, gender, ethnicity, income, marital status, and health status were adjusted for in the multivariate analysis. Logistic regression analysis showed that various measures of acculturation were positively associated with the odds of having all cancer screenings. Those who had lived in the US for more than 20 years were about 2-4 times [odds ratio (OR) and 95% confidence interval (CI) colorectal: 2.41 (1.52-3.82); cervical: 1.79 (1.07-3.01); and breast: 2.11 (1.25-3.57)] more likely than those who had lived there for less than 10 years to have had cancer screening. When health insurance and having a regular physician were adjusted for, the association between length of residency and colorectal cancer screening [OR 1.72 (1.05-2.81)] was attenuated, and the associations between length of residency and cervical and breast cancer screening were no longer significant. Findings from this study provide a robust and comprehensive picture of Asian American cancer screening behavior. They will provide helpful information on future target groups for promoting cancer screening.

  4. High intensity interval and moderate continuous cycle training in a physical education programme improves health-related fitness in young females.

    Science.gov (United States)

    Mazurek, K; Zmijewski, P; Krawczyk, K; Czajkowska, A; Kęska, A; Kapuściński, P; Mazurek, T

    2016-06-01

    The aim of the study was to investigate the effects of eight weeks of regular physical education classes supplemented with high intensity interval cycle exercise (HIIE) or continuous cycle exercises of moderate intensity (CME). Forty-eight collegiate females exercising in two regular physical education classes per week were randomly assigned to two programmes (HIIE; n = 24 or CME; n = 24) of additional (one session of 63 minutes per week) physical activity for 8 weeks. Participants performed HIIE comprising 2 series of 6x10 s sprinting with maximal pedalling cadence and active recovery pedalling with intensity 65%-75% HRmax or performed CME corresponding to 65%-75% HRmax. Before and after the 8-week programmes, anthropometric data and aero- and anaerobic capacity were measured. Two-way ANOVA revealed a significant time main effect for VO2max (p body mass not changing significantly (p = 0.59; +0.4% in HIIE and -0.1% in CME). A significant main time effect was found for relative fat mass (FM) and fat-free mass (FFM) (p body composition than physical education classes supplemented with HIIE sessions. In contrast to earlier, smaller trials, similar improvements in aerobic capacity were observed following physical activity with additional HIIE or CME sessions.

  5. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
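
    The three sampling methods compared in this record are straightforward to simulate. The sketch below generates a simple behaviour stream and scores it with momentary time sampling, partial-interval recording, and whole-interval recording; the event durations, interval length and observation period are illustrative assumptions, not the parameters used in the original simulation.

```python
import numpy as np

# Simulate a boolean behaviour stream at 1-second resolution.
def simulate_stream(obs_len_s=3600, n_events=30, event_dur_s=20, seed=0):
    rng = np.random.default_rng(seed)
    stream = np.zeros(obs_len_s, dtype=bool)
    starts = rng.integers(0, obs_len_s - event_dur_s, size=n_events)
    for s in starts:
        stream[s:s + event_dur_s] = True
    return stream

def score_intervals(stream, interval_s=60):
    n = len(stream) // interval_s
    chunks = stream[:n * interval_s].reshape(n, interval_s)
    mts = chunks[:, -1].mean()        # momentary: behaviour present at the last moment
    pir = chunks.any(axis=1).mean()   # partial-interval: present at any time
    wir = chunks.all(axis=1).mean()   # whole-interval: present throughout
    return mts, pir, wir

stream = simulate_stream()
true_prop = stream.mean()
mts, pir, wir = score_intervals(stream)
print(f"true {true_prop:.3f}  MTS {mts:.3f}  "
      f"PIR {pir:.3f} (tends to overestimate)  WIR {wir:.3f} (tends to underestimate)")
```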

  6. The influence of the anesthesia-to-stimulation time interval on seizure quality parameters in electroconvulsive therapy

    DEFF Research Database (Denmark)

    Jorgensen, A; Christensen, S J; Jensen, A E K

    2018-01-01

    BACKGROUND: Electroconvulsive therapy (ECT) continues to be the most efficacious treatment for severe depression and other life-threatening acute psychiatric conditions. Treatment efficacy is dependent upon the induced seizure quality, which may be influenced by a range of treatment-related factors. Recently, the time interval from anesthesia to the electrical stimulation (ASTI) has been suggested to be an important determinant of seizure quality. METHODS: We measured ASTI in 73 ECT sessions given to 22 individual patients, and analyzed its influence on five seizure quality parameters (EEG seizure

  7. Regularities of Multifractal Measures

    Indian Academy of Sciences (India)

    First, we prove the decomposition theorem for the regularities of multifractal Hausdorff measure and packing measure in R^d. This decomposition theorem enables us to split a set into regular and irregular parts, so that we can analyze each separately, and recombine them without affecting density properties. Next, we ...

  8. Regularization of fields for self-force problems in curved spacetime: Foundations and a time-domain application

    International Nuclear Information System (INIS)

    Vega, Ian; Detweiler, Steven

    2008-01-01

    We propose an approach for the calculation of self-forces, energy fluxes and waveforms arising from moving point charges in curved spacetimes. As opposed to mode-sum schemes that regularize the self-force derived from the singular retarded field, this approach regularizes the retarded field itself. The singular part of the retarded field is first analytically identified and removed, yielding a finite, differentiable remainder from which the self-force is easily calculated. This regular remainder solves a wave equation which enjoys the benefit of having a nonsingular source. Solving this wave equation for the remainder completely avoids the calculation of the singular retarded field along with the attendant difficulties associated with numerically modeling a delta-function source. From this differentiable remainder one may compute the self-force, the energy flux, and also a waveform which reflects the effects of the self-force. As a test of principle, we implement this method using a 4th-order (1+1) code, and calculate the self-force for the simple case of a scalar charge moving in a circular orbit around a Schwarzschild black hole. We achieve agreement with frequency-domain results to ∼0.1% or better.

  9. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method...

  10. The Method of Lines Solution of the Regularized Long-Wave Equation Using Runge-Kutta Time Discretization Method

    Directory of Open Access Journals (Sweden)

    H. O. Bakodah

    2013-01-01

    Full Text Available A method of lines approach to the numerical solution of nonlinear wave equations typified by the regularized long wave (RLW) equation is presented. The method developed uses a finite difference discretization in space. The solution of the resulting system was obtained by applying the fourth-order Runge-Kutta time discretization method. Using von Neumann stability analysis, it is shown that the proposed method is marginally stable. To test the accuracy of the method, some numerical experiments on test problems are presented. Test problems including solitary wave motion, two-solitary-wave interaction, and the temporal evolution of a Maxwellian initial pulse are studied. The accuracy of the present method is tested with error norms, along with the conservation properties of mass, energy, and momentum under the RLW equation.
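
    A method-of-lines treatment of the RLW equation u_t + u_x + u*u_x - mu*u_xxt = 0 can be sketched by moving the mixed derivative term to the left-hand side, so that each time step solves (I - mu*D2) u_t = -d/dx(u + u^2/2), with central finite differences in space and the classical fourth-order Runge-Kutta method in time. The grid, time step and wave parameters below are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

L_dom, n, mu, c = 100.0, 400, 1.0, 0.1
x = np.linspace(0.0, L_dom, n, endpoint=False)
dx, dt = x[1] - x[0], 0.05

I = np.eye(n)
# Periodic second-difference matrix and the (constant) operator (I - mu*D2)^-1.
D2 = (np.roll(I, -1, axis=1) - 2 * I + np.roll(I, 1, axis=1)) / dx**2
A_inv = np.linalg.inv(I - mu * D2)

def rhs(u):
    flux = u + 0.5 * u**2                               # u_x + u*u_x = d/dx(u + u^2/2)
    flux_x = (np.roll(flux, -1) - np.roll(flux, 1)) / (2 * dx)
    return A_inv @ (-flux_x)

def rk4_step(u):
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Solitary-wave initial profile u = 3c sech^2(k (x - x0)).
k = 0.5 * np.sqrt(c / (mu * (1.0 + c)))
u = 3.0 * c / np.cosh(k * (x - 40.0))**2
mass0 = u.sum() * dx                                    # discrete mass invariant
for _ in range(200):
    u = rk4_step(u)
print("relative drift of the mass invariant:", abs(u.sum() * dx - mass0) / mass0)
```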

  11. Haemostatic reference intervals in pregnancy

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Jørgensen, Maja; Klajnbard, Anna

    2010-01-01

    Haemostatic reference intervals are generally based on samples from non-pregnant women. Thus, they may not be relevant to pregnant women, a problem that may hinder accurate diagnosis and treatment of haemostatic disorders during pregnancy. In this study, we establish gestational age-specific reference intervals for coagulation tests during normal pregnancy. Eight hundred one women with expected normal pregnancies were included in the study. Of these women, 391 had no complications during pregnancy, vaginal delivery, or postpartum period. Plasma samples were obtained at gestational weeks 13-20, 21-28, 29-34, 35-42, at active labor, and on postpartum days 1 and 2. Reference intervals for each gestational period using only the uncomplicated pregnancies were calculated in all 391 women for activated partial thromboplastin time (aPTT), fibrinogen, fibrin D-dimer, antithrombin, free protein S...

  12. Regular use of alcohol and tobacco in India and its association with age, gender, and poverty.

    Science.gov (United States)

    Neufeld, K J; Peters, D H; Rani, M; Bonu, S; Brooner, R K

    2005-03-07

    This study provides national estimates of regular tobacco and alcohol use in India and their associations with gender, age, and economic group obtained from a representative survey of 471,143 people over the age of 10 years in 1995-96, the National Sample Survey. The national prevalence of regular use of smoking tobacco is estimated to be 16.2%, chewing tobacco 14.0%, and alcohol 4.5%. Men were 25.5 times more likely than women to report regular smoking, 3.7 times more likely to regularly chew tobacco, and 9.7 times more likely to regularly use alcohol. Respondents belonging to scheduled castes and tribes (recognized disadvantaged groups) were significantly more likely to report regular use of alcohol as well as smoking and chewing tobacco. People from rural areas had higher rates compared to urban dwellers, as did those with no formal education. Individuals with incomes below the poverty line had higher relative odds of use of chewing tobacco and alcohol compared to those above the poverty line. The regular use of both tobacco and alcohol also increased significantly with each diminishing income quintile. Comparisons are made between these results and those found in the United States and elsewhere, highlighting the need to address control of these substances on the public health agenda.

  13. Dimensional regularization in position space and a forest formula for regularized Epstein-Glaser renormalization

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Kai Johannes

    2010-04-15

    The present work contains a consistent formulation of the methods of dimensional regularization (DimReg) and minimal subtraction (MS) in Minkowski position space. The methods are implemented into the framework of perturbative Algebraic Quantum Field Theory (pAQFT). The developed methods are used to solve the Epstein-Glaser recursion for the construction of time-ordered products in all orders of causal perturbation theory. A solution is given in terms of a forest formula in the sense of Zimmermann. A relation to the alternative approach to renormalization theory using Hopf algebras is established. (orig.)

  14. Dimensional regularization in position space and a forest formula for regularized Epstein-Glaser renormalization

    International Nuclear Information System (INIS)

    Keller, Kai Johannes

    2010-04-01

    The present work contains a consistent formulation of the methods of dimensional regularization (DimReg) and minimal subtraction (MS) in Minkowski position space. The methods are implemented into the framework of perturbative Algebraic Quantum Field Theory (pAQFT). The developed methods are used to solve the Epstein-Glaser recursion for the construction of time-ordered products in all orders of causal perturbation theory. A solution is given in terms of a forest formula in the sense of Zimmermann. A relation to the alternative approach to renormalization theory using Hopf algebras is established. (orig.)

  15. Condition Number Regularized Covariance Estimation.

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2013-06-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called "large p small n" setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumptions on either the covariance matrix or its inverse are imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required.
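
    The core idea of a condition-number constraint can be sketched very simply: shrink the sample eigenvalues into an interval [tau, kappa*tau] and choose tau by likelihood. The code below is a simplified stand-in (grid search over tau) rather than the authors' exact solution path or their adaptive choice of kappa; the data and kappa value are illustrative.

```python
import numpy as np

def cond_reg_cov(X, kappa_max=30.0, n_grid=500):
    """Return a covariance estimate whose condition number is at most kappa_max."""
    S = np.cov(X, rowvar=False, bias=True)
    eigval, eigvec = np.linalg.eigh(S)
    eigval = np.maximum(eigval, 0.0)
    taus = np.linspace(max(eigval.max() / kappa_max, 1e-12) * 1e-3,
                       eigval.max(), n_grid)
    best_tau, best_nll = None, np.inf
    for tau in taus:
        d = np.clip(eigval, tau, kappa_max * tau)
        nll = np.sum(np.log(d) + eigval / d)   # Gaussian NLL in eigen-coordinates
        if nll < best_nll:
            best_tau, best_nll = tau, nll
    d = np.clip(eigval, best_tau, kappa_max * best_tau)
    return (eigvec * d) @ eigvec.T

# "Large p, small n" toy example: p = 100 variables, n = 40 observations.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 100))
Sigma_hat = cond_reg_cov(X)
w = np.linalg.eigvalsh(Sigma_hat)
print("condition number of the estimate:", w.max() / w.min())   # bounded by kappa_max
```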

  16. Correction of engineering servicing regularity of transporttechnological machines in operational process

    Science.gov (United States)

    Makarova, A. N.; Makarov, E. I.; Zakharov, N. S.

    2018-03-01

    In the article, the issue of correcting engineering servicing regularity on the basis of actual dependability data of cars in operation is considered. The purpose of the conducted research is to increase the dependability of transport-technological machines by correcting engineering servicing regularity. The subject of the research is the mechanism by which engineering servicing regularity influences the reliability measure. On the basis of an analysis of earlier research, a method of nonparametric estimation of the car failure measure from actual time-to-failure data was chosen. The possibility of describing the dependence of the failure measure on engineering servicing regularity with various mathematical models is considered. It is shown that the exponential model is the most appropriate for that purpose. The obtained results can be used as a separate method of engineering servicing regularity correction, with certain operational conditions taken into account, as well as for improving the technical-economic and economic-stochastic methods. Thus, on the basis of the conducted research, a method of correcting the engineering servicing regularity of transport-technological machines in the operational process was developed. The use of that method will allow decreasing the number of failures.

  17. Regularization Techniques for Linear Least-Squares Problems

    KAUST Repository

    Suliman, Mohamed

    2016-04-01

    method deals with discrete ill-posed problems when the singular values of the linear transformation matrix are decaying very fast to a significantly small value. For both proposed algorithms, the regularization parameter is obtained as a solution of a non-linear characteristic equation. We provide a detailed study of the general properties of these functions and address the existence and uniqueness of the root. To demonstrate the performance of the derivations, the first proposed COPRA method is applied to estimate different signals with various characteristics, while the second proposed COPRA method is applied to a large set of different real-world discrete ill-posed problems. Simulation results demonstrate that the two proposed methods outperform a set of benchmark regularization algorithms in most cases. In addition, the algorithms are also shown to have the lowest run time.

  18. The impact of the time interval on in-vitro fertilisation success after failure of the first attempt.

    Science.gov (United States)

    Bayoglu Tekin, Y; Ceyhan, S T; Kilic, S; Korkmaz, C

    2015-05-01

    The aim of this study was to identify the optimal time interval for in-vitro fertilisation that would increase treatment success after failure of the first attempt. This retrospective study evaluated 454 consecutive cycles of 227 infertile women who had two consecutive attempts within a 6-month period at an IVF centre. Data were collected on duration of stimulation, consumption of gonadotropin, numbers of retrieved oocytes, mature oocytes, fertilised eggs, good quality embryos on day 3/5 following oocyte retrieval and clinical and ongoing pregnancy. There were significant increases in clinical pregnancy rates at 2-, 3- and 4-month intervals. The maximum increase was after two menstrual cycles (p = 0.001). The highest rate of ongoing pregnancy was in women that had the second attempt after the next menstrual cycle following failure of IVF (27.2%). After IVF failure, initiating the next attempt within 2-4 months increases the clinical pregnancy rates.

  19. Yield and quality of milk and udder health in Martina Franca ass: effects of daily interval and time of machine milking

    Directory of Open Access Journals (Sweden)

    Giovanni Martemucci

    2010-01-01

    Full Text Available Twenty asses of Martina Franca breed, machine milked twice a day, were used to assess the influence of milking interval (3-h, 5-h, and 8-h; N=5) and time (700, 1200 and 1900) on milk yield and udder health. Individual milk samples were taken to determine fat, protein and lactose content. A sensory analysis profile was also assessed. The milk's total bacterial count (TBC), somatic cell count (SCC) and the udder's skin temperature were considered to assess udder health. Milk yield increases by 28.4% (P<0.01) as the milking interval goes from 3-h to 8-h and is higher (P<0.01) at morning milking. The maximum milk yield per milking corresponds to the 700 milking (1416.9 mL), indicating a circadian rhythm in milk secretion processes. Milking intervals of 5 and 8 hours cause a decrease (P<0.01) in milk fat and lactose content. The 8-h interval leads to an increase (P<0.01) in SCC, but without any significance for udder health. No alterations in TBC, clinical evaluation or udder temperature were observed. Milk organoleptic characteristics were better with the 3-h milking interval.

  20. Reconstruction of dynamical systems from interspike intervals

    International Nuclear Information System (INIS)

    Sauer, T.

    1994-01-01

    Attractor reconstruction from interspike interval (ISI) data is described, in rough analogy with Takens' theorem for attractor reconstruction from time series. Assuming a generic integrate-and-fire model coupling the dynamical system to the spike train, there is a one-to-one correspondence between the system states and interspike interval vectors of sufficiently large dimension. The correspondence has an important implication: interspike intervals can be forecast from past history. We show that deterministically driven ISI series can be distinguished from stochastically driven ISI series on the basis of prediction error
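
    The forecasting idea stated above can be illustrated with a very small experiment: generate interspike intervals from an integrate-and-fire unit, delay-embed the ISI series, and predict the next interval from nearest neighbours. The quasi-periodic drive below is only a simple deterministic stand-in (not the chaotic systems used in the paper), and all parameters are illustrative.

```python
import numpy as np

def integrate_and_fire(drive, dt=1e-3, theta=1.0, t_max=200.0):
    # Accumulate the drive until the threshold theta is crossed, then reset.
    t, acc, spikes = 0.0, 0.0, []
    while t < t_max:
        acc += drive(t) * dt
        if acc >= theta:
            spikes.append(t)
            acc = 0.0
        t += dt
    return np.diff(spikes)                      # interspike intervals

def predict_next(isi, dim=3, n_neighbors=4):
    # Training pairs: embedding vector isi[i:i+dim] -> following interval isi[i+dim].
    vectors = np.array([isi[i:i + dim] for i in range(len(isi) - dim)])
    targets = np.array([isi[i + dim] for i in range(len(isi) - dim)])
    query = isi[-dim:]
    idx = np.argsort(np.linalg.norm(vectors - query, axis=1))[:n_neighbors]
    return targets[idx].mean()

drive = lambda t: 2.0 + np.sin(2 * np.pi * 0.7 * t) + np.sin(np.pi * np.sqrt(2) * t)
isi = integrate_and_fire(drive)
prediction = predict_next(isi[:-1])             # forecast the held-out last ISI
print(f"predicted {prediction:.4f}  actual {isi[-1]:.4f}")
```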

  1. A HEMATOBIOCHEMICAL EVALUATION TO COMPARE THE EFFECTS OF HIGH INTENSITY INTERVAL TRAINING AND AEROBIC EXERCISE TO CONTROL DIABETES MELLITUS AND ITS COMPLICATIONS

    Directory of Open Access Journals (Sweden)

    Muneeb Iqbal

    2016-06-01

    Full Text Available Background: Diabetes has become a very common disease all over the world in the last few decades and is now perceived as a global health disorder. Diabetes mellitus is identified on the basis of a constantly high blood glucose concentration and mainly occurs due to deficiency of the pancreatic hormone insulin. High-intensity interval training (HIIT) is an enhanced form of interval training, an exercise strategy that alternates periods of short, intense anaerobic exercise with less-intense recovery periods. The study aimed to compare the hematological parameters associated with diabetes and muscle activity between healthy humans and type-1 diabetic patients when subjected to HIIT and regular aerobic exercises. Methods: A convenience sample of 60 participants was taken. It comprised thirty healthy individuals from the Department of Physical Therapy, University of Sargodha, Lahore campus, and thirty type-1 diabetic individuals aged 15-30 years from the Akhuwat health services clinic, Township, Lahore. Participants were divided into four groups of fifteen individuals each. Group one was the diabetic HIIT (DH) group, with type-1 diabetic patients subjected to HIIT. Group two was the diabetic aerobic (DA) group, with type-1 diabetic patients subjected to regular aerobic exercises. Group three was the control HIIT (HH) group, consisting of fifteen healthy individuals subjected to high-intensity interval training exercises. Group four (HA) was the control aerobic group, with fifteen healthy individuals of average lifestyle subjected to regular aerobic exercises. Results: Aerobic exercise was found to be more effective in reducing glucose level and lowering exogenous insulin and glycated hemoglobin, whereas HIIT proved to be more effective in lowering blood cholesterol, decreasing LDL and increasing HDL levels. Conclusion: It was concluded that aerobic exercise program in comparison to high

  2. High Intensity Interval Training for Maximizing Health Outcomes.

    Science.gov (United States)

    Karlsen, Trine; Aamot, Inger-Lise; Haykowsky, Mark; Rognmo, Øivind

    Regular physical activity and exercise training are important actions to improve cardiorespiratory fitness and maintain health throughout life. There is solid evidence that exercise is an effective preventative strategy against at least 25 medical conditions, including cardiovascular disease, stroke, hypertension, colon and breast cancer, and type 2 diabetes. Traditionally, endurance exercise training (ET) to improve health related outcomes has consisted of low- to moderate ET intensity. However, a growing body of evidence suggests that higher exercise intensities may be superior to moderate intensity for maximizing health outcomes. The primary objective of this review is to discuss how aerobic high-intensity interval training (HIIT) as compared to moderate continuous training may maximize outcomes, and to provide practical advices for successful clinical and home-based HIIT. Copyright © 2017. Published by Elsevier Inc.

  3. OPTIMASI OLSR ROUTING PROTOCOL PADA JARINGAN WIRELESS MESH DENGAN ADAPTIVE REFRESHING TIME INTERVAL DAN ENHANCE MULTI POINT RELAY SELECTING ALGORITHM

    Directory of Open Access Journals (Sweden)

    Faosan Mapa

    2014-01-01

    Full Text Available A Wireless Mesh Network (WMN) is a self-organized, self-configured, multi-hop form of network connectivity. The goal of a WMN is to offer users a wireless network that can easily communicate with conventional networks at high speed, with wider coverage and minimal up-front cost. An efficient routing protocol design is needed for WMNs, one that can adaptively support mesh routers and mesh clients. In this paper, an optimization of the OLSR protocol, a proactive routing protocol, is proposed. A heuristic is used that improves the OLSR protocol through an adaptive refreshing time interval and an enhanced MPR (multipoint relay) selecting algorithm. An analysis of the improved OLSR protocol, with the adaptive refreshing time interval and the enhanced MPR selection algorithm, shows a significant performance gain in terms of throughput compared with the original OLSR protocol. However, there is an increase in delay. From the simulations it can be concluded that OLSR can be optimized by modifying the selection of MPR nodes on a cost-effective basis and by adjusting the refreshing time interval of hello messages according to the prevailing conditions.
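
    The MPR selection step mentioned above can be sketched with the usual greedy covering heuristic: pick the 1-hop neighbours that cover all 2-hop neighbours. The optional "cost" tiebreak below is an illustrative assumption standing in for the cost-effective selection described in the abstract, not the authors' exact algorithm.

```python
# Greedy multipoint relay (MPR) selection for OLSR-style routing.
def select_mpr(coverage, cost=None):
    """coverage: dict mapping each 1-hop neighbour -> set of 2-hop neighbours it
    reaches. Returns the chosen MPR set."""
    cost = cost or {n: 1.0 for n in coverage}
    uncovered = set().union(*coverage.values())
    mprs = set()

    # Rule 1: a neighbour that is the sole path to some 2-hop node is mandatory.
    for two_hop in set(uncovered):
        reachers = [n for n, cov in coverage.items() if two_hop in cov]
        if len(reachers) == 1:
            mprs.add(reachers[0])
    for m in mprs:
        uncovered -= coverage[m]

    # Rule 2: greedily add the neighbour covering the most uncovered 2-hop nodes,
    # preferring lower cost on ties.
    while uncovered:
        best = max(coverage, key=lambda n: (len(coverage[n] & uncovered), -cost[n]))
        mprs.add(best)
        uncovered -= coverage[best]
    return mprs

# Toy topology: node A's 1-hop neighbours and the 2-hop nodes each can reach.
coverage = {"B": {"E", "F"}, "C": {"F", "G", "H"}, "D": {"H"}}
print(select_mpr(coverage, cost={"B": 1.0, "C": 2.0, "D": 1.0}))
```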

  4. Accelerating Large Data Analysis By Exploiting Regularities

    Science.gov (United States)

    Moran, Patrick J.; Ellsworth, David

    2003-01-01

    We present techniques for discovering and exploiting regularity in large curvilinear data sets. The data can be based on a single mesh or a mesh composed of multiple submeshes (also known as zones). Multi-zone data are typical to Computational Fluid Dynamics (CFD) simulations. Regularities include axis-aligned rectilinear and cylindrical meshes as well as cases where one zone is equivalent to a rigid-body transformation of another. Our algorithms can also discover rigid-body motion of meshes in time-series data. Next, we describe a data model where we can utilize the results from the discovery process in order to accelerate large data visualizations. Where possible, we replace general curvilinear zones with rectilinear or cylindrical zones. In rigid-body motion cases we replace a time-series of meshes with a transformed mesh object where a reference mesh is dynamically transformed based on a given time value in order to satisfy geometry requests, on demand. The data model enables us to make these substitutions and dynamic transformations transparently with respect to the visualization algorithms. We present results with large data sets where we combine our mesh replacement and transformation techniques with out-of-core paging in order to achieve significant speed-ups in analysis.

  5. Are Long-Term Chloroquine or Hydroxychloroquine Users Being Checked Regularly for Toxic Maculopathy?

    Science.gov (United States)

    Nika, Melisa; Blachley, Taylor S.; Edwards, Paul; Lee, Paul P.; Stein, Joshua D.

    2014-01-01

    Importance According to evidence-based, expert recommendations, long-term users of chloroquine (CQ) or hydroxychloroquine (HCQ) should undergo regular visits to eye-care providers and diagnostic testing to check for maculopathy. Objective To determine whether patients with rheumatoid arthritis (RA) or systemic lupus erythematosus (SLE) taking CQ or HCQ are regularly visiting eye-care providers and being screened for maculopathy. Setting, Design and Participants Patients with RA or SLE who were continuously enrolled in a particular managed-care network for ≥5 years during 2001-2011 were studied. Patients' amount of CQ/HCQ use in the 5 years since initial RA/SLE diagnosis was calculated, along with their number of eye-care visits and diagnostic tests for maculopathy. Those at high risk for maculopathy were identified. Visits to eye providers and diagnostic testing for maculopathy were assessed for each enrollee over the study period. Logistic regression was performed to assess potential factors associated with regular eye-care-provider visits (≥3 in 5 years) among CQ/HCQ users, including those at greatest risk for maculopathy. Main Outcome Measures Among CQ/HCQ users and those at high risk for toxic maculopathy, the proportions with regular eye-care visits and diagnostic testing, and the likelihood of regular eye-care visits (odds ratios [ORs] with 95% confidence intervals [CI]). Results Among 18,051 beneficiaries with RA or SLE, 6,339 (35.1%) had ≥1 record of HCQ/CQ use and 1,409 (7.8%) used HCQ/CQ for ≥4 years. Among those at high risk for maculopathy, 27.9% lacked regular eye-provider visits, 6.1% had no visits to eye providers, and 34.5% had no diagnostic testing for maculopathy during the 5-year period. Among high-risk patients, each additional month of HCQ/CQ use was associated with a 2.0%-increased likelihood of regular eye care (adjusted OR=1.02, CI=1.01-1.03). High-risk patients whose SLE/RA were managed by rheumatologists had a 77%-increased

  6. A novel interval type-2 fractional order fuzzy PID controller: Design, performance evaluation, and its optimal time domain tuning.

    Science.gov (United States)

    Kumar, Anupam; Kumar, Vijay

    2017-05-01

    In this paper, a novel concept of an interval type-2 fractional order fuzzy PID (IT2FO-FPID) controller, which requires fractional order integrator and fractional order differentiator, is proposed. The incorporation of Takagi-Sugeno-Kang (TSK) type interval type-2 fuzzy logic controller (IT2FLC) with fractional controller of PID-type is investigated for time response measure due to both unit step response and unit load disturbance. The resulting IT2FO-FPID controller is examined on different delayed linear and nonlinear benchmark plants followed by robustness analysis. In order to design this controller, fractional order integrator-differentiator operators are considered as design variables including input-output scaling factors. A new hybridized algorithm named as artificial bee colony-genetic algorithm (ABC-GA) is used to optimize the parameters of the controller while minimizing weighted sum of integral of time absolute error (ITAE) and integral of square of control output (ISCO). To assess the comparative performance of the IT2FO-FPID, authors compared it against existing controllers, i.e., interval type-2 fuzzy PID (IT2-FPID), type-1 fractional order fuzzy PID (T1FO-FPID), type-1 fuzzy PID (T1-FPID), and conventional PID controllers. Furthermore, to show the effectiveness of the proposed controller, the perturbed processes along with the larger dead time are tested. Moreover, the proposed controllers are also implemented on multi input multi output (MIMO), coupled, and highly complex nonlinear two-link robot manipulator system in presence of un-modeled dynamics. Finally, the simulation results explicitly indicate that the performance of the proposed IT2FO-FPID controller is superior to its conventional counterparts in most of the cases. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Right Propositional Neighborhood Logic over Natural Numbers with Integer Constraints for Interval Lengths

    DEFF Research Database (Denmark)

    Bresolin, Davide; Goranko, Valentin; Montanari, Angelo

    2009-01-01

    Interval temporal logics are based on interval structures over linearly (or partially) ordered domains, where time intervals, rather than time instants, are the primitive ontological entities. In this paper we introduce and study Right Propositional Neighborhood Logic over natural numbers...... with integer constraints for interval lengths, which is a propositional interval temporal logic featuring a modality for the 'right neighborhood' relation between intervals and explicit integer constraints for interval lengths. We prove that it has the bounded model property with respect to ultimately periodic...

  8. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    Directory of Open Access Journals (Sweden)

    Jean-Louis eHoneine

    2014-10-01

    Full Text Available Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs, (b) move from allocentric to egocentric reference or vice versa, and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, in the order of 1-2 s for different postural conditions, modalities and deliberate or passive shift. They are just longer for haptic than visual shift, just shorter on withdrawal than on addition of stabilizing input, and on deliberate than unexpected mode. The delays are the shortest (for haptic shift) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training devices

  9. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    Science.gov (United States)

    Honeine, Jean-Louis; Schieppati, Marco

    2014-01-01

    Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from allocentric to egocentric reference or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, in the order of 1–2 s for different postural conditions, modalities and deliberate or passive shift. They are just longer for haptic than visual shift, just shorter on withdrawal than on addition of stabilizing input, and on deliberate than unexpected mode. The delays are the shortest (for haptic shift) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training devices

  10. Condition Number Regularized Covariance Estimation*

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2012-01-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called “large p small n” setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumptions on either the covariance matrix or its inverse are imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required. PMID:23730197

  11. Geosocial process and its regularities

    Science.gov (United States)

    Vikulina, Marina; Vikulin, Alexander; Dolgaya, Anna

    2015-04-01

    Natural disasters and social events (wars, revolutions, genocides, epidemics, fires, etc.) accompany each other throughout human civilization, reflecting the close relationship between these seemingly different phenomena. In order to study this relationship, the authors compiled and analyzed a list of 2,400 natural disasters and social phenomena, weighted by their magnitude, that occurred during the last 36 centuries of our history. Statistical analysis was performed separately for each aggregate (natural disasters and social phenomena) and for particular statistically representative types of events; there were 5 + 5 = 10 types. It is shown that the numbers of events in the list follow a logarithmic law: the bigger the event, the less likely it is to happen. For each type of event and each aggregate, the existence of periodicities with periods of 280 ± 60 years was established. Statistical analysis of the time intervals between adjacent events for both aggregates showed good agreement with the Weibull-Gnedenko distribution with a shape parameter less than 1, which is equivalent to concluding that events group at small time intervals. Modeling the statistics of time intervals with a Pareto distribution allowed an emergent property to be identified for all events in the aggregate. This result allowed the authors to conclude that natural disasters and social phenomena interact. The list of events compiled by the authors, and the properties of cyclicity, grouping and interaction first identified from it, form the basis for modeling an essentially unified geosocial process at a sufficiently high statistical level. Proof of interaction between "lifeless" Nature and Society is fundamental and provides a new approach to forecasting demographic crises that takes both natural disasters and social phenomena into account.
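
    The interval analysis described above (fitting a Weibull distribution to waiting times and reading off a shape parameter below 1 as evidence of clustering) is easy to reproduce on synthetic data. The catalogue below is illustrative, not the authors' list of events.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_shape = 0.7                                   # shape < 1: events cluster in time
intervals = stats.weibull_min.rvs(true_shape, scale=10.0, size=2000, random_state=rng)

# Fit with the location fixed at zero and inspect the estimated shape parameter.
shape, loc, scale = stats.weibull_min.fit(intervals, floc=0)
print(f"fitted shape = {shape:.3f} (shape < 1 implies grouping at small intervals)")

# Shape exactly 1 would reduce to the memoryless exponential case.
ks = stats.kstest(intervals, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```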

  12. The use of a DNA stabilizer in human dental tissues stored under different temperature conditions and time intervals

    Science.gov (United States)

    TERADA, Andrea Sayuri Silveira Dias; da SILVA, Luiz Antonio Ferreira; GALO, Rodrigo; de AZEVEDO, Aline; GERLACH, Raquel Fernanda; da SILVA, Ricardo Henrique Alves

    2014-01-01

    Objective The present study evaluated the use of a reagent to stabilize the DNA extracted from human dental tissues stored under different temperature conditions and time intervals. Material and Methods A total of 161 teeth were divided into two distinct groups: intact teeth and isolated dental pulp tissue. The samples were stored with or without the product at different time intervals and temperatures. After storage, DNA extraction and genomic DNA quantification were performed using real-time PCR; the fragments of the 32 samples that represented each possible condition were analyzed to find the four pre-selected markers in STR analysis. Results The quantification showed values ranging from 0.01 to 10,246.88 ng/μL of DNA. A statistically significant difference in the quantity of DNA was observed when the factors related to the time and temperature of storage were analyzed. In relation to the specific reagent, its use was relevant in the group of intact teeth kept at room temperature for 30 and 180 days. The analysis of the fragments in the 32 selected samples was possible irrespective of the amount of DNA, confirming that STR analysis using an automated method yields good results. Conclusions The use of a specific reagent showed a significant difference in stabilizing DNA in samples of intact human teeth stored at room temperature for 30 and 180 days, while the results showed no justification for using the product under the other conditions tested. PMID:25141206

  13. Mathematical Modeling the Geometric Regularity in Proteus Mirabilis Colonies

    Science.gov (United States)

    Zhang, Bin; Jiang, Yi; Minsu Kim Collaboration

    Proteus Mirabilis colonies exhibit striking spatiotemporal regularity, with concentric ring patterns of alternating high and low bacterial density in space, and periodic repetition of the growth-and-swarm process in time. We present a simple mathematical model to explain the spatiotemporal regularity of P. Mirabilis colonies. We study a one-dimensional system. Using a reaction-diffusion model with thresholds in cell density and nutrient concentration, we recreated periodic growth and spread patterns, suggesting that the nutrient constraint and cell density regulation might be sufficient to explain the spatiotemporal periodicity in P. Mirabilis colonies. We further verify this result using a cell-based model.
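
    A threshold-gated reaction-diffusion model of the kind named above can be coded in a few lines: cells grow by consuming nutrient everywhere, but diffuse (swarm) only where both the local cell density and the nutrient concentration exceed thresholds. The sketch below uses illustrative rates and thresholds, not the authors' calibration, and simply reports the front position; plotting the density profile lets one look for the alternating-band structure discussed in the abstract.

```python
import numpy as np

n, dx, dt = 400, 1.0, 0.05
D, growth, uptake = 1.0, 0.8, 0.4          # illustrative rate constants
rho_thr, nut_thr = 0.25, 0.2               # density and nutrient thresholds

rho = np.zeros(n); rho[:5] = 1.0           # inoculum at the left boundary
nut = np.ones(n)                           # initially uniform nutrient

def threshold_diffusion(rho, nut):
    # Conservative flux form; mobility is switched on per cell by the thresholds.
    mobile = ((rho > rho_thr) & (nut > nut_thr)).astype(float)
    D_face = D * 0.5 * (mobile[1:] + mobile[:-1])          # interface mobility
    flux = -D_face * (rho[1:] - rho[:-1]) / dx
    drho = np.zeros_like(rho)
    drho[:-1] -= flux / dx
    drho[1:] += flux / dx
    return drho

front = []
for step in range(20000):
    rho = rho + dt * (threshold_diffusion(rho, nut) + growth * rho * nut)
    nut = np.clip(nut + dt * (-uptake * rho * nut), 0.0, None)
    if step % 4000 == 0:
        occupied = np.nonzero(rho > rho_thr)[0]
        front.append(occupied.max() * dx if occupied.size else 0.0)

print("front position over time:", front)
```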

  14. Method of high precision interval measurement in pulse laser ranging system

    Science.gov (United States)

    Wang, Zhen; Lv, Xin-yuan; Mao, Jin-jin; Liu, Wei; Yang, Dong

    2013-09-01

    Laser ranging is well suited to laser systems, as it offers high measuring precision, fast measuring speed, no need for cooperative targets, and strong resistance to electromagnetic interference; the time-interval measurement is the key parameter affecting the performance of the whole system. The precision of a pulsed laser ranging system is determined by the precision of the time interval measurement. The principal structure of the laser ranging system is introduced, and a method of high precision time interval measurement in a pulsed laser ranging system is established in this paper. Based on the analysis of the factors that affect the precision of range measurement, a pulse rising-edge discriminator was adopted to produce the timing mark for the start-stop time discrimination, and a TDC-GP2 high precision interval measurement system based on a TMS320F2812 DSP was designed to improve the measurement precision. Experimental results indicate that the time interval measurement method in this paper can obtain higher range accuracy. Compared with the traditional time interval measurement system, the method simplifies the system design and reduces the influence of bad weather conditions; furthermore, it satisfies the requirements of low cost and miniaturization.

  15. Regular-soda intake independent of weight status is associated with asthma among US high school students.

    Science.gov (United States)

    Park, Sohyun; Blanck, Heidi M; Sherry, Bettylou; Jones, Sherry Everett; Pan, Liping

    2013-01-01

    Limited research shows an inconclusive association between soda intake and asthma, potentially attributable to certain preservatives in sodas. This cross-sectional study examined the association between regular (nondiet)-soda intake and current asthma among a nationally representative sample of high school students. Analysis was based on the 2009 national Youth Risk Behavior Survey and included 15,960 students (grades 9 through 12) with data for both regular-soda intake and current asthma status. The outcome measure was current asthma (ie, told by doctor/nurse that they had asthma and still have asthma). The main exposure variable was regular-soda intake (ie, drank a can/bottle/glass of soda during the 7 days before the survey). Multivariable logistic regression was used to estimate the adjusted odds ratios for regular-soda intake with current asthma after controlling for age, sex, race/ethnicity, weight status, and current cigarette use. Overall, 10.8% of students had current asthma. In addition, 9.7% of students who did not drink regular soda had current asthma, and 14.7% of students who drank regular soda three or more times per day had current asthma. Compared with those who did not drink regular soda, odds of having current asthma were higher among students who drank regular soda two times per day (adjusted odds ratio=1.28; 95% CI 1.02 to 1.62) and three or more times per day (adjusted odds ratio=1.64; 95% CI 1.25 to 2.16). The association between high regular-soda intake and current asthma suggests efforts to reduce regular-soda intake among youth might have benefits beyond improving diet quality. However, this association needs additional research, such as a longitudinal examination. Published by Elsevier Inc.

  16. Brain response during the M170 time interval is sensitive to socially relevant information.

    Science.gov (United States)

    Arviv, Oshrit; Goldstein, Abraham; Weeting, Janine C; Becker, Eni S; Lange, Wolf-Gero; Gilboa-Schechtman, Eva

    2015-11-01

    Deciphering the social meaning of facial displays is a highly complex neurological process. The M170, an event related field component of MEG recording, like its EEG counterpart N170, was repeatedly shown to be associated with structural encoding of faces. However, the scope of information encoded during the M170 time window is still being debated. We investigated the neuronal origin of facial processing of integrated social rank cues (SRCs) and emotional facial expressions (EFEs) during the M170 time interval. Participants viewed integrated facial displays of emotion (happy, angry, neutral) and SRCs (indicated by upward, downward, or straight head tilts). We found that the activity during the M170 time window is sensitive to both EFEs and SRCs. Specifically, highly prominent activation was observed in response to SRC connoting dominance as compared to submissive or egalitarian head cues. Interestingly, the processing of EFEs and SRCs appeared to rely on different circuitry. Our findings suggest that vertical head tilts are processed not only for their sheer structural variance, but as social information. Exploring the temporal unfolding and brain localization of non-verbal cues processing may assist in understanding the functioning of the social rank biobehavioral system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Convex Interval Games

    NARCIS (Netherlands)

    Alparslan-Gok, S.Z.; Brânzei, R.; Tijs, S.H.

    2008-01-01

    In this paper, convex interval games are introduced and some characterizations are given. Some economic situations leading to convex interval games are discussed. The Weber set and the Shapley value are defined for a suitable class of interval games and their relations with the interval core for

  18. Sparse regularization for force identification using dictionaries

    Science.gov (United States)

    Qiao, Baijie; Zhang, Xingwu; Wang, Chenxi; Zhang, Hang; Chen, Xuefeng

    2016-04-01

    The classical function expansion method based on minimizing l2-norm of the response residual employs various basis functions to represent the unknown force. Its difficulty lies in determining the optimum number of basis functions. Considering the sparsity of force in the time domain or in other basis space, we develop a general sparse regularization method based on minimizing l1-norm of the coefficient vector of basis functions. The number of basis functions is adaptively determined by minimizing the number of nonzero components in the coefficient vector during the sparse regularization process. First, according to the profile of the unknown force, the dictionary composed of basis functions is determined. Second, a sparsity convex optimization model for force identification is constructed. Third, given the transfer function and the operational response, Sparse reconstruction by separable approximation (SpaRSA) is developed to solve the sparse regularization problem of force identification. Finally, experiments including identification of impact and harmonic forces are conducted on a cantilever thin plate structure to illustrate the effectiveness and applicability of SpaRSA. Besides the Dirac dictionary, other three sparse dictionaries including Db6 wavelets, Sym4 wavelets and cubic B-spline functions can also accurately identify both the single and double impact forces from highly noisy responses in a sparse representation frame. The discrete cosine functions can also successfully reconstruct the harmonic forces including the sinusoidal, square and triangular forces. Conversely, the traditional Tikhonov regularization method with the L-curve criterion fails to identify both the impact and harmonic forces in these cases.
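
    The l1-regularized formulation described above can be illustrated with a much simpler solver than SpaRSA: plain iterative soft-thresholding (ISTA) applied to min 0.5*||y - A x||^2 + lam*||x||_1 in the Dirac (identity) dictionary. The transfer matrix, impulse response, impacts and regularization weight below are synthetic assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, lam, n_iter=2000):
    # Iterative soft-thresholding for 0.5*||y - A x||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# Synthetic example: a convolution (transfer) matrix and two impact events.
rng = np.random.default_rng(0)
n = 200
t = np.arange(n)
h = np.exp(-t / 15.0) * np.cos(2 * np.pi * t / 25.0)   # decaying oscillatory response
A = np.column_stack([np.roll(np.concatenate([h, np.zeros(n)]), k)[:n] for k in range(n)])
force_true = np.zeros(n)
force_true[[40, 120]] = [3.0, -2.0]                     # two impact forces
y = A @ force_true + 0.01 * rng.normal(size=n)

force_hat = ista(A, y, lam=0.05)
print("indices with |force| > 0.5:", np.nonzero(np.abs(force_hat) > 0.5)[0])
```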

  19. Participation in regular leisure-time physical activity among individuals with type 2 diabetes not meeting Canadian guidelines: the influence of intention, perceived behavioral control, and moral norm.

    Science.gov (United States)

    Boudreau, François; Godin, Gaston

    2014-12-01

    Most people with type 2 diabetes do not engage in regular leisure-time physical activity. The theory of planned behavior and the moral norm construct can enhance our understanding of physical activity intention and behavior among this population. This study aims to identify the determinants of both intention and behavior for participation in regular leisure-time physical activity among individuals with type 2 diabetes who do not meet Canada's physical activity guidelines. Using secondary data analysis of a randomized computer-tailored print-based intervention, participants (n = 200) from the province of Quebec (Canada) completed and returned a baseline questionnaire measuring their attitude, perceived behavioral control, and moral norm. One month later, they self-reported their level of leisure-time physical activity. A hierarchical regression equation showed that attitude (beta = 0.10) and moral norm (beta = 0.45) were significant determinants of intention, and that the effect of moral norm on behavior was mediated by intention and perceived behavioral control. The determinants investigated offered an excellent starting point for designing appropriate counseling messages to promote leisure-time physical activity among individuals with type 2 diabetes.

  20. Spectral of electrocardiographic RR intervals to indicate atrial fibrillation

    Science.gov (United States)

    Nuryani, Nuryani; Satrio Nugroho, Anto

    2017-11-01

    Atrial fibrillation is a serious heart disease associated with an increased risk of death, and thus early detection of atrial fibrillation is necessary. We investigated the spectral pattern of the electrocardiogram in relation to atrial fibrillation. The utilized feature of the electrocardiogram is the RR interval, the time interval between two consecutive R peaks. A series of RR intervals in a time segment is converted to a signal in the frequency domain. The frequency components are examined to find those which significantly associate with atrial fibrillation. A segment is labeled as an atrial fibrillation or a normal segment by considering a defined number of atrial fibrillation RR intervals in the segment. Using clinical data of 23 patients with atrial fibrillation, we find that the frequency components could be used to indicate atrial fibrillation.
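
    A minimal sketch of the kind of spectral analysis described (not the authors' code): a segment of RR intervals is interpolated onto a uniform time grid and its power spectrum is computed; the resampling rate and the interpretation comment are illustrative assumptions.

        import numpy as np

        def rr_spectrum(rr_intervals, fs=4.0):
            # rr_intervals: RR intervals in seconds for one segment
            # fs: resampling frequency in Hz (the RR series is unevenly spaced in time)
            rr = np.asarray(rr_intervals, dtype=float)
            beat_times = np.cumsum(rr)                       # time of each R peak
            t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
            rr_uniform = np.interp(t_uniform, beat_times, rr)
            rr_uniform -= rr_uniform.mean()                  # remove the DC component
            power = np.abs(np.fft.rfft(rr_uniform)) ** 2
            freqs = np.fft.rfftfreq(len(rr_uniform), d=1.0 / fs)
            return freqs, power

        # A highly irregular (AF-like) segment spreads power across frequencies,
        # whereas a regular segment concentrates power near zero frequency.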

  1. Regular-, irregular-, and pseudo-character processing in Chinese: The regularity effect in normal adult readers

    Directory of Open Access Journals (Sweden)

    Dustin Kai Yan Lau

    2014-03-01

    Full Text Available Background Unlike alphabetic languages, Chinese uses a logographic script. However, the phonetic radical of many characters has the same pronunciation as the character as a whole. These are considered regular characters and can be read through a lexical non-semantic route (Weekes & Chen, 1999). Pseudocharacters are another way to study this non-semantic route. A pseudocharacter is the combination of existing semantic and phonetic radicals in their legal positions, resulting in a non-existing character (Ho, Chan, Chung, Lee, & Tsang, 2007). Pseudocharacters can be pronounced by direct derivation from the sound of the phonetic radical. Conversely, if the pronunciation of a character does not follow that of the phonetic radical, it is considered irregular and can only be read correctly through the lexical-semantic route. The aim of the current investigation was to examine reading aloud in normal adults. We hypothesized that the regularity effect, previously described for alphabetical scripts and acquired dyslexic patients of Chinese (Weekes & Chen, 1999; Wu, Liu, Sun, Chromik, & Zhang, 2014), would also be present in normal adult Chinese readers. Method Participants. Thirty (50% female) native Hong Kong Cantonese speakers with a mean age of 19.6 years and a mean education of 12.9 years. Stimuli. Sixty regular-, 60 irregular-, and 60 pseudo-characters (with at least 75% name agreement) in Chinese were matched by initial phoneme, number of strokes and family size. Additionally, regular- and irregular-characters were matched by frequency (low) and consistency. Procedure. Each participant was asked to read aloud the stimuli presented on a laptop using the DMDX software. The order of stimuli presentation was randomized. Data analysis. ANOVAs were carried out by participants and items, with RTs and errors as dependent variables and type of stimuli (regular-, irregular-, and pseudo-character) as a repeated-measures factor (F1) or a between-item factor (F2).

  2. Overconfidence in Interval Estimates

    Science.gov (United States)

    Soll, Jack B.; Klayman, Joshua

    2004-01-01

    Judges were asked to make numerical estimates (e.g., "In what year was the first flight of a hot air balloon?"). Judges provided high and low estimates such that they were X% sure that the correct answer lay between them. They exhibited substantial overconfidence: The correct answer fell inside their intervals much less than X% of the time. This…

  3. Regularity for a clamped grid equation $u_{xxxx}+u_{yyyy}=f $ on a domain with a corner

    Directory of Open Access Journals (Sweden)

    Tymofiy Gerasimov

    2009-04-01

    Full Text Available The operator $L=\frac{\partial^{4}}{\partial x^{4}}+\frac{\partial^{4}}{\partial y^{4}}$ appears in a model for the vertical displacement of a two-dimensional grid that consists of two perpendicular sets of elastic fibers or rods. We are interested in the behaviour of such a grid that is clamped at the boundary, and more specifically near a corner of the domain. Kondratiev supplied the appropriate setting in the sense of Sobolev-type spaces tailored to find the optimal regularity. Inspired by the Laplacian and the Bilaplacian models, one expects, except maybe for some special angles, that the optimal regularity improves when the angle decreases. For the homogeneous Dirichlet problem with this special non-isotropic fourth-order operator such a result does not hold true. We will show the existence of an interval $(\frac{1}{2}\pi, \omega_{\star})$, $\omega_{\star}/\pi \approx 0.528\dots$ (in degrees $\omega_{\star} \approx 95.1\dots^{\circ}$), in which the optimal regularity improves with increasing opening angle.

  4. Total variation regularization for fMRI-based prediction of behavior

    Science.gov (United States)

    Michel, Vincent; Gramfort, Alexandre; Varoquaux, Gaël; Eger, Evelyn; Thirion, Bertrand

    2011-01-01

    While medical imaging typically provides massive amounts of data, the extraction of relevant information for predictive diagnosis remains a difficult challenge. Functional MRI (fMRI) data, which provide an indirect measure of task-related or spontaneous neuronal activity, are classically analyzed in a mass-univariate procedure yielding statistical parametric maps. This analysis framework disregards some important principles of brain organization: population coding, distributed and overlapping representations. Multivariate pattern analysis, i.e., the prediction of behavioural variables from brain activation patterns, better captures this structure. To cope with the high dimensionality of the data, the learning method has to be regularized. However, the spatial structure of the image is not taken into account in standard regularization methods, so that the extracted features are often hard to interpret. More informative and interpretable results can be obtained with the ℓ1 norm of the image gradient, a.k.a. its Total Variation (TV), as regularization. We apply this method to fMRI data for the first time, and show that TV regularization is well suited to the purpose of brain mapping while being a powerful tool for brain decoding. Moreover, this article presents the first use of TV regularization for classification. PMID:21317080
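
    A toy, one-dimensional sketch of the idea (not the authors' fMRI decoding pipeline): linear weights are fit to predict a behavioural variable while a smoothed total-variation penalty favours piecewise-constant weight maps; the smoothing constant, step-size bound and other hyperparameters are illustrative assumptions.

        import numpy as np

        def tv_regularized_regression(X, y, lam=1.0, eps=1e-3, n_iter=2000):
            # minimize ||X w - y||^2 + lam * sum_i sqrt((w[i+1]-w[i])^2 + eps^2)
            # (a smoothed 1-D total-variation penalty, optimized by gradient descent)
            n_features = X.shape[1]
            w = np.zeros(n_features)
            # step size chosen from a bound on the gradient's Lipschitz constant
            step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2 + 4.0 * lam / eps)
            for _ in range(n_iter):
                grad_fit = 2.0 * X.T @ (X @ w - y)
                d = np.diff(w)                              # finite differences w[i+1] - w[i]
                s = d / np.sqrt(d ** 2 + eps ** 2)          # derivative of the smoothed |d|
                grad_tv = np.zeros_like(w)
                grad_tv[:-1] -= s                           # d/dw[i]   of |w[i+1] - w[i]|
                grad_tv[1:] += s                            # d/dw[i+1] of |w[i+1] - w[i]|
                w -= step * (grad_fit + lam * grad_tv)
            return w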

  5. Decentralized formation of random regular graphs for robust multi-agent networks

    KAUST Repository

    Yazicioglu, A. Yasin

    2014-12-15

    Multi-agent networks are often modeled via interaction graphs, where the nodes represent the agents and the edges denote direct interactions between the corresponding agents. Interaction graphs have significant impact on the robustness of networked systems. One family of robust graphs is the random regular graphs. In this paper, we present a locally applicable reconfiguration scheme to build random regular graphs through self-organization. For any connected initial graph, the proposed scheme maintains connectivity and the average degree while minimizing the degree differences and randomizing the links. As such, if the average degree of the initial graph is an integer, then connected regular graphs are realized uniformly at random as time goes to infinity.
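
    The paper's scheme is decentralized; as a centralized illustration of two of its ingredients (randomizing links while preserving degrees and connectivity), the networkx sketch below repeatedly applies connected double edge swaps to a regular graph. Graph size and degree are arbitrary choices.

        import networkx as nx

        # Start from a 4-regular graph on 50 nodes (regenerate in the unlikely
        # case it is disconnected), then randomize its links with degree- and
        # connectivity-preserving double edge swaps.
        G = nx.random_regular_graph(d=4, n=50, seed=1)
        while not nx.is_connected(G):
            G = nx.random_regular_graph(d=4, n=50)
        nx.connected_double_edge_swap(G, nswap=200, seed=1)

        assert nx.is_connected(G)
        assert all(deg == 4 for _, deg in G.degree())   # degrees are unchanged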

  6. Effect of time interval between capecitabine intake and radiotherapy on local recurrence-free survival in preoperative chemoradiation for locally advanced rectal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeon Joo; Kim, Jong Hoon; Yu, Chang Sik; Kim, Tae Won; Jang, Se Jin; Choi, Eun Kyung; Kim, Jin Cheon [Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Choi, Won Sik [University of Ulsan College of Medicine, Gangneung (Korea, Republic of)

    2017-06-15

    The concentration of capecitabine peaks at 1–2 hours after administration. We therefore assumed that proper timing of capecitabine administration and radiotherapy would maximize radiosensitization and influence survival among patients with locally advanced rectal cancer. We retrospectively reviewed 223 patients with locally advanced rectal cancer who underwent preoperative chemoradiation, followed by surgery, from January 2002 to May 2006. All patients underwent pelvic radiotherapy (50 Gy/25 fractions) and received capecitabine twice daily at 12-hour intervals (1,650 mg/m2/day). Patients were divided into two groups according to the time interval between capecitabine intake and radiotherapy. Patients who took capecitabine 1 hour before radiotherapy were classified as Group A (n = 109); all others were classified as Group B (n = 114). The median follow-up period was 72 months (range, 7 to 149 months). Although Group A had a significantly higher rate of good responses (44% vs. 25%; p = 0.005), the 5-year local recurrence-free survival rates of 93% in Group A and 97% in Group B did not differ significantly (p = 0.519). The 5-year disease-free survival and overall survival rates were also comparable between the groups. Despite the better pathological response in Group A, the time interval between capecitabine and radiotherapy administration did not have a significant effect on survival. Further evaluations are needed to clarify the interaction of these treatment modalities.

  7. RES: Regularized Stochastic BFGS Algorithm

    Science.gov (United States)

    Mokhtari, Aryan; Ribeiro, Alejandro

    2014-12-01

    RES, a regularized stochastic version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method, is proposed to solve convex optimization problems with stochastic objectives. The use of stochastic gradient descent algorithms is widespread, but the number of iterations required to approximate optimal arguments can be prohibitive in high-dimensional problems. Application of second-order methods, on the other hand, is impracticable because computation of objective function Hessian inverses incurs excessive computational cost. BFGS modifies gradient descent by introducing a Hessian approximation matrix computed from finite gradient differences. RES utilizes stochastic gradients in lieu of deterministic gradients both for the determination of descent directions and for the approximation of the objective function's curvature. Since stochastic gradients can be computed at manageable computational cost, RES is realizable and retains the convergence rate advantages of its deterministic counterparts. Convergence results show that lower and upper bounds on the Hessian eigenvalues of the sample functions are sufficient to guarantee convergence to optimal arguments. Numerical experiments showcase reductions in convergence time relative to stochastic gradient descent algorithms and non-regularized stochastic versions of BFGS. An application of RES to the implementation of support vector machines is developed.

  8. Regularity effect in prospective memory during aging

    Directory of Open Access Journals (Sweden)

    Geoffrey Blondelle

    2016-10-01

    Full Text Available Background: Regularity effect can affect performance in prospective memory (PM), but little is known about the cognitive processes linked to this effect. Moreover, its impact with regard to aging remains unknown. To our knowledge, this study is the first to examine the regularity effect in PM in a lifespan perspective, with a sample of young, intermediate, and older adults. Objective and design: Our study examined the regularity effect in PM in three groups of participants: 28 young adults (18–30), 16 intermediate adults (40–55), and 25 older adults (65–80). The task, adapted from the Virtual Week, was designed to manipulate the regularity of the various activities of daily life that were to be recalled (regular repeated activities vs. irregular non-repeated activities). We examined the role of several cognitive functions, including certain dimensions of executive functions (planning, inhibition, shifting), binding, short-term memory, and retrospective episodic memory, to identify those involved in PM, according to regularity and age. Results: A mixed-design ANOVA showed a main effect of task regularity and an interaction between age and regularity: an age-related difference in PM performance was found for irregular activities (older < young), but not for regular activities. All participants recalled more regular activities than irregular ones, with no age effect. It appeared that recall of regular activities only involved planning for both intermediate and older adults, while recall of irregular ones was linked to planning, inhibition, short-term memory, binding, and retrospective episodic memory. Conclusion: Taken together, our data suggest that planning capacities seem to play a major role in remembering to perform intended actions with advancing age. Furthermore, the age-PM-paradox may be attenuated when the experimental design is adapted by implementing a familiar context through the use of activities of daily living. The clinical

  9. Using the Initial Systolic Time Interval to assess cardiac autonomic function in Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Jan H. Meijer

    2011-12-01

    Full Text Available The Initial Systolic Time Interval (ISTI) has been defined as the time difference between the peak electrical and peak mechanical activity of the heart. ISTI is obtained from the electrocardiogram and the impedance cardiogram. The response of ISTI while breathing at rest and to a deep breathing stimulus was studied in a group of patients suffering from Parkinson's disease (PD) and a group of healthy control subjects. ISTI showed substantial variability during these manoeuvres. The tests showed that the variability of RR and ISTI was substantially different between PD patients and controls. It is hypothesized that in PD patients the sympathetic system compensates for the loss of regulatory control of blood pressure by the parasympathetic system. It is concluded that ISTI is a practical, additional and independent parameter that can be used to assist other tests in evaluating autonomic control of the heart in PD patients. doi:10.5617/jeb.216. J Electr Bioimp, vol. 2, pp. 98-101, 2011

  10. J-regular rings with injectivities

    OpenAIRE

    Shen, Liang

    2010-01-01

    A ring $R$ is called a J-regular ring if R/J(R) is von Neumann regular, where J(R) is the Jacobson radical of R. It is proved that if R is J-regular, then (i) R is right n-injective if and only if every homomorphism from an $n$-generated small right ideal of $R$ to $R_{R}$ can be extended to one from $R_{R}$ to $R_{R}$; (ii) R is right FP-injective if and only if R is right (J, R)-FP-injective. Some known results are improved.

  11. Regular perturbation theory for two-electron atoms

    International Nuclear Information System (INIS)

    Feranchuk, I.D.; Triguk, V.V.

    2011-01-01

    Regular perturbation theory (RPT) for the ground and excited states of two-electron atoms or ions is developed. It is shown for the first time that summation of the matrix elements from the electron-electron interaction operator over all intermediate states can be calculated in a closed form by means of the two-particle Coulomb Green's function constructed in the Letter. It is shown that the second order approximation of RPT includes the main part of the correlation energy both for the ground and excited states. This approach can be also useful for description of two-electron atoms in external fields. -- Highlights: → We develop regular perturbation theory for the two-electron atoms or ions. → We calculate the sum of the matrix elements over all intermediate states. → We construct the two-particle Coulomb Green's function.

  12. The "Chaos Theory" and nonlinear dynamics in heart rate variability analysis: does it work in short-time series in patients with coronary heart disease?

    Science.gov (United States)

    Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana

    2007-04-01

    Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). The article emphasizes the clinical and prognostic significance of dynamic changes in short-time series applied to patients with coronary heart disease (CHD) during the exercise electrocardiograph (ECG) test. The subjects were included in the series after complete cardiovascular diagnostic data. Series of R-R and ST-T intervals were obtained from exercise ECG data after digital sampling. The rescaled range analysis method determined the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of time series, as well as the unpredictability of fluctuations in time series. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04). Measures derived from chaos theory during the exercise ECG test point to multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
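
    An illustrative implementation of one of the measures named above, approximate entropy (ApEn), for a short R-R interval series; the embedding dimension m = 2 and tolerance r = 0.2 * SD are common conventions, used here as assumptions rather than the study's exact settings.

        import numpy as np

        def approximate_entropy(x, m=2, r=None):
            # Approximate entropy: lower values indicate a more regular series,
            # higher values a more complex, unpredictable one.
            x = np.asarray(x, dtype=float)
            n = len(x)
            if r is None:
                r = 0.2 * np.std(x)              # common choice of tolerance

            def phi(m):
                # all length-m templates of the series
                templates = np.array([x[i:i + m] for i in range(n - m + 1)])
                # Chebyshev distance between every pair of templates
                dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
                c = np.mean(dist <= r, axis=1)   # self-matches keep c strictly positive
                return np.mean(np.log(c))

            return phi(m) - phi(m + 1)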

  13. Boosting Maintenance in Working Memory with Temporal Regularities

    Science.gov (United States)

    Plancher, Gaën; Lévêque, Yohana; Fanuel, Lison; Piquandet, Gaëlle; Tillmann, Barbara

    2018-01-01

    Music cognition research has provided evidence for the benefit of temporally regular structures guiding attention over time. The present study investigated whether maintenance in working memory can benefit from an isochronous rhythm. Participants were asked to remember series of 6 letters for serial recall. In the rhythm condition of Experiment…

  14. Detecting regular sound changes in linguistics as events of concerted evolution.

    Science.gov (United States)

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Restricted Interval Valued Neutrosophic Sets and Restricted Interval Valued Neutrosophic Topological Spaces

    Directory of Open Access Journals (Sweden)

    Anjan Mukherjee

    2016-08-01

    Full Text Available In this paper we introduce the concept of restricted interval valued neutrosophic sets (RIVNS in short). Some basic operations and properties of RIVNS are discussed. The concept of restricted interval valued neutrosophic topology is also introduced, together with restricted interval valued neutrosophic finer and restricted interval valued neutrosophic coarser topology. We also define the restricted interval valued neutrosophic interior and closure of a restricted interval valued neutrosophic set. Some theorems and examples are cited. Restricted interval valued neutrosophic subspace topology is also studied.

  16. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang [Seoul National Univ., Seoul (Korea, Republic of)

    1997-07-15

    The objective of this study is to develop a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using the PSA method, which can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, modeling, and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system confirms its feasibility. In the second year of this study, sensitivity analyses of the component failure factors were performed on the basis of the assessment methodology of the first year, and the interaction modeling of STI and AOT was quantified. The reliability assessment methodology for the diesel generator was also reviewed and applied to the PSA code.

  17. A study on assessment methodology of surveillance test interval and Allowed Outage Time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Ryu, Yeong Woo; Cho, Jae Seon; Heo, Chang Wook; Kim, Do Hyeong; Kim, Joo Yeol; Kim, Yun Ik; Yang, Hei Chang

    1997-07-01

    The objective of this study is to develop a methodology for assessing the optimization of the Surveillance Test Interval (STI) and Allowed Outage Time (AOT) using the PSA method, which can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, modeling, and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that addresses the problems revealed in many other studies is presented, and its application to an example system confirms its feasibility. In the second year of this study, sensitivity analyses of the component failure factors were performed on the basis of the assessment methodology of the first year, and the interaction modeling of STI and AOT was quantified. The reliability assessment methodology for the diesel generator was also reviewed and applied to the PSA code.

  18. Clinical and Biological Features of Interval Colorectal Cancer

    Directory of Open Access Journals (Sweden)

    Yu Mi Lee

    2017-05-01

    Full Text Available Interval colorectal cancer (I-CRC is defined as a CRC diagnosed within 60 months after a negative colonoscopy, taking into account that 5 years is the “mean sojourn time.” It is important to prevent the development of interval cancer. The development of interval colon cancer is associated with female sex, old age, family history of CRC, comorbidities, diverticulosis, and the skill of the endoscopist. During carcinogenesis, sessile serrated adenomas/polyps (SSA/Ps share many genomic and colonic site characteristics with I-CRCs. The clinical and biological features of I-CRC should be elucidated to prevent the development of interval colon cancer.

  19. Response-rate differences in variable-interval and variable-ratio schedules: An old problem revisited

    OpenAIRE

    Cole, Mark R.

    1994-01-01

    In Experiment 1, a variable-ratio 10 schedule became, successively, a variable-interval schedule with only the minimum interreinforcement intervals yoked to the variable ratio, or a variable-interval schedule with both interreinforcement intervals and reinforced interresponse times yoked to the variable ratio. Response rates in the variable-interval schedule with both interreinforcement interval and reinforced interresponse time yoking fell between the higher rates maintained by the variable-...

  20. Iterative Regularization with Minimum-Residual Methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2007-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES; their success as regularization methods is highly problem dependent.
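
    A small sketch of the phenomenon the record describes, on assumed test data (not one of the authors' examples): for a symmetric ill-conditioned system with a noisy right-hand side, the MINRES iteration count itself acts as the regularization parameter.

        import numpy as np
        from scipy.sparse.linalg import minres

        rng = np.random.default_rng(0)
        n = 200
        Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
        eigvals = 0.9 ** np.arange(n)                   # rapidly decaying spectrum
        A = Q @ np.diag(eigvals) @ Q.T                  # symmetric, severely ill-conditioned
        x_true = np.sin(np.linspace(0.0, 3.0 * np.pi, n))
        b = A @ x_true + 1e-4 * rng.standard_normal(n)  # noisy right-hand side

        for k in (5, 20, 200):                          # few vs. many iterations
            x_k, _ = minres(A, b, maxiter=k)
            print(k, np.linalg.norm(x_k - x_true))      # error typically shows semi-convergence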

  1. Iterative regularization with minimum-residual methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2006-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES; their success as regularization methods is highly problem dependent.

  2. Closeness-Centrality-Based Synchronization Criteria for Complex Dynamical Networks With Interval Time-Varying Coupling Delays.

    Science.gov (United States)

    Park, Myeongjin; Lee, Seung-Hoon; Kwon, Oh-Min; Seuret, Alexandre

    2017-09-06

    This paper investigates synchronization in complex dynamical networks (CDNs) with interval time-varying delays. CDNs are representative of systems composed of a large number of interconnected dynamical units; for the purpose of mathematical analysis, the first step is to model them as graphs whose nodes represent the dynamical units. We take note of the importance of each node in the network: in this paper, the closeness-centrality measure from the field of social science is grafted onto the CDNs. By constructing a suitable Lyapunov-Krasovskii functional and utilizing some mathematical techniques, sufficient, closeness-centrality-based conditions for the synchronization stability of the networks are established in terms of linear matrix inequalities. Ultimately, the closeness-centrality weighting draws not only on the interconnection relation among the nodes, which was utilized in existing works, but also on additional information about the nodes, here their centrality. Moreover, to avoid the computational burden caused by the nonconvex term involving the square of the time-varying delay, this term is estimated by a convex term involving the time-varying delay. Finally, two illustrative examples are given to show the advantage of the closeness-centrality in terms of robustness to time delay.

  3. Effective action for scalar fields and generalized zeta-function regularization

    International Nuclear Information System (INIS)

    Cognola, Guido; Zerbini, Sergio

    2004-01-01

    Motivated by the study of quantum fields in a Friedmann-Robertson-Walker space-time, the one-loop effective action for a scalar field defined in the ultrastatic manifold R×H^3/Γ, H^3/Γ being the finite volume, noncompact, hyperbolic spatial section, is investigated by a generalization of zeta-function regularization. It is shown that additional divergences may appear at the one-loop level. The one-loop renormalizability of the model is discussed and, making use of a generalization of zeta-function regularization, the one-loop renormalization group equations are derived.

  4. Prevalence of and factors associated with regular khat chewing among university students in Ethiopia

    Directory of Open Access Journals (Sweden)

    Astatkie A

    2015-02-01

    Full Text Available Purpose: Khat (Catha edulis) is commonly chewed for its psychostimulant and euphorigenic effects in Africa and the Arabian Peninsula. Students use it to help them study for long hours, especially during the period of examination. However, how regularly khat is chewed among university students and its associated factors are not well documented. In this article we report on the prevalence of and factors associated with regular khat chewing among university students in Ethiopia. Methods: We did a cross-sectional study from May 20, 2014 to June 23, 2014 on a sample of 1,255 regular students recruited from all campuses of Hawassa University, southern Ethiopia. The data were collected using self-administered questionnaires. We analyzed the data to identify factors associated with current regular khat chewing using complex sample adjusted logistic regression analysis. Results: The prevalence of current regular khat chewing was 10.5% (95% confidence interval [CI]: 6.1%–14.9%). After controlling for sex, religion, year of study, having a father who chews khat, cigarette smoking and alcohol drinking in the adjusted logistic regression model, living off-campus in rented houses as compared to living in the university dormitory (adjusted odds ratio [95% CI] = 8.09 [1.56–42.01]) and having friends who chew khat (adjusted odds ratio [95% CI] = 4.62 [1.98–10.74]) were found to significantly increase the odds of current regular khat use. Conclusion: Students living outside the university campus in rented houses compared to those living in the dormitory, and those with khat-chewing peers, are more likely to chew khat regularly.

  5. Multiple graph regularized protein domain ranking.

    Science.gov (United States)

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-11-19

    Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.
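
    A simplified sketch of the multiple-graph idea (not the exact MultiG-Rank updates): ranking scores f and graph weights mu are optimized alternately, here with an entropic regularizer on mu so that the weight update has a closed-form softmax solution; lam and gamma are assumed hyperparameters.

        import numpy as np

        def multi_graph_ranking(laplacians, y, lam=1.0, gamma=1.0, n_iter=20):
            # laplacians: list of K graph Laplacians (n x n arrays), one per initial graph
            # y: length-n query/relevance vector
            K = len(laplacians)
            n = len(y)
            mu = np.full(K, 1.0 / K)                    # start from uniform graph weights
            for _ in range(n_iter):
                L = sum(m * Lk for m, Lk in zip(mu, laplacians))
                # f-step: minimize f^T L f + lam * ||f - y||^2
                f = np.linalg.solve(L + lam * np.eye(n), lam * y)
                # mu-step: minimize sum_k mu_k * (f^T L_k f) + gamma * sum_k mu_k log mu_k
                # over the probability simplex; the closed-form solution is a softmax.
                costs = np.array([f @ Lk @ f for Lk in laplacians])
                mu = np.exp(-costs / gamma)
                mu /= mu.sum()
            return f, mu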

  6. High-resolution seismic data regularization and wavefield separation

    Science.gov (United States)

    Cao, Aimin; Stump, Brian; DeShon, Heather

    2018-04-01

    We present a new algorithm, non-equispaced fast antileakage Fourier transform (NFALFT), for irregularly sampled seismic data regularization. Synthetic tests from 1-D to 5-D show that the algorithm may efficiently remove leaked energy in the frequency wavenumber domain, and its corresponding regularization process is accurate and fast. Taking advantage of the NFALFT algorithm, we suggest a new method (wavefield separation) for the detection of the Earth's inner core shear wave with irregularly distributed seismic arrays or networks. All interfering seismic phases that propagate along the minor arc are removed from the time window around the PKJKP arrival. The NFALFT algorithm is developed for seismic data, but may also be used for other irregularly sampled temporal or spatial data processing.

  7. RR-Interval variance of electrocardiogram for atrial fibrillation detection

    Science.gov (United States)

    Nuryani, N.; Solikhah, M.; Nugoho, A. S.; Afdala, A.; Anzihory, E.

    2016-11-01

    Atrial fibrillation is a serious heart problem originating from the upper chambers of the heart. A common indication of atrial fibrillation is irregularity of the R-peak-to-R-peak time interval, called the RR interval for short. The irregularity can be represented using the variance, or spread, of the RR interval. This article presents a system to detect atrial fibrillation using variances. Using clinical data of patients with atrial fibrillation attacks, it is shown that the variance of the electrocardiographic RR interval is higher during atrial fibrillation than during normal rhythm. Utilizing a simple detection technique and the variances of RR intervals, we find a good performance of atrial fibrillation detection.
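
    A minimal sketch of a variance-based detector in the spirit of the record; the segment length and threshold are illustrative assumptions that would have to be tuned on annotated clinical data.

        import numpy as np

        def detect_af_segments(rr_intervals, segment_len=32, var_threshold=0.01):
            # Flag segments of an RR-interval series (in seconds) whose variance
            # exceeds a threshold, as a crude atrial-fibrillation indicator.
            rr = np.asarray(rr_intervals, dtype=float)
            flags = []
            for start in range(0, len(rr) - segment_len + 1, segment_len):
                segment = rr[start:start + segment_len]
                flags.append(np.var(segment) > var_threshold)
            return np.array(flags)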

  8. MK-801 and memantine act differently on short-term memory tested with different time-intervals in the Morris water maze test.

    Science.gov (United States)

    Duda, Weronika; Wesierska, Malgorzata; Ostaszewski, Pawel; Vales, Karel; Nekovarova, Tereza; Stuchlik, Ales

    2016-09-15

    N-methyl-d-aspartate receptors (NMDARs) play a crucial role in spatial memory formation. In neuropharmacological studies their functioning strongly depends on testing conditions and the dosage of NMDAR antagonists. The aim of this study was to assess the immediate effects of NMDAR block by (+)MK-801 or memantine on short-term allothetic memory. Memory was tested in a working memory version of the Morris water maze test. In our version of the test, rats underwent one day of training with 8 trials, followed by three experimental days on which rats were injected intraperitoneally with a low (5 mg/kg; MeL) or high (20 mg/kg; MeH) dose of memantine, 0.1 mg/kg MK-801, or 1 ml/kg saline (SAL) 30 min before testing, for three consecutive days. On each experimental day there was just one acquisition and one test trial, with an inter-trial interval of 5 or 15 min. During training the hidden platform was relocated after each trial, and during the experiment after each day. The follow-up effect was assessed on day 9. Intact rats improved their spatial memory across the one training day. With a 5-min interval, MeH rats had longer latency than all other rats during retrieval. With a 15-min interval, MeH rats presented worse working memory (measured as the retrieval minus acquisition trial) for path than SAL and MeL rats, and for latency than MeL rats. MK-801 rats had longer latency than SAL rats during retrieval. Thus, the high dose of memantine, contrary to the low dose of MK-801, disrupts short-term memory independently of the time interval between acquisition and retrieval. This shows that short-term memory tested in a working memory version of the water maze is sensitive to several parameters, i.e., NMDA receptor antagonist type, dosage, and the time interval between learning and testing. Copyright © 2016. Published by Elsevier B.V.

  9. Call-to-balloon time dashboard in patients with ST-segment elevation myocardial infarction results in significant improvement in the logistic chain.

    Science.gov (United States)

    Hermans, Maaike P J; Velders, Matthijs A; Smeekes, Martin; Drexhage, Olivier S; Hautvast, Raymond W M; Ytsma, Timon; Schalij, Martin J; Umans, Victor A W M

    2017-08-04

    Timely reperfusion with primary percutaneous coronary intervention (pPCI) in ST-segment elevation myocardial infarction (STEMI) patients is associated with superior clinical outcomes. Aiming to reduce ischaemic time, an innovative system for home-to-hospital (H2H) time monitoring was implemented, which enabled real-time evaluation of ischaemic time intervals, regular feedback and improvements in the logistic chain. The objective of this study was to assess the results after implementation of the H2H dashboard for monitoring and evaluation of ischaemic time in STEMI patients. Ischaemic time in STEMI patients transported by emergency medical services (EMS) and treated with pPCI in the Noordwest Ziekenhuis, Alkmaar before (2008-2009; n=495) and after the implementation of the H2H dashboard (2011-2014; n=441) was compared. Median time intervals were significantly shorter in the H2H group (door-to-balloon time 32 [IQR 25-43] vs. 40 [IQR 28-55] minutes), and the H2H dashboard was independently associated with shorter time delays. Real-time monitoring and feedback on time delay with the H2H dashboard improves the logistic chain in STEMI patients, resulting in shorter ischaemic time intervals.

  10. Regularization of Instantaneous Frequency Attribute Computations

    Science.gov (United States)

    Yedlin, M. J.; Margrave, G. F.; Van Vorst, D. G.; Ben Horin, Y.

    2014-12-01

    We compare two different methods of computation of a temporally local frequency: 1) a stabilized instantaneous frequency using the theory of the analytic signal; 2) a temporally variant centroid (or dominant) frequency estimated from a time-frequency decomposition. The first method derives from Taner et al. (1979) as modified by Fomel (2007) and utilizes the derivative of the instantaneous phase of the analytic signal. The second method computes the power centroid (Cohen, 1995) of the time-frequency spectrum, obtained using either the Gabor or Stockwell transform. Common to both methods is the necessity of division by a diagonal matrix, which requires appropriate regularization. We modify Fomel's (2007) method by explicitly penalizing the roughness of the estimate. Following Farquharson and Oldenburg (2004), we employ both the L-curve and GCV methods to obtain the smoothest model that fits the data in the L2 norm. Using synthetic data, quarry blasts, earthquakes and the DPRK tests, our results suggest that the optimal method depends on the data. One of the main applications for this work is the discrimination between blast events and earthquakes. References: Fomel, Sergey. "Local seismic attributes." Geophysics 72.3 (2007): A29-A33. Cohen, Leon. "Time Frequency Analysis: Theory and Applications." USA: Prentice Hall (1995). Farquharson, Colin G., and Douglas W. Oldenburg. "A comparison of automatic techniques for estimating the regularization parameter in non-linear inverse problems." Geophysical Journal International 156.3 (2004): 411-425. Taner, M. Turhan, Fulton Koehler, and R. E. Sheriff. "Complex seismic trace analysis." Geophysics 44.6 (1979): 1041-1063.
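
    A bare-bones sketch of method (1), the analytic-signal instantaneous frequency, stabilized here with a simple moving average rather than the roughness-penalized, L-curve/GCV-regularized estimate the abstract describes; the sampling rate and smoothing length are assumptions.

        import numpy as np
        from scipy.signal import hilbert

        def instantaneous_frequency(x, fs, smooth_len=11):
            # Instantaneous frequency (Hz) from the analytic signal, with a crude
            # moving-average stabilization of the phase derivative.
            analytic = hilbert(x)                               # analytic signal via Hilbert transform
            phase = np.unwrap(np.angle(analytic))               # instantaneous phase
            inst_freq = np.diff(phase) * fs / (2.0 * np.pi)     # phase derivative -> frequency
            kernel = np.ones(smooth_len) / smooth_len
            return np.convolve(inst_freq, kernel, mode="same")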

  11. Optimization of Spacecraft Rendezvous and Docking using Interval Analysis

    NARCIS (Netherlands)

    Van Kampen, E.; Chu, Q.P.; Mulder, J.A.

    2010-01-01

    This paper applies interval optimization to the fixed-time multiple-impulse rendezvous and docking problem. Current methods for solving this type of optimization problem include, for example, genetic algorithms and gradient-based optimization. Unlike these methods, interval methods can guarantee that the global optimum is found.

  12. Multiple Feature Fusion Based on Co-Training Approach and Time Regularization for Place Classification in Wearable Video

    Directory of Open Access Journals (Sweden)

    Vladislavs Dovgalecs

    2013-01-01

    Full Text Available The analysis of video acquired with a wearable camera is a challenge that the multimedia community is facing with the proliferation of such sensors in various applications. In this paper, we focus on the problem of automatic visual place recognition in a weakly constrained environment, targeting the indexing of video streams by topological place recognition. We propose to combine several machine learning approaches in a time-regularized framework for image-based place recognition indoors. The framework combines the power of multiple visual cues and integrates the temporal continuity information of video. We extend it with a computationally efficient semi-supervised method leveraging unlabeled video sequences for improved indexing performance. The proposed approach was applied to challenging video corpora. Experiments on a public database and a real-world video sequence database show the gain brought by the different stages of the method.

  13. The in vitro antibacterial effect of iodine-potassium iodide and calcium hydroxide in infected dentinal tubules at different time intervals.

    Science.gov (United States)

    Lin, Shaul; Kfir, Anda; Laviv, Amir; Sela, Galit; Fuss, Zvi

    2009-03-01

    The aim of this study was to evaluate the antibacterial effect of iodine-potassium iodide (IKI) and calcium hydroxide (CH) on dentinal tubules infected with Enterococcus faecalis (E. faecalis) at different time intervals. Hollow cylinders of bovine root dentin (n=45) were infected and divided into three equal groups filled with either IKI or CH and a positive control. After placing each medicament in the infected cylinders for time periods of 10 minutes, 48 hours and 7 days, microbiological samples were analyzed. At the end of each period, four 100 microm thick inner dentin layers (400 microm thick from each specimen) were removed using dental burs of increasing diameters. Dentin powder was cultured on agar plates to quantitatively assess their infection, expressed in colony forming units (cfu). In all layers of the positive control group, heavy bacterial infection was observed. After 10 minutes, IKI reduced the amount of viable bacteria more efficiently than CH, whereas at later time intervals CH showed the best results. For short periods of exposure, IKI has a more efficient antibacterial effect in the dentinal tubules than CH but CH performs better after longer durations of exposure. This research indicates the use of IKI is a better choice for disinfecting the root canal than CH if only a short duration of exposure is used because of its more efficient antibacterial effect. However, if a longer exposure time is used, then CH is a better choice because of its better disinfecting effect over time.

  14. [Incidence of long (short) PR interval in electrocardiogram among healthy people in Changsha and its clinical significance].

    Science.gov (United States)

    Liu, Liping; Lin, Ping; Xu, Yi; Wu, Lijia; Zou, Runmei; Xie, Zhenwu; Wang, Cheng

    2016-04-01

    To analyze the incidence of long (short) PR interval in electrocardiogram among healthy people in Changsha and the clinical significance.
 Twelve-lead body surface electrocardiogram was taken to measure the heart rates and PR intervals from 4 025 healthy individuals (age range from 6 min after birth to 83 years old) who performed physical examination from Jan, 1993 to Dec, 2012 in the Second Xiangya Hospital, Central South University. Statistics were analyzed by SPSS 16.0.
 The total incidence of short PR interval was 19.65% (791/4 025). The age group from birth to 13 years old had a higher incidence than the other age groups (χ2=432). The total incidence of long PR intervals was 3.58% (144/4 025). The 1-year-old group had the highest incidence (6.74%), which decreased with increasing age. The lowest incidence of long PR intervals occurred in the age group from 14-17 years old, and the incidence gradually increased after 50 years old. There were no significant differences in long (short) PR intervals between genders (P>0.05).
 The incidence of long (short) PR intervals varies among different age groups of healthy people. The incidences of long (short) PR intervals in children before 10 years old are higher than those in adults, especially for short PR intervals, as a result of the heart rate being affected by childhood autonomic nervous function and the change in atrial volume with age. Adults with long (short) PR intervals should be regularly followed up to prevent cardiovascular events.

  15. Higher derivative regularization and chiral anomaly

    International Nuclear Information System (INIS)

    Nagahama, Yoshinori.

    1985-02-01

    A higher derivative regularization which automatically leads to the consistent chiral anomaly is analyzed in detail. It explicitly breaks all the local gauge symmetry but preserves global chiral symmetry and leads to the chirally symmetric consistent anomaly. This regularization thus clarifies the physics content contained in the consistent anomaly. We also briefly comment on the application of this higher derivative regularization to massless QED. (author)

  16. Electronic cigarette: a longitudinal study of regular vapers.

    Science.gov (United States)

    Etter, Jean-François

    2017-06-07

    It is unclear how vaping behaviour changes over time in regular vapers, and what occurs when vapers relapse to smoking or when they stop vaping. We assessed change in vaping and smoking behaviours over 12 months in regular vapers. A longitudinal study of 3868 regular vapers enrolled on the Internet in 2012-2015 ("baseline"), followed after one (n=1631, 42%), three (n=1337, 35%), six (n=1148, 30%) and 12 months (n=893, 23%). Participants had been vaping for a median of five months at baseline. Most (77%) were former smokers, who had not smoked for a median of three months at baseline. Over 12 months, enjoyment gradually became the most frequently cited reason to vape (93%), and vaping to deal with craving for tobacco gradually decreased (from 87% to 56%). In exclusive vapers (ex-smokers), the nicotine concentration in e-liquids decreased over time (from 12 to 9 mg/mL), but puffs/day remained stable (200 puffs/day). After 12 months, 9% of 687 former smokers relapsed to smoking and 28% of 64 daily smokers (dual users) stopped smoking. After 12 months, when participants stopped vaping, they tended to relapse to smoking (+18% daily smokers among those who stopped vaping versus -2% in permanent vapers). Rates of relapse to smoking were low in former smokers and quit rates were high in current smokers. Stopping vaping was associated with relapsing to smoking. Even in established vapers, vaping behaviour and reasons to vape change over time. This should be taken into account by clinicians, manufacturers and regulators. Results from this non-representative sample can help generate hypotheses that can later be tested in representative samples of vapers. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2012-11-19

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods.Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods.Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. 2012 Wang et al; licensee BioMed Central Ltd.

  18. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-01-01

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods.Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods.Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. 2012 Wang et al; licensee BioMed Central Ltd.

  19. Multiple graph regularized protein domain ranking

    Directory of Open Access Journals (Sweden)

    Wang Jim

    2012-11-01

    Full Text Available Abstract Background Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.

  20. Stream Processing Using Grammars and Regular Expressions

    DEFF Research Database (Denmark)

    Rasmussen, Ulrik Terp

    disambiguation. The first algorithm operates in two passes in a semi-streaming fashion, using a constant amount of working memory and an auxiliary tape storage which is written in the first pass and consumed by the second. The second algorithm is a single-pass and optimally streaming algorithm which outputs as much of the parse tree as is semantically possible based on the input prefix read so far, and resorts to buffering as many symbols as is required to resolve the next choice. Optimality is obtained by performing a PSPACE-complete pre-analysis on the regular expression. In the second part we present Kleenex, a language for expressing high-performance streaming string processing programs as regular grammars with embedded semantic actions, and its compilation to streaming string transducers with worst-case linear-time performance. Its underlying theory is based on transducer decomposition into oracle...

  1. Factors that prolong the 'postmortem interval until finding' (PMI-f) among community-dwelling elderly individuals in Japan: analysis of registration data.

    Science.gov (United States)

    Ito, Tomoko; Tamiya, Nanako; Takahashi, Hideto; Yamazaki, Kentaro; Yamamoto, Hideki; Sakano, Shoji; Kashiwagi, Masayo; Miyaishi, Satoru

    2012-01-01

    To clarify the factors affecting the 'postmortem interval until finding' (PMI-f) among elderly unexpected death cases. Cross-sectional study. All areas of Yamagata prefecture in Japan. The entering subjects were 5675 elderly cases aged ≥65 years selected from all 9002 cases of unexpected death in Yamagata prefecture between 2002 and 2007. Our final study subjects consisted of 3387 cases sampled with several criteria to assess the factors that prolong PMI-f. The outcome was the postmortem interval until finding (PMI-f), defined in this study as the time from death until the finding of the body. 'Living alone' showed the highest adjusted HR (3.73, 95% CI 3.37 to 4.13), followed by 'unnatural death' (1.50, 1.28 to 1.75), 'found at own home' (1.37, 1.22 to 1.55) and 'younger subjects' (0.99, 0.98 to 0.99). In the model including interactions with the household situation, we found that 'male subjects living alone' and 'female subjects living with family' tended to be found later. PMI-f is an effective outcome for quantitative analyses of the risk of bodies being left undiscovered. To prevent the bodies of deceased elderly individuals from being left undiscovered for a long time, it is necessary to keep regular home-based contact with elderly individuals living alone.

  2. Optimal time interval between capecitabine intake and radiotherapy in preoperative chemoradiation for locally advanced rectal cancer

    International Nuclear Information System (INIS)

    Yu, Chang Sik; Kim, Tae Won; Kim, Jong Hoon; Choi, Won Sik; Kim, Hee Cheol; Chang, Heung Moon; Ryu, Min Hee; Jang, Se Jin; Ahn, Seung Do; Lee, Sang-wook; Shin, Seong Soo; Choi, Eun Kyung; Kim, Jin Cheon

    2007-01-01

    Purpose: Capecitabine and its metabolites reach peak plasma concentrations 1 to 2 hours after a single oral administration, and concentrations rapidly decrease thereafter. We performed a retrospective analysis to find the optimal time interval between capecitabine administration and radiotherapy for rectal cancer. Methods and Materials: The time interval between capecitabine intake and radiotherapy was measured in patients who were treated with preoperative radiotherapy and concurrent capecitabine for rectal cancer. Patients were classified into the following groups. Group A1 included patients who took capecitabine 1 hour before radiotherapy, and Group B1 included all other patients. Group B1 was then subdivided into Group A2 (patients who took capecitabine 2 hours before radiotherapy) and Group B2. Group B2 was further divided into Group A3 and Group B3 with the same method. Total mesorectal excision was performed 6 weeks after completion of chemoradiation and the pathologic response was evaluated. Results: A total of 200 patients were enrolled in this study. Pathologic examination showed that Group A1 had higher rates of complete regression of primary tumors in the rectum (23.5% vs. 9.6%, p = 0.01), good response (44.7% vs. 25.2%, p = 0.006), and lower T stages (p = 0.021) compared with Group B1; however, Groups A2 and A3 did not show any improvement compared with Groups B2 and B3. Multivariate analysis showed that the increases in complete regression of primary tumors in the rectum and in good response were significant only when capecitabine was administered 1 hour before radiotherapy. Conclusion: In preoperative chemoradiotherapy for rectal cancer, the pathologic response could be improved by administering capecitabine 1 hour before radiotherapy.

  3. Work and family life of childrearing women workers in Japan: comparison of non-regular employees with short working hours, non-regular employees with long working hours, and regular employees.

    Science.gov (United States)

    Seto, Masako; Morimoto, Kanehisa; Maruyama, Soichiro

    2006-05-01

    This study assessed the working and family life characteristics, and the degree of domestic and work strain of female workers with different employment statuses and weekly working hours who are rearing children. Participants were the mothers of preschoolers in a large Japanese city. We classified the women into three groups according to the hours they worked and their employment conditions. The three groups were: non-regular employees working less than 30 h a week (n=136); non-regular employees working 30 h or more per week (n=141); and regular employees working 30 h or more a week (n=184). We compared among the groups the subjective values of work, financial difficulties, childcare and housework burdens, psychological effects, and strains such as work and family strain, work-family conflict, and work dissatisfaction. Regular employees were more likely to report job pressures and inflexible work schedules and to experience more strain related to work and family than non-regular employees. Non-regular employees were more likely to be facing financial difficulties. In particular, non-regular employees working longer hours tended to encounter socioeconomic difficulties and often lacked support from family and friends. Female workers with children may have different social backgrounds and different stressors according to their working hours and work status.

  4. Regularized inversion of controlled source and earthquake data

    International Nuclear Information System (INIS)

    Ramachandran, Kumar

    2012-01-01

    Estimation of the seismic velocity structure of the Earth's crust and upper mantle from travel-time data has advanced greatly in recent years. Forward modelling trial-and-error methods have been superseded by tomographic methods which allow more objective analysis of large two-dimensional and three-dimensional refraction and/or reflection data sets. The fundamental purpose of travel-time tomography is to determine the velocity structure of a medium by analysing the time it takes for a wave generated at a source point within the medium to arrive at a distribution of receiver points. Tomographic inversion of first-arrival travel-time data is a nonlinear problem since both the velocity of the medium and ray paths in the medium are unknown. The solution for such a problem is typically obtained by repeated application of linearized inversion. Regularization of the nonlinear problem reduces the ill posedness inherent in the tomographic inversion due to the under-determined nature of the problem and the inconsistencies in the observed data. This paper discusses the theory of regularized inversion for joint inversion of controlled source and earthquake data, and results from synthetic data testing and application to real data. The results obtained from tomographic inversion of synthetic data and real data from the northern Cascadia subduction zone show that the velocity model and hypocentral parameters can be efficiently estimated using this approach. (paper)
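    The regularized linearized inversion idea can be sketched in a few lines. The code below applies a Tikhonov-style smoothness penalty to a toy linear travel-time problem; it is not the author's joint controlled-source/earthquake code, and the roughness operator, damping value and synthetic geometry are illustrative assumptions.

```python
import numpy as np

def second_difference(n):
    """Roughness operator D: penalizes curvature of the slowness model."""
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return D

def regularized_step(G, residual, lam):
    """One smoothness-regularized linearized update:
       minimize ||G dm - residual||^2 + lam^2 ||D dm||^2 over the model update dm."""
    D = second_difference(G.shape[1])
    A = G.T @ G + lam ** 2 * (D.T @ D)
    return np.linalg.solve(A, G.T @ residual)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_cells, n_rays = 60, 40                               # under-determined toy problem
    true_slowness = 0.5 + 0.1 * np.sin(np.linspace(0, 3 * np.pi, n_cells))
    G = rng.uniform(0.0, 1.0, size=(n_rays, n_cells))      # toy ray-path length matrix
    t_obs = G @ true_slowness + rng.normal(0, 0.01, n_rays)

    model = np.full(n_cells, 0.5)                          # starting model
    for it in range(5):
        # With a linear toy forward model this loop amounts to iterated Tikhonov;
        # in real tomography G (the ray paths) would be recomputed from `model`.
        residual = t_obs - G @ model
        model += regularized_step(G, residual, lam=2.0)
        err = np.linalg.norm(model - true_slowness) / np.linalg.norm(true_slowness)
        print(f"iteration {it}: relative model error = {err:.3f}")
```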

  5. A dynamical regularization algorithm for solving inverse source problems of elliptic partial differential equations

    Science.gov (United States)

    Zhang, Ye; Gong, Rongfang; Cheng, Xiaoliang; Gulliksson, Mårten

    2018-06-01

    This study considers the inverse source problem for elliptic partial differential equations with both Dirichlet and Neumann boundary data. The unknown source term is to be determined by additional boundary conditions. Unlike the existing methods found in the literature, which usually employ the first-order in time gradient-like system (such as the steepest descent methods) for numerically solving the regularized optimization problem with a fixed regularization parameter, we propose a novel method with a second-order in time dissipative gradient-like system and a dynamical selected regularization parameter. A damped symplectic scheme is proposed for the numerical solution. Theoretical analysis is given for both the continuous model and the numerical algorithm. Several numerical examples are provided to show the robustness of the proposed algorithm.
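    A minimal sketch of a second-order in time, dissipative gradient-like flow is given below for a generic ill-posed linear system, with early stopping standing in for the dynamically selected regularization parameter. It is not the authors' damped symplectic scheme for the elliptic source problem; the damping constant, step size and stopping rule are assumptions made for the sketch.

```python
import numpy as np

def second_order_flow(A, b, eta=1.0, dt=0.05, noise_level=0.01, max_steps=20000):
    """Damped second-order gradient flow  x'' + eta*x' = -A^T (A x - b),
       discretized with a simple semi-implicit step. Stopping once the data are
       fit to the noise level (or the step budget is exhausted) acts as the
       regularization."""
    x = np.zeros(A.shape[1])
    v = np.zeros_like(x)
    for step in range(max_steps):
        grad = A.T @ (A @ x - b)
        v = (v - dt * grad) / (1.0 + dt * eta)     # damped velocity update
        x = x + dt * v
        if np.linalg.norm(A @ x - b) <= 1.1 * noise_level * np.sqrt(len(b)):
            break                                   # discrepancy-style stopping
    return x, step

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Mildly ill-conditioned toy system standing in for the discretized source problem
    U, _ = np.linalg.qr(rng.normal(size=(60, 60)))
    A = U @ np.diag(np.logspace(0, -2, 60)) @ U.T
    x_true = np.sin(np.linspace(0, np.pi, 60))
    b = A @ x_true + rng.normal(0, 0.01, 60)
    x_rec, steps = second_order_flow(A, b)
    rel_err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
    print(f"stopped after {steps} steps, relative error = {rel_err:.3f}")
```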

  6. Geometric regularizations and dual conifold transitions

    International Nuclear Information System (INIS)

    Landsteiner, Karl; Lazaroiu, Calin I.

    2003-01-01

    We consider a geometric regularization for the class of conifold transitions relating D-brane systems on noncompact Calabi-Yau spaces to certain flux backgrounds. This regularization respects the SL(2,Z) invariance of the flux superpotential, and allows for computation of the relevant periods through the method of Picard-Fuchs equations. The regularized geometry is a noncompact Calabi-Yau which can be viewed as a monodromic fibration, with the nontrivial monodromy being induced by the regulator. It reduces to the original, non-monodromic background when the regulator is removed. Using this regularization, we discuss the simple case of the local conifold, and show how the relevant field-theoretic information can be extracted in this approach. (author)

  7. The incidence and clinical associated factors of interval colorectal cancers in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Cheng-En Tsai

    2018-03-01

    Conclusion: The prevalence of interval CRC in the present study is 3.28%. Comorbidity with ESRD and a shorter ascending colon withdrawal time could be factors associated with interval CRC. Good colon preparation for patients with ESRD and a longer ascending colon withdrawal time could reduce interval CRC.

  8. Adaptive regularization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool based on asymptotic sampling theory, for iterative estimation of weight decay parameters. The basic idea is to do a gradient desce...

  9. Regularizing portfolio optimization

    International Nuclear Information System (INIS)

    Still, Susanne; Kondor, Imre

    2010-01-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.

  10. Regularizing portfolio optimization

    Science.gov (United States)

    Still, Susanne; Kondor, Imre

    2010-07-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.
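    The role of the L2 penalty as a diversification pressure can be illustrated with a closed-form toy example. For simplicity the sketch below uses the portfolio variance rather than the expected shortfall as the risk measure, so it is not the SVR-based formulation of the paper; the ridge parameter and the simulated return data are assumptions made for the illustration.

```python
import numpy as np

def min_variance_weights(returns, lam=0.0):
    """Closed-form minimum-variance portfolio with an L2 (ridge) penalty:
       minimize  w' C w + lam * ||w||^2   subject to  sum(w) = 1."""
    C = np.cov(returns, rowvar=False) + lam * np.eye(returns.shape[1])
    ones = np.ones(returns.shape[1])
    w = np.linalg.solve(C, ones)
    return w / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n_assets, n_obs = 50, 60                  # few observations relative to assets
    true_cov = 0.0004 * (np.eye(n_assets) + 0.3)
    sample = rng.multivariate_normal(np.zeros(n_assets), true_cov, size=n_obs)

    w_plain = min_variance_weights(sample, lam=0.0)
    w_reg = min_variance_weights(sample, lam=5e-4)   # lam is on the scale of the variances

    for name, w in [("unregularized", w_plain), ("L2-regularized", w_reg)]:
        print(f"{name:>15}: max |weight| = {np.abs(w).max():.2f}, "
              f"true risk = {np.sqrt(w @ true_cov @ w):.4f}")
```

    With so few observations the unregularized weights are typically large and erratic, while the penalized solution is spread over many assets and carries a lower risk under the true covariance, which is the stabilizing effect described in the abstract.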

  11. Distortion of time interval reproduction in an epileptic patient with a focal lesion in the right anterior insular/inferior frontal cortices.

    Science.gov (United States)

    Monfort, Vincent; Pfeuty, Micha; Klein, Madelyne; Collé, Steffie; Brissart, Hélène; Jonas, Jacques; Maillard, Louis

    2014-11-01

    This case report on an epileptic patient suffering from a focal lesion at the junction of the right anterior insular cortex (AIC) and the adjacent inferior frontal cortex (IFC) provides the first evidence that damage to this brain region impairs temporal performance in a visual time reproduction task in which participants had to reproduce the presentation duration (3, 5 and 7s) of emotionally-neutral and -negative pictures. Strikingly, as compared to a group of healthy subjects, the AIC/IFC case considerably overestimated reproduction times despite normal variability. The effect was obtained in all duration and emotion conditions. Such a distortion in time reproduction was not observed in four other epileptic patients without insular or inferior frontal damage. Importantly, the absolute extent of temporal over-reproduction increased in proportion to the magnitude of the target durations, which concurs with the scalar property of interval timing, and points to an impairment of time-specific rather than of non temporal (such as motor) mechanisms. Our data suggest that the disability in temporal reproduction of the AIC/IFC case would result from a distorted memory representation of the encoded duration, occurring during the process of storage and/or of recovery from memory and leading to a deviation of the temporal judgment during the reproduction task. These findings support the recent proposal that the anterior insular/inferior frontal cortices would be involved in time interval representation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Tessellating the Sphere with Regular Polygons

    Science.gov (United States)

    Soto-Johnson, Hortensia; Bechthold, Dawn

    2004-01-01

    Tessellations in the Euclidean plane and regular polygons that tessellate the sphere are reviewed. The regular polygons that can possibly tessellate the sphere are spherical triangles, squares and pentagons.

  13. Study on risk insight for additional ILRT interval extension

    International Nuclear Information System (INIS)

    Seo, M. R.; Hong, S. Y.; Kim, M. K.; Chung, B. S.; Oh, H. C.

    2005-01-01

    In the U.S., the containment Integrated Leakage Rate Test (ILRT) interval was extended from 3 times per 10 years to once per 10 years in 1995, based on NUREG-1493, 'Performance-Based Containment Leak-Test Program'. In September 2001, the ILRT interval was extended up to once per 15 years based on the Nuclear Energy Industry (NEI) provisional guidance 'Interim Guidance for Performing Risk Impact Assessments In Support of One-Time Extensions for Containment Integrated Leakage Rate Test Surveillance Intervals'. In Korea, the containment ILRT was performed at a 5-year interval. However, under MOST (Ministry of Science and Technology) Notice 2004-15, 'Standard for the Leak-Rate Test of the Nuclear Reactor Containment', extension of the ILRT interval to once per 10 years can be allowed if certain conditions are met. Accordingly, the safety analysis for extending the Yonggwang Nuclear (YGN) Unit 1 and 2 ILRT interval to once per 10 years was completed based on the methodology in NUREG-1493. However, during the review process by the regulatory body, KINS, it was required that additional risk insights or indices for the risk analysis be developed. We therefore began to study the NEI interim report for the 15-year ILRT interval extension. As in the previous analysis based on NUREG-1493, the MACCS II (MELCOR Accident Consequence Code System) computer code was used for the risk analysis of the population, and the population dose was selected as a reference index for the risk evaluation.

  14. Regular and promotional sales in new product life-cycle: A competitive approach

    OpenAIRE

    Guidolin, Mariangela; Guseo, Renato; Mortarino, Cinzia

    2016-01-01

    In this paper, we consider the application of the Lotka-Volterra model with churn effects, LVch, (Guidolin and Guseo, 2015) to the case of a confectionary product produced in Italy and recently commercialized in a European country. Weekly time series, referring separately to quantities of regular and promotional sales, are available. Their joint inspection highlighted the presence of compensatory dynamics suggesting the study with the LVch to estimate whether competition between regular and p...

  15. Regular exercise behaviour and intention and symptoms of anxiety and depression in coronary heart disease patients across Europe: Results from the EUROASPIRE III survey.

    Science.gov (United States)

    Prugger, Christof; Wellmann, Jürgen; Heidrich, Jan; De Bacquer, Dirk; De Smedt, Delphine; De Backer, Guy; Reiner, Željko; Empana, Jean-Philippe; Fras, Zlatko; Gaita, Dan; Jennings, Catriona; Kotseva, Kornelia; Wood, David; Keil, Ulrich

    2017-01-01

    Regular exercise lowers the risk of cardiovascular death in coronary heart disease (CHD) patients. We aimed to investigate regular exercise behaviour and intention in relation to symptoms of anxiety and depression in CHD patients across Europe. This study was based on a multicentre cross-sectional survey. In the EUROpean Action on Secondary and Primary Prevention through Intervention to Reduce Events (EUROASPIRE) III survey, 8966 CHD patients were included; whether patients exercised or intended to exercise regularly was assessed using the Stages of Change questionnaire in 8330 patients. Symptoms of anxiety and depression were evaluated using the Hospital Anxiety and Depression Scale. Total physical activity was measured by the International Physical Activity Questionnaire in patients from a subset of 14 countries. Overall, 50.3% of patients were not intending to exercise regularly, 15.9% were intending to exercise regularly, and 33.8% were exercising regularly. Patients with severe symptoms of depression less frequently exercised regularly than patients with symptoms in the normal range (20.2%, 95% confidence interval (CI) 14.8-26.8 vs 36.7%, 95% CI 29.8-44.2). Among patients not exercising regularly, patients with severe symptoms of depression were less likely to have an intention to exercise regularly (odds ratio 0.62, 95% CI 0.46-0.85). Symptoms of anxiety did not affect regular exercise intention. In sensitivity analysis, results were consistent when adjusting for total physical activity. Lower frequency of regular exercise and decreased likelihood of exercise intention were observed in CHD patients with severe depressive symptoms. Severe symptoms of depression may preclude CHD patients from performing regular exercise. © The European Society of Cardiology 2016.

  16. Accretion onto some well-known regular black holes

    International Nuclear Information System (INIS)

    Jawad, Abdul; Shahzad, M.U.

    2016-01-01

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sftesos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes. (orig.)

  17. Accretion onto some well-known regular black holes

    Energy Technology Data Exchange (ETDEWEB)

    Jawad, Abdul; Shahzad, M.U. [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan)

    2016-03-15

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sftesos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes. (orig.)

  18. Accretion onto some well-known regular black holes

    Science.gov (United States)

    Jawad, Abdul; Shahzad, M. Umair

    2016-03-01

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sftesos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes.

  19. Regular rehearsal helps in consolidation of long term memory.

    Science.gov (United States)

    Parle, Milind; Singh, Nirmal; Vasudevan, Mani

    2006-01-01

    Memory, one of the most complex functions of the brain, comprises multiple components such as perception, registration, consolidation, storage, retrieval and decay. The present study was undertaken to evaluate the impact of different training sessions on the retention capacity of rats. The capacity to retain a learnt task was measured using exteroceptive behavioral models such as the hexagonal swimming pool apparatus, the Hebb-Williams maze and the elevated plus-maze. A total of 150 rats divided into fifteen groups were employed in the present study. The animals were subjected to different training sessions during the first three days. The ability to retain the learned task was tested after single, sub-acute, acute, sub-chronic and chronic exposure to the above exteroceptive memory models in separate groups of animals. The memory score of all animals was recorded 72 h, 192 h and 432 h after their last training trial. Rats of the single-exposure group did not show any effect on memory. Sub-acute training group animals showed improved memory up to 72 h only, whereas in the acute and sub-chronic training groups this memory improvement was extended up to 192 h. The rats subjected to chronic exposures showed a significant improvement in retention capacity that lasted up to a period of eighteen days. These observations suggest that repeated rehearsals at regular intervals are probably necessary for consolidation of long-term memory. It was observed that sub-acute, acute and sub-chronic exposures improved the retrieval ability of rats, but this memory-improving effect was short lived. Thus, rehearsal or training plays a crucial role in enhancing one's capacity of retaining the learnt information. Key points: the present study underlines the importance of regular rehearsals in enhancing one's capacity of retaining the learnt information; sub-acute, acute and sub-chronic rehearsals result in storing of information for a limited period of time; quick decay of information or

  20. Regularizing Feynman path integrals using the generalized Kontsevich-Vishik trace

    Science.gov (United States)

    Hartung, Tobias

    2017-12-01

    A fully regulated definition of Feynman's path integral is presented here. The proposed re-formulation of the path integral coincides with the familiar formulation whenever the path integral is well defined. In particular, it is consistent with respect to lattice formulations and Wick rotations, i.e., it can be used in Euclidean and Minkowski space-time. The path integral regularization is introduced through the generalized Kontsevich-Vishik trace, that is, the extension of the classical trace to Fourier integral operators. Physically, we are replacing the time-evolution semi-group by a holomorphic family of operators such that the corresponding path integrals are well defined in some half space of C . The regularized path integral is, thus, defined through analytic continuation. This regularization can be performed by means of stationary phase approximation or computed analytically depending only on the Hamiltonian and the observable (i.e., known a priori). In either case, the computational effort to evaluate path integrals or expectations of observables reduces to the evaluation of integrals over spheres. Furthermore, computations can be performed directly in the continuum and applications (analytic computations and their implementations) to a number of models including the non-trivial cases of the massive Schwinger model and a φ4 theory.

  1. Diagrammatic methods in phase-space regularization

    International Nuclear Information System (INIS)

    Bern, Z.; Halpern, M.B.; California Univ., Berkeley

    1987-11-01

    Using the scalar prototype and gauge theory as the simplest possible examples, diagrammatic methods are developed for the recently proposed phase-space form of continuum regularization. A number of one-loop and all-order applications are given, including general diagrammatic discussions of the nogrowth theorem and the uniqueness of the phase-space stochastic calculus. The approach also generates an alternate derivation of the equivalence of the large-β phase-space regularization to the more conventional coordinate-space regularization. (orig.)

  2. Programming with Intervals

    Science.gov (United States)

    Matsakis, Nicholas D.; Gross, Thomas R.

    Intervals are a new, higher-level primitive for parallel programming with which programmers directly construct the program schedule. Programs using intervals can be statically analyzed to ensure that they do not deadlock or contain data races. In this paper, we demonstrate the flexibility of intervals by showing how to use them to emulate common parallel control-flow constructs like barriers and signals, as well as higher-level patterns such as bounded-buffer producer-consumer. We have implemented intervals as a publicly available library for Java and Scala.

  3. Factors associated with regular consumption of obesogenic foods: National School-Based Student Health Survey, 2012

    Directory of Open Access Journals (Sweden)

    Giovana LONGO-SILVA

    Full Text Available ABSTRACT Objective: To investigate the frequency of consumption of obesogenic foods among adolescents and its association with sociodemographic, family, behavioral, and environmental variables. Methods: Secondary data from the National School-Based Student Health Survey were analyzed from a representative sample of 9th grade Brazilian students (high school). A self-administered questionnaire, organized into thematic blocks, was used. The dependent variables were the consumption of deep fried snacks, packaged snacks, sugar candies, and soft drinks; consumption frequency for the seven days preceding the study was analyzed. Bivariate analysis was carried out to determine the empirical relationship between the regular consumption of these foods (≥3 days/week) and sociodemographic, family, behavioral, and school structural variables. A p-value <0.20 was used as the criterion for initial inclusion in the multivariate logistic analysis, which was conducted using the "Enter" method, and the results were expressed as adjusted odds ratios with 95% confidence intervals, with p<0.05 indicating statistical significance. Results: Regular food consumption ranged from 27.17% to 65.96%. The variables female gender, mobile phone ownership, Internet access at home, tobacco use, alcohol consumption, regular physical activity, eating while watching television or studying, watching television for at least 2 hours a day, and not willing to lose weight remained associated in the final logistic models for all foods analyzed. Conclusion: It was concluded that fried snacks, packaged snacks, sugar candies, and soft drinks are regularly consumed by adolescents and that such consumption was associated with the sociodemographic, family, behavioral, and school structural variables.

  4. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis

    International Nuclear Information System (INIS)

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro

    2004-01-01

    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive α-α decay events on the millisecond time-scale. Such decay events are part of the ²²⁰Rn → ²¹⁶Po (T₁/₂ = 145 ms) (Th series) and ²¹⁹Rn → ²¹⁵Po (T₁/₂ = 1.78 ms) (Ac series). By using TIA in addition to measurement of ²²⁶Ra (U series) from α-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject β-pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N₂ gas. The U- and Th-series together with the Ac-series were determined, respectively, from alpha spectra and TIA carried out immediately after Ra-extraction. Using the ²²¹Fr → ²¹⁷At (T₁/₂ = 32.3 ms) decay process as a tracer, overall yields were estimated from application of TIA to ²²⁵Ra (Np decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples. (orig.)
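    A delayed-coincidence analysis of this kind can be sketched in a few lines of code. The example below generates a synthetic event list containing correlated parent-daughter pairs, gates on short successive intervals and estimates the half-life from the gated intervals. It is a simplified illustration of the TIA idea, not the authors' analysis chain; all rates and the gate length are assumptions.

```python
import numpy as np

def gated_intervals(times, window):
    """Intervals between each event and its immediate successor that fall inside
       a delayed-coincidence gate of length `window` (in seconds)."""
    dt = np.diff(np.sort(times))
    return dt[dt < window]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    half_life = 0.145                                   # s, e.g. the Th-series daughter
    tau = half_life / np.log(2)

    # Toy event list: sparse uncorrelated background plus parent-daughter pairs
    background = np.cumsum(rng.exponential(100.0, size=200))    # ~0.01 counts/s
    parents = np.sort(rng.uniform(0, background[-1], size=300))
    daughters = parents + rng.exponential(tau, size=parents.size)
    times = np.concatenate([background, parents, daughters])

    intervals = gated_intervals(times, window=1.0)              # 1 s gate
    # Crude mean-life estimate from the gated intervals; chance coincidences and
    # the finite gate length bias this slightly and are ignored here.
    print(f"events in gate: {intervals.size}")
    print(f"estimated half-life ≈ {np.log(2) * intervals.mean():.3f} s "
          f"(true value {half_life} s)")
```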

  5. The Interval Slope Method for Long-Term Forecasting of Stock Price Trends

    Directory of Open Access Journals (Sweden)

    Chun-xue Nie

    2016-01-01

    Full Text Available A stock price is a typical but complex type of time series data. Effective prediction of long-term time series data can be used to schedule an investment strategy and obtain higher profit. Due to economic, environmental, and other factors, it is very difficult to obtain a precise long-term stock price prediction. The exponentially segmented pattern (ESP) is introduced here and used to predict the fluctuation of different stock data over five future prediction intervals. The new feature of stock pricing during the subinterval, named the interval slope, can characterize fluctuations in stock price over specific periods. The cumulative distribution function (CDF) of MSE was compared to those of MMSE-BC and SVR. We concluded that the interval slope developed here can capture more complex dynamics of stock price trends. The mean stock price can then be predicted over specific time intervals relatively accurately, in which multiple mean values over time intervals are used to express the time series in the long term. In this way, the prediction of long-term stock price can be more precise and prevent the development of cumulative errors.
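    The interval-slope feature itself is straightforward to compute. The sketch below fits a straight line to each consecutive subinterval of a toy price series and carries the last slope forward as a naive forecast; the ESP segmentation and the prediction models compared in the paper are not reproduced, and the interval length and synthetic series are assumptions.

```python
import numpy as np

def interval_slopes(prices, interval_len):
    """Split a price series into consecutive intervals and fit a straight line to
       each; the fitted slope characterizes the trend within that interval."""
    n = len(prices) // interval_len
    slopes, means = [], []
    t = np.arange(interval_len)
    for i in range(n):
        seg = prices[i * interval_len:(i + 1) * interval_len]
        slope, _intercept = np.polyfit(t, seg, 1)
        slopes.append(slope)
        means.append(seg.mean())
    return np.array(slopes), np.array(means)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    days = np.arange(500)
    # Toy "stock price": a slow trend plus a seasonal swing and noise
    price = 100 + 0.05 * days + 5 * np.sin(days / 40) + rng.normal(0, 1, days.size)

    slopes, means = interval_slopes(price, interval_len=20)
    # Naive long-term forecast: carry the last interval's slope forward
    forecast_next_mean = means[-1] + slopes[-1] * 20
    print("last interval slope:", round(slopes[-1], 3))
    print("predicted mean price for the next interval:", round(forecast_next_mean, 2))
```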

  6. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors in interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). To provide a review related to reproducibility of event-based and time-interval measurement, and to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement; in addition, to determine if it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements with an established criterion), intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be

  7. The persistence of the attentional bias to regularities in a changing environment.

    Science.gov (United States)

    Yu, Ru Qi; Zhao, Jiaying

    2015-10-01

    The environment often is stable, but some aspects may change over time. The challenge for the visual system is to discover and flexibly adapt to the changes. We examined how attention is shifted in the presence of changes in the underlying structure of the environment. In six experiments, observers viewed four simultaneous streams of objects while performing a visual search task. In the first half of each experiment, the stream in the structured location contained regularities, the shapes in the random location were randomized, and gray squares appeared in two neutral locations. In the second half, the stream in the structured or the random location may change. In the first half of all experiments, visual search was facilitated in the structured location, suggesting that attention was consistently biased toward regularities. In the second half, this bias persisted in the structured location when no change occurred (Experiment 1), when the regularities were removed (Experiment 2), or when new regularities embedded in the original or novel stimuli emerged in the previously random location (Experiments 3 and 6). However, visual search was numerically but no longer reliably faster in the structured location when the initial regularities were removed and new regularities were introduced in the previously random location (Experiment 4), or when novel random stimuli appeared in the random location (Experiment 5). This suggests that the attentional bias was weakened. Overall, the results demonstrate that the attentional bias to regularities was persistent but also sensitive to changes in the environment.

  8. Measurement of subcritical multiplication by the interval distribution method

    International Nuclear Information System (INIS)

    Nelson, G.W.

    1985-01-01

    The prompt decay constant or the subcritical neutron multiplication may be determined by measuring the distribution of the time intervals between successive neutron counts. The distribution data is analyzed by least-squares fitting to a theoretical distribution function derived from a point reactor probability model. Published results of measurements with one- and two-detector systems are discussed. Data collection times are shorter, and statistical errors are smaller the nearer the system is to delayed critical. Several of the measurements indicate that a shorter data collection time and higher accuracy are possible with the interval distribution method than with the Feynman variance method
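    The least-squares fitting step can be illustrated on synthetic data. The sketch below histograms simulated time intervals between successive counts and fits a two-exponential curve as a stand-in for the theoretical distribution; the actual point-reactor interval distribution used in the paper is more involved, and the rates, binning and starting values here are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def interval_model(t, a, alpha, b, r):
    """Two-exponential stand-in for a theoretical interval distribution: a
       correlated (chain) component decaying with the prompt constant `alpha`
       on top of an uncorrelated component governed by the count rate `r`."""
    return a * np.exp(-alpha * t) + b * np.exp(-r * t)

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    # Synthetic intervals: a mixture of "correlated" short intervals and random ones
    correlated = rng.exponential(1 / 250.0, size=4000)    # prompt decay constant ~250 1/s
    random_bg = rng.exponential(1 / 20.0, size=16000)     # background count rate ~20 1/s
    intervals = np.concatenate([correlated, random_bg])

    counts, edges = np.histogram(intervals, bins=200, range=(0, 0.2))
    centers = 0.5 * (edges[:-1] + edges[1:])
    popt, _ = curve_fit(interval_model, centers, counts,
                        p0=(counts.max(), 100.0, counts.max() / 2, 10.0))
    print("fitted prompt decay constant ≈ %.1f 1/s" % max(popt[1], popt[3]))
```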

  9. First Passage Time Intervals of Gaussian Processes

    Science.gov (United States)

    Perez, Hector; Kawabata, Tsutomu; Mimaki, Tadashi

    1987-08-01

    The first passage time problem of a stationary Gaussian process is theoretically and experimentally studied. Renewal functions are derived for a time-dependent boundary and numerically calculated for a Gaussian process having a seventh-order Butterworth spectrum. The results show a multipeak property not only for the constant boundary but also for a linearly increasing boundary. The first passage time distribution densities were experimentally determined for a constant boundary. The renewal functions were shown to be a fairly good approximation to the distribution density over a limited range.

  10. Improved resolution and reliability in dynamic PET using Bayesian regularization of MRTM2

    DEFF Research Database (Denmark)

    Agn, Mikael; Svarer, Claus; Frokjaer, Vibe G.

    2014-01-01

    This paper presents a mathematical model that regularizes dynamic PET data by using a Bayesian framework. We base the model on the well known two-parameter multilinear reference tissue method MRTM2 and regularize on the assumption that spatially close regions have similar parameters. The developed...... model is compared to the conventional approach of improving the low signal-to-noise ratio of PET data, i.e., spatial filtering of each time frame independently by a Gaussian kernel. We show that the model handles high levels of noise better than the conventional approach, while at the same time...

  11. Rhythmic regularity revisited : Is beat induction indeed pre-attentive?

    NARCIS (Netherlands)

    Bouwer, F.; Honing, H.; Cambouropoulos, E.; Tsougras, C.; Mavromatis, P.; Pastiadis, K.

    2012-01-01

    When listening to musical rhythm, regularity in time is often perceived in the form of a beat or pulse. External rhythmic events can give rise to the perception of a beat, through a process known as beat induction. In addition, internal processes, like long-term memory, working memory and automatic

  12. Interpregnancy interval and risk of autistic disorder.

    Science.gov (United States)

    Gunnes, Nina; Surén, Pål; Bresnahan, Michaeline; Hornig, Mady; Lie, Kari Kveim; Lipkin, W Ian; Magnus, Per; Nilsen, Roy Miodini; Reichborn-Kjennerud, Ted; Schjølberg, Synnve; Susser, Ezra Saul; Øyen, Anne-Siri; Stoltenberg, Camilla

    2013-11-01

    A recent California study reported increased risk of autistic disorder in children conceived within a year after the birth of a sibling. We assessed the association between interpregnancy interval and risk of autistic disorder using nationwide registry data on pairs of singleton full siblings born in Norway. We defined interpregnancy interval as the time from birth of the first-born child to conception of the second-born child in a sibship. The outcome of interest was autistic disorder in the second-born child. Analyses were restricted to sibships in which the second-born child was born in 1990-2004. Odds ratios (ORs) were estimated by fitting ordinary logistic models and logistic generalized additive models. The study sample included 223,476 singleton full-sibling pairs. In sibships with interpregnancy intervals autistic disorder, compared with 0.13% in the reference category (≥ 36 months). For interpregnancy intervals shorter than 9 months, the adjusted OR of autistic disorder in the second-born child was 2.18 (95% confidence interval 1.42-3.26). The risk of autistic disorder in the second-born child was also increased for interpregnancy intervals of 9-11 months in the adjusted analysis (OR = 1.71 [95% CI = 1.07-2.64]). Consistent with a previous report from California, interpregnancy intervals shorter than 1 year were associated with increased risk of autistic disorder in the second-born child. A possible explanation is depletion of micronutrients in mothers with closely spaced pregnancies.

  13. Effect of palady and cup feeding on premature neonates' weight gain and reaching full oral feeding time interval

    Directory of Open Access Journals (Sweden)

    Maryam Marofi

    2016-01-01

    Full Text Available Background: Premature neonates' feeding is of great importance due to its effective role in their growth. These neonates should reach an independent oral nutrition stage before being discharged from the Neonatal Intensive Care Unit. Therefore, the researcher decided to conduct a study on the effect of palady and cup feeding on premature neonates' weight gain and their time interval to reach full oral feeding. Materials and Methods: This is a clinical trial with a quantitative design conducted on 69 premature infants (gestational age between 29 and 32 weeks) who were assigned to cup (n = 34) and palady (n = 35) feeding groups through random allocation. The first feeding was administered either by cup or by the palady method in each shift within seven sequential days (a total of 21 cup and palady feedings). Then, the rest of the feeding was administered by gavage. Results: Mean hospitalization time (cup = 39.01 and palady = 30.4; P < 0.001) and mean time interval to reach full oral feeding (cup = 33.7 and palady = 24.1; P < 0.001) were significantly lower in the palady group compared to the cup group. Mean weight changes of neonates 7 weeks after the intervention, compared to those at the beginning of the intervention, were significantly greater in the palady group compared to the cup group (cup = 146.7 and palady = 198.8; P < 0.001). Conclusions: The neonates in the palady group reached full oral feeding earlier than those of the cup group. Subjects' weight gain was also higher in the palady group compared to the cup group. Premature neonates with over 30 weeks of gestational age and physiological stability can be fed by palady.

  14. The uniqueness of the regularization procedure

    International Nuclear Information System (INIS)

    Brzezowski, S.

    1981-01-01

    On the grounds of the BPHZ procedure, the criteria of correct regularization in perturbation calculations of QFT are given, together with the prescription for dividing the regularized formulas into the finite and infinite parts. (author)

  15. 5 CFR 551.421 - Regular working hours.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Regular working hours. 551.421 Section... Activities § 551.421 Regular working hours. (a) Under the Act there is no requirement that a Federal employee... distinction based on whether the activity is performed by an employee during regular working hours or outside...

  16. Regular extensions of some classes of grammars

    NARCIS (Netherlands)

    Nijholt, Antinus

    Culik and Cohen introduced the class of LR-regular grammars, an extension of the LR(k) grammars. In this report we consider the analogous extension of the LL(k) grammars, called the LL-regular grammars. The relations of this class of grammars to other classes of grammars are shown. Every LL-regular

  17. The LPM effect in sequential bremsstrahlung: dimensional regularization

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, Peter; Chang, Han-Chih [Department of Physics, University of Virginia,382 McCormick Road, Charlottesville, VA 22894-4714 (United States); Iqbal, Shahin [National Centre for Physics,Quaid-i-Azam University Campus, Islamabad, 45320 (Pakistan)

    2016-10-19

    The splitting processes of bremsstrahlung and pair production in a medium are coherent over large distances in the very high energy limit, which leads to a suppression known as the Landau-Pomeranchuk-Migdal (LPM) effect. Of recent interest is the case when the coherence lengths of two consecutive splitting processes overlap (which is important for understanding corrections to standard treatments of the LPM effect in QCD). In previous papers, we have developed methods for computing such corrections without making soft-gluon approximations. However, our methods require consistent treatment of canceling ultraviolet (UV) divergences associated with coincident emission times, even for processes with tree-level amplitudes. In this paper, we show how to use dimensional regularization to properly handle the UV contributions. We also present a simple diagnostic test that any consistent UV regularization method for this problem needs to pass.

  18. High intensity interval and moderate continuous cycle training in a physical education programme improves health-related fitness in young females

    Directory of Open Access Journals (Sweden)

    K Mazurek

    2016-04-01

    Full Text Available The aim of the study was to investigate the effects of eight weeks of regular physical education classes supplemented with high intensity interval cycle exercise (HIIE) or continuous cycle exercise of moderate intensity (CME). Forty-eight collegiate females exercising in two regular physical education classes per week were randomly assigned to two programmes (HIIE, n=24, or CME, n=24) of additional physical activity (one session of 63 minutes per week) for 8 weeks. Participants performed HIIE comprising 2 series of 6x10 s sprinting with maximal pedalling cadence and active recovery pedalling at an intensity of 65%-75% HRmax, or performed CME corresponding to 65%-75% HRmax. Before and after the 8-week programmes, anthropometric data and aerobic and anaerobic capacity were measured. Two-way ANOVA revealed a significant time main effect for VO2max (p<0.001), with similar improvements found in both groups (+12% in HIIE and +11% in CME), despite body mass not changing significantly (p=0.59; +0.4% in HIIE and -0.1% in CME). A significant main time effect was found for relative fat mass (FM) and fat-free mass (FFM) (p<0.001 and p<0.001, respectively). A group x time interaction effect was found for relative FM and FFM (p=0.018 and p=0.018); a greater reduction in FM and a greater increase in FFM were noted in the CME than in the HIIE group. Improvements in anaerobic power were observed in both groups (p<0.001), but they were greater in the HIIE group (interaction effect, p=0.022). Weight loss is not mandatory for exercise-induced improvements in aerobic and anaerobic capacity in collegiate females. Eight weeks of regular physical education classes supplemented with CME sessions are more effective in improving body composition than physical education classes supplemented with HIIE sessions. In contrast to earlier, smaller trials, similar improvements in aerobic capacity were observed following physical activity with additional HIIE or CME sessions.

  19. Near-Regular Structure Discovery Using Linear Programming

    KAUST Repository

    Huang, Qixing

    2014-06-02

    Near-regular structures are common in manmade and natural objects. Algorithmic detection of such regularity greatly facilitates our understanding of shape structures, leads to compact encoding of input geometries, and enables efficient generation and manipulation of complex patterns on both acquired and synthesized objects. Such regularity manifests itself both in the repetition of certain geometric elements, as well as in the structured arrangement of the elements. We cast the regularity detection problem as an optimization and efficiently solve it using linear programming techniques. Our optimization has a discrete aspect, that is, the connectivity relationships among the elements, as well as a continuous aspect, namely the locations of the elements of interest. Both these aspects are captured by our near-regular structure extraction framework, which alternates between discrete and continuous optimizations. We demonstrate the effectiveness of our framework on a variety of problems including near-regular structure extraction, structure-preserving pattern manipulation, and markerless correspondence detection. Robustness results with respect to geometric and topological noise are presented on synthesized, real-world, and also benchmark datasets. © 2014 ACM.

  20. The effect of pouring time on the dimensional accuracy of casts made from different irreversible hydrocolloid impression materials

    Directory of Open Access Journals (Sweden)

    Supneet Singh Wadhwa

    2013-01-01

    Full Text Available Aims and Objectives: To determine the time-dependent accuracy of casts made from three different irreversible hydrocolloids. Materials and Methods: The effect of delayed pouring on the accuracy of three different irreversible hydrocolloid impression materials - regular set CA 37 (Cavex, The Netherlands), regular set chromatic (Jeltrate, Dentsply), and fast set (Hydrogum soft, Zhermack Clinical) - was investigated. A brass master die that contained two identical posts simulating two complete crown-tapered abutment preparations with reference grooves served as a standardized master model. A total of 120 impressions were made using a specially prepared stock-perforated brass tray, with 40 impressions of each material. The impressions were further sub-grouped according to four different storage time intervals: 0 min (immediately), 12 min, 30 min, and 1 h. The impressions were stored at room temperature in a zip-lock plastic bag. Interabutment and intraabutment distances were measured in the recovered stone dies (Type IV, Kalrock) using a profile projector with an accuracy of 0.001 mm. The data so obtained were analyzed statistically. Results: Results of this study showed no statistically significant differences in the accuracy of casts obtained at different time intervals. Conclusion: Because it is not always possible to pour the impression immediately in routine clinical practice, all irreversible hydrocolloid materials studied could be stored in a zip-lock plastic bag for up to 1 h without any significant distortion.

  1. An Intervention to Reduce the Time Interval Between Hospital Entry and Emergency Coronary Angiography in Patients with ST-Elevation Myocardial Infarction.

    Science.gov (United States)

    Karkabi, Basheer; Jaffe, Ronen; Halon, David A; Merdler, Amnon; Khader, Nader; Rubinshtein, Ronen; Goldstein, Jacob; Zafrir, Barak; Zissman, Keren; Ben-Dov, Nissan; Gabrielly, Michael; Fuks, Alex; Shiran, Avinoam; Adawi, Salim; Hellman, Yaron; Shahla, Johny; Halabi, Salim; Flugelman, Moshe Y; Cohen, Shai; Bergman, Irina; Kassem, Sameer; Shapira, Chen

    2017-09-01

    Outcomes of patients with acute ST-elevation myocardial infarction (STEMI) are strongly correlated to the time interval from hospital entry to primary percutaneous coronary intervention (PPCI). Current guidelines recommend a door-to-balloon time of < 90 minutes. Our aim was to reduce the time from hospital admission to PPCI and to increase the proportion of patients treated within 90 minutes. In March 2013 the authors launched a seven-component intervention program: (1) direct patient evacuation by out-of-hospital emergency medical services to the coronary intensive care unit or catheterization laboratory; (2) an education program for the emergency department staff; (3) dissemination of information regarding the urgency of the PPCI decision; (4) activation of the catheterization team by a single phone call; (5) reimbursement of transportation costs for on-call staff who use their own cars; (6) improvement in the quality of medical records; and (7) investigation of failed cases and feedback. During the 14 months prior to the intervention, initiation of catheterization occurred within 90 minutes of hospital arrival in 88/133 patients (65%); during the 18 months following the start of the intervention, the rate was 181/200 (90%) (P < 0.01). The respective mean/median times to treatment were 126/67 minutes and 52/47 minutes (P < 0.01). The intervention also resulted in shortening of the time interval from hospital entry to PPCI on nights and weekends. Following implementation of a comprehensive intervention, the time from hospital admission to PPCI of STEMI patients shortened significantly, as did the proportion of patients treated within 90 minutes of hospital arrival.

  2. Regular Expression Matching and Operational Semantics

    Directory of Open Access Journals (Sweden)

    Asiri Rathnayake

    2011-08-01

    Full Text Available Many programming languages and tools, ranging from grep to the Java String library, contain regular expression matchers. Rather than first translating a regular expression into a deterministic finite automaton, such implementations typically match the regular expression on the fly. Thus they can be seen as virtual machines interpreting the regular expression much as if it were a program with some non-deterministic constructs such as the Kleene star. We formalize this implementation technique for regular expression matching using operational semantics. Specifically, we derive a series of abstract machines, moving from the abstract definition of matching to increasingly realistic machines. First a continuation is added to the operational semantics to describe what remains to be matched after the current expression. Next, we represent the expression as a data structure using pointers, which enables redundant searches to be eliminated via testing for pointer equality. From there, we arrive both at Thompson's lockstep construction and a machine that performs some operations in parallel, suitable for implementation on a large number of cores, such as a GPU. We formalize the parallel machine using process algebra and report some preliminary experiments with an implementation on a graphics processor using CUDA.
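    The flavour of matching a regular expression "on the fly" can be conveyed with a tiny continuation-passing matcher over a regex syntax tree. The sketch below is not one of the abstract machines derived in the paper (it backtracks rather than running in lockstep), and the AST encoding is an assumption made for the example.

```python
def match(regex, s):
    """Backtracking, continuation-passing matcher for a tiny regex AST.
       Nodes: ('lit', c), ('cat', r1, r2), ('alt', r1, r2), ('star', r)."""
    def m(r, i, k):
        kind = r[0]
        if kind == 'lit':
            return i < len(s) and s[i] == r[1] and k(i + 1)
        if kind == 'cat':
            return m(r[1], i, lambda j: m(r[2], j, k))
        if kind == 'alt':
            return m(r[1], i, k) or m(r[2], i, k)
        if kind == 'star':
            # try the empty match first, then one more iteration; requiring j > i
            # guards against zero-progress loops
            return k(i) or m(r[1], i, lambda j: j > i and m(r, j, k))
        raise ValueError(kind)
    return m(regex, 0, lambda i: i == len(s))

if __name__ == "__main__":
    # (a|b)*abb
    r = ('cat', ('star', ('alt', ('lit', 'a'), ('lit', 'b'))),
         ('cat', ('lit', 'a'), ('cat', ('lit', 'b'), ('lit', 'b'))))
    for w in ["abb", "aababb", "ababab", ""]:
        print(repr(w), "->", match(r, w))
```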

  3. EEG/MEG Source Reconstruction with Spatial-Temporal Two-Way Regularized Regression

    KAUST Repository

    Tian, Tian Siva; Huang, Jianhua Z.; Shen, Haipeng; Li, Zhimin

    2013-01-01

    In this work, we propose a spatial-temporal two-way regularized regression method for reconstructing neural source signals from EEG/MEG time course measurements. The proposed method estimates the dipole locations and amplitudes simultaneously

  4. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time- and frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the assistance of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), is used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
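    A multiresolution decomposition of this kind can be reproduced in outline with PyWavelets (assuming the pywt package is available). The sketch below uses the ordinary DWT to split a synthetic daily closing-price series into a smooth trend plus detail components that sum back to the input; MODWT itself is not part of PyWavelets and is not shown, and the wavelet, level and toy series are assumptions.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

def multiresolution(series, wavelet="db4", level=4):
    """Decompose a series into approximation + detail coefficients with the DWT,
       then reconstruct each level separately so the components sum to the input."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(keep, wavelet)[:len(series)])
    return components  # components[0] is the smooth trend, the rest are details

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    days = np.arange(1024)
    # Toy daily closing prices: drift plus a slow cycle plus noise
    close = 1600 + 0.2 * days + 30 * np.sin(days / 60) + rng.normal(0, 5, days.size)
    parts = multiresolution(close)
    print("number of components:", len(parts))
    print("max reconstruction error:", np.max(np.abs(sum(parts) - close)))
```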

  5. Tetravalent one-regular graphs of order 4p²

    DEFF Research Database (Denmark)

    Feng, Yan-Quan; Kutnar, Klavdija; Marusic, Dragan

    2014-01-01

    A graph is one-regular if its automorphism group acts regularly on the set of its arcs. In this paper tetravalent one-regular graphs of order 4p², where p is a prime, are classified.

  6. Multiple graph regularized nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan

    2013-10-01

    Non-negative matrix factorization (NMF) has been widely used as a data representation method based on components. To overcome the disadvantage of NMF in failing to consider the manifold structure of a data set, graph regularized NMF (GrNMF) has been proposed by Cai et al. by constructing an affinity graph and searching for a matrix factorization that respects graph structure. Selecting a graph model and its corresponding parameters is critical for this strategy. This process is usually carried out by cross-validation or discrete grid search, which are time-consuming and prone to overfitting. In this paper, we propose a GrNMF, called MultiGrNMF, in which the intrinsic manifold is approximated by a linear combination of several graphs with different models and parameters inspired by ensemble manifold regularization. Factorization matrices and linear combination coefficients of graphs are determined simultaneously within a unified objective function. They are alternately optimized in an iterative algorithm, thus resulting in a novel data representation algorithm. Extensive experiments on a protein subcellular localization task and an Alzheimer's disease diagnosis task demonstrate the effectiveness of the proposed algorithm. © 2013 Elsevier Ltd. All rights reserved.
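    A simplified sketch of graph-regularized NMF with a fixed combination of graphs is given below; unlike MultiGrNMF it does not learn the combination weights, and it uses the standard multiplicative updates of GrNMF on toy data. The affinity graphs, rank and regularization weight are assumptions made for the example.

```python
import numpy as np

def graph_nmf(X, W_list, mu, k=5, lam=0.1, iters=200, eps=1e-9):
    """Graph-regularized NMF with a fixed convex combination of affinity graphs.
       Objective: ||X - U V^T||_F^2 + lam * Tr(V^T L V),  L = D - W,  W = sum_g mu_g W_g.
       (The combination weights mu are fixed here; MultiGrNMF learns them jointly.)"""
    n, _ = X.shape
    W = sum(m_g * W_g for m_g, W_g in zip(mu, W_list))
    D = np.diag(W.sum(1))
    rng = np.random.default_rng(0)
    U = rng.random((n, k))
    V = rng.random((X.shape[1], k))
    for _ in range(iters):
        U *= (X @ V) / (U @ V.T @ V + eps)
        V *= (X.T @ U + lam * W @ V) / (V @ U.T @ U + lam * D @ V + eps)
    return U, V

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    X = np.abs(rng.normal(size=(40, 100)))               # toy nonnegative data
    # Two symmetric nonnegative affinity graphs over the 100 samples,
    # standing in for graphs built with different models/parameters
    A1 = np.abs(rng.normal(size=(100, 100))); A1 = (A1 + A1.T) / 2
    A2 = np.abs(rng.normal(size=(100, 100))); A2 = (A2 + A2.T) / 2
    U, V = graph_nmf(X, [A1, A2], mu=[0.5, 0.5])
    print("relative reconstruction error:",
          np.linalg.norm(X - U @ V.T) / np.linalg.norm(X))
```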

  7. Regularization and error assignment to unfolded distributions

    CERN Document Server

    Zech, Gunter

    2011-01-01

    The commonly used approach to present unfolded data only in graphical form with the diagonal error depending on the regularization strength is unsatisfactory. It does not permit the adjustment of parameters of theories, the exclusion of theories that are admitted by the observed data and does not allow the combination of data from different experiments. We propose fixing the regularization strength by a p-value criterion, indicating the experimental uncertainties independent of the regularization and publishing the unfolded data in addition without regularization. These considerations are illustrated with three different unfolding and smoothing approaches applied to a toy example.

  8. REGULAR PATTERN MINING (WITH JITTER) ON WEIGHTED-DIRECTED DYNAMIC GRAPHS

    Directory of Open Access Journals (Sweden)

    A. GUPTA

    2017-02-01

    Real-world graphs are mostly dynamic in nature, exhibiting time-varying behaviour in the structure of the graph, the weights on the edges and the direction of the edges. Mining regular patterns in the occurrence of edge parameters gives insight into consumer trends over time in e-commerce co-purchasing networks. But such patterns need not be precise, as when some product goes out of stock or a group of customers becomes unavailable for a short period of time. Ignoring them may lead to loss of useful information, so taking jitter into account becomes vital. To the best of our knowledge, no work has yet been reported that extracts regular patterns considering a jitter of length greater than unity. In this article, we propose a novel method to find quasi-regular patterns on the weight and direction sequences of such graphs. The method analyses the dynamic network while allowing for inconsistencies in the occurrence of edges, and it exploits the relation between the occurrence sequence and the corresponding weight and direction sequences to speed up this process. Further, these patterns are used to determine the most central nodes (such as the most profit-yielding products). To accomplish this we introduce the concepts of dynamic closeness centrality and dynamic betweenness centrality. Experiments on the Enron e-mail dataset and a synthetic dynamic network show that the presented approach is efficient and can be used to find patterns in large-scale networks consisting of many timestamps.
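
    One simple, snapshot-based reading of centrality in a dynamic graph can be sketched with NetworkX: compute closeness and betweenness per timestamp and aggregate over time. This is only an illustrative simplification; the dynamic closeness and betweenness centralities introduced in the article, and its jitter-aware pattern mining, are defined in the paper itself.

```python
# Sketch: per-timestamp centralities aggregated over a dynamic graph.
# The snapshots below are toy data; averaging per-snapshot centralities is an
# illustrative simplification, not the paper's definition of dynamic centrality.
import networkx as nx
from collections import defaultdict

snapshots = [
    [("a", "b", 2.0), ("b", "c", 1.0)],
    [("a", "b", 3.0), ("c", "a", 1.5)],
    [("b", "c", 1.0), ("c", "a", 2.5), ("a", "b", 1.0)],
]

closeness_sum = defaultdict(float)
betweenness_sum = defaultdict(float)
for edges in snapshots:
    g = nx.DiGraph()
    g.add_weighted_edges_from(edges)
    for node, c in nx.closeness_centrality(g).items():
        closeness_sum[node] += c
    for node, b in nx.betweenness_centrality(g, weight="weight").items():
        betweenness_sum[node] += b

t = len(snapshots)
for node in sorted(closeness_sum):
    print(node, closeness_sum[node] / t, betweenness_sum[node] / t)
```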

  9. Interpregnancy intervals: impact of postpartum contraceptive effectiveness and coverage.

    Science.gov (United States)

    Thiel de Bocanegra, Heike; Chang, Richard; Howell, Mike; Darney, Philip

    2014-04-01

    The purpose of this study was to determine the use of contraceptive methods, defined by effectiveness and length of coverage, and their association with short interpregnancy intervals, when controlling for provider type and client demographics. We identified a cohort of 117,644 women from the 2008 California Birth Statistical Master file with a second- or higher-order birth and at least 1 Medicaid (Family Planning, Access, Care, and Treatment [Family PACT] program or Medi-Cal) claim within 18 months after the index birth. We explored the effect of contraceptive method provision on the odds of having an optimal interpregnancy interval and controlled for covariates. The average length of contraceptive coverage was 3.81 months (SD = 4.84). Most women received user-dependent hormonal contraceptives as their most effective contraceptive method (55%; n = 65,103 women) and one-third (33%; n = 39,090 women) had no contraceptive claim. Women who used long-acting reversible contraceptive methods had 3.89 times the odds and women who used user-dependent hormonal methods had 1.89 times the odds of achieving an optimal birth interval compared with women who used barrier methods only; women with no method had 0.66 times the odds. When user-dependent methods are considered, the odds of having an optimal birth interval increased for each additional month of contraceptive coverage by 8% (odds ratio, 1.08; 95% confidence interval, 1.08-1.09). Women who were seen by Family PACT or by both Family PACT and Medi-Cal providers had significantly higher odds of optimal birth intervals compared with women who were served by Medi-Cal only. To achieve optimal birth spacing and ultimately to improve birth outcomes, attention should be given to contraceptive counseling and access to contraceptive methods in the postpartum period. Copyright © 2014 Mosby, Inc. All rights reserved.

  10. Prognostic Value of Cardiac Time Intervals by Tissue Doppler Imaging M-Mode in Patients With Acute ST-Segment-Elevation Myocardial Infarction Treated With Primary Percutaneous Coronary Intervention

    DEFF Research Database (Denmark)

    Biering-Sørensen, Tor; Mogelvang, Rasmus; Søgaard, Peter

    2013-01-01

    Background- Color tissue Doppler imaging M-mode through the mitral leaflet is an easy and precise method to estimate all cardiac time intervals from 1 cardiac cycle and thereby obtain the myocardial performance index (MPI). However, the prognostic value of the cardiac time intervals and the MPI...... assessed by color tissue Doppler imaging M-mode through the mitral leaflet in patients with ST-segment-elevation myocardial infarction (MI) is unknown. Methods and Results- In total, 391 patients were admitted with an ST-segment-elevation MI, treated with primary percutaneous coronary intervention...
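
    For reference, the myocardial performance (Tei) index is conventionally computed from the cardiac time intervals as the sum of the isovolumic contraction and relaxation times divided by the ejection time; the helper below uses that standard definition with made-up example values and is not taken from the paper.

```python
# Standard (Tei) myocardial performance index from cardiac time intervals.
# MPI = (IVCT + IVRT) / ET.  The example values are made up for illustration.
def myocardial_performance_index(ivct_ms: float, ivrt_ms: float, et_ms: float) -> float:
    return (ivct_ms + ivrt_ms) / et_ms

print(myocardial_performance_index(ivct_ms=70.0, ivrt_ms=90.0, et_ms=300.0))  # ~0.53
```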

  11. Chaos on the interval

    CERN Document Server

    Ruette, Sylvie

    2017-01-01

    The aim of this book is to survey the relations between the various kinds of chaos and related notions for continuous interval maps from a topological point of view. The papers on this topic are numerous and widely scattered in the literature; some of them are little known, difficult to find, or originally published in Russian, Ukrainian, or Chinese. Dynamical systems given by the iteration of a continuous map on an interval have been broadly studied because they are simple but nevertheless exhibit complex behaviors. They also allow numerical simulations, which enabled the discovery of some chaotic phenomena. Moreover, the "most interesting" part of some higher-dimensional systems can be of lower dimension, which allows, in some cases, boiling it down to systems in dimension one. Some of the more recent developments such as distributional chaos, the relation between entropy and Li-Yorke chaos, sequence entropy, and maps with infinitely many branches are presented in book form for the first time. The author gi...
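
    As a generic illustration of the one-dimensional interval dynamics the book surveys, the logistic map on [0, 1] already exhibits sensitive dependence on initial conditions in its chaotic regime; the snippet below is a standard textbook example, not material from the book.

```python
# Standard illustration: the logistic map x -> r*x*(1-x) on [0, 1].
# Two nearby initial conditions separate quickly in the chaotic regime (r = 4).
def iterate_logistic(x0: float, r: float, n: int) -> list[float]:
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = iterate_logistic(0.20000, 4.0, 30)
b = iterate_logistic(0.20001, 4.0, 30)
for k in (0, 10, 20, 30):
    print(k, abs(a[k] - b[k]))   # the gap grows from 1e-5 to order 1
```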

  12. Time and frequency characteristics of temporary threshold shifts caused by pure tone exposures

    DEFF Research Database (Denmark)

    Ordoñez, Rodrigo Pizarro; Hammershøi, Dorte

    2011-01-01

    The time-frequency characteristics of Temporary Threshold Shifts (TTS) caused by pure tones were determined using the Békésy audiometric method with narrow-band noise of short duration as the probe stimuli. Two experiments were done using exposures of 3 min at 100 dB above threshold. In the first....... In the second experiment, the TTS recovery curve produced by a 1 kHz pure tone exposure was assessed at 1.5 kHz, at approximately 15 s intervals for the first 5 min and at regularly increasing intervals up to 45 min after the exposure. The results showed a maximum in the recovery around 2 min after the exposure....... The data gathered in these experiments were used to construct a mathematical model of TTS recovery. The model describes both the 1/2-octave shift and the 2 min bounce and it can be used in the comparison of temporary changes in auditory function, assessed at different times and frequencies....

  13. Higher order total variation regularization for EIT reconstruction.

    Science.gov (United States)

    Gong, Bo; Schullcke, Benjamin; Krueger-Ziolek, Sabine; Zhang, Fan; Mueller-Lisse, Ullrich; Moeller, Knut

    2018-01-08

    Electrical impedance tomography (EIT) attempts to reveal the conductivity distribution of a domain based on the electrical boundary condition. This is an ill-posed inverse problem; its solution is very unstable. Total variation (TV) regularization is one of the techniques commonly employed to stabilize reconstructions. However, it is well known that TV regularization induces staircase effects, which are not realistic in clinical applications. To reduce such artifacts, modified TV regularization terms considering a higher order differential operator were developed in several previous studies. One of them is called total generalized variation (TGV) regularization. TGV regularization has been successfully applied in image processing in a regular grid context. In this study, we adapted TGV regularization to the finite element model (FEM) framework for EIT reconstruction. Reconstructions using simulation and clinical data were performed. First results indicate that, in comparison to TV regularization, TGV regularization promotes more realistic images. Graphical abstract: reconstructed conductivity changes along selected vertical lines. For each reconstructed image, as well as the ground truth image, conductivity changes along the selected left and right vertical lines are plotted; in these plots GT stands for ground truth, TV for the total variation method and TGV for the total generalized variation method. Reconstructed conductivity distributions from the GREIT algorithm are also demonstrated.
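
    The effect of a total-variation penalty on an ill-posed linear problem can be sketched in one dimension with a smoothed-TV gradient descent; the blurring operator, the piecewise-constant target and the penalty weight below are assumptions of this illustration, not the paper's FEM-based EIT implementation of TV or TGV.

```python
# Sketch: 1-D ill-posed reconstruction with a smoothed total-variation penalty,
# minimizing ||A x - y||^2 + lam * sum sqrt((D x)^2 + eps) by gradient descent.
# The blurring operator A, the piecewise-constant truth and lam are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 60
truth = np.zeros(n)
truth[20:40] = 1.0                                        # piecewise-constant "conductivity change"
A = np.array([[np.exp(-0.5 * (i - j) ** 2 / 9.0) for j in range(n)] for i in range(n)])
A /= A.sum(axis=1, keepdims=True)
y = A @ truth + rng.normal(0, 0.01, size=n)

D = np.diff(np.eye(n), axis=0)                            # first-difference operator
lam, eps, step = 0.05, 1e-3, 0.1
x = np.zeros(n)
for _ in range(3000):
    diffs = D @ x
    grad_tv = D.T @ (diffs / np.sqrt(diffs**2 + eps))     # gradient of smoothed TV
    grad = 2 * A.T @ (A @ x - y) + lam * grad_tv
    x -= step * grad
print("data misfit:", np.linalg.norm(A @ x - y), " recovered jump:", x[30] - x[5])
```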

  14. Regularized Label Relaxation Linear Regression.

    Science.gov (United States)

    Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu

    2018-04-01

    Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs the class compactness graph based on manifold learning and uses it as the regularization item to avoid the problem of overfitting. The class compactness graph is used to ensure that the samples sharing the same labels can be kept close after they are transformed. Two different algorithms, which are, respectively, based on -norm and -norm loss functions are devised. These two algorithms have compact closed-form solutions in each iteration so that they are easily implemented. Extensive experiments show that these two algorithms outperform the state-of-the-art algorithms in terms of the classification accuracy and running time.
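
    The label-relaxation idea itself can be sketched in a few lines: alternate between a ridge regression onto a relaxed target matrix and a non-negative update of the relaxation. The sketch below omits the class compactness graph regularizer and uses made-up data; it illustrates the general technique rather than the paper's exact algorithm.

```python
# Sketch: label relaxation for least-squares classification (simplified).
# Target matrix T = Y + B * M, where B encodes the allowed relaxation direction
# (+1 for the true class, -1 otherwise) and M >= 0 is learned.  The class
# compactness graph regularizer of the paper is omitted here for brevity.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                      # samples x features (toy data)
labels = (X[:, 0] + 0.1 * rng.normal(size=100) > 0).astype(int)
Y = np.eye(2)[labels]                              # one-hot label matrix
B = np.where(Y == 1, 1.0, -1.0)                    # relaxation directions
lam = 1.0                                          # ridge penalty (assumed value)

M = np.zeros_like(Y)
for _ in range(20):
    T = Y + B * M                                  # relaxed targets
    W = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ T)   # ridge regression step
    M = np.maximum(B * (X @ W - Y), 0.0)           # non-negative relaxation update

pred = np.argmax(X @ W, axis=1)
print("training accuracy:", (pred == labels).mean())
```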

  15. A Time Interval of More Than 18 Months Between a Pregnancy and a Roux-en-Y Gastric Bypass Increases the Risk of Iron Deficiency and Anaemia in Pregnancy.

    Science.gov (United States)

    Crusell, Mie; Nilas, Lisbeth; Svare, Jens; Lauenborg, Jeannet

    2016-10-01

    The aim of the study is to explore the impact of the time between Roux-en-Y gastric bypass (RYGB) and pregnancy on obstetrical outcome and nutritional derangements. In a retrospective cross-sectional study of pregnant women admitted for antenatal care at two tertiary hospitals, we examined 153 women with RYGB and a singleton pregnancy of at least 24 weeks. The women were stratified according to a surgery-to-pregnancy interval of less than or more than 18 months and compared with respect to gestational hypertension, length of pregnancy, mode of delivery and foetal birth weight. The two groups were comparable regarding age, parity and prepregnancy body mass index. The frequency of iron deficiency anaemia (defined by low ferritin) was higher in the group with the longer surgery-to-pregnancy interval, whereas there was no difference in pregnancy outcome or birth weight between the two groups. A long surgery-to-pregnancy time interval after a RYGB increases the risk of iron deficiency anaemia but not of other nutritional deficits. The time interval does not seem to have an adverse effect on the obstetrical outcome, including intrauterine growth restriction. Specific attention is needed on iron deficit with increasing surgery-to-pregnancy time interval.

  16. Application of Turchin's method of statistical regularization

    Science.gov (United States)

    Zelenyi, Mikhail; Poliakova, Mariia; Nozik, Alexander; Khudyakov, Alexey

    2018-04-01

    During analysis of experimental data, one usually needs to restore a signal after it has been convolved with some kind of apparatus function. According to Hadamard's definition this problem is ill-posed and requires regularization to provide sensible results. In this article we describe an implementation of Turchin's method of statistical regularization based on the Bayesian approach to the regularization strategy.
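
    As a rough illustration of regularized deconvolution in a Bayesian spirit (not the specific Turchin implementation described in the article), one can place a Gaussian smoothness prior on the unknown signal and take the posterior mean; the apparatus function, the prior strength and the toy signal below are assumptions.

```python
# Sketch: Bayesian-flavoured (Tikhonov-like) deconvolution.  With Gaussian noise
# and a Gaussian smoothness prior exp(-0.5*alpha*||Omega f||^2), the posterior
# mean is f = (K^T K / s^2 + alpha * Omega^T Omega)^{-1} K^T y / s^2.
# The kernel K, alpha and the toy signal are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 50
t = np.linspace(0, 1, n)
f_true = np.sin(2 * np.pi * t) ** 2
K = np.array([[np.exp(-0.5 * (i - j) ** 2 / 16.0) for j in range(n)] for i in range(n)])
K /= K.sum(axis=1, keepdims=True)                 # apparatus (smearing) function
s = 0.01
y = K @ f_true + rng.normal(0, s, size=n)

Omega = np.diff(np.eye(n), 2, axis=0)             # curvature operator (smoothness prior)
alpha = 1.0
f_post = np.linalg.solve(K.T @ K / s**2 + alpha * Omega.T @ Omega,
                         K.T @ y / s**2)
print("relative reconstruction error:",
      np.linalg.norm(f_post - f_true) / np.linalg.norm(f_true))
```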

  17. On the regularized fermionic projector of the vacuum

    Science.gov (United States)

    Finster, Felix

    2008-03-01

    We construct families of fermionic projectors with spherically symmetric regularization, which satisfy the condition of a distributional MP-product. The method is to analyze regularization tails with a power law or logarithmic scaling in composite expressions in the fermionic projector. The resulting regularizations break the Lorentz symmetry and give rise to a multilayer structure of the fermionic projector near the light cone. Furthermore, we construct regularizations which go beyond the distributional MP-product in that they yield additional distributional contributions supported at the origin. The remaining freedom for the regularization parameters and the consequences for the normalization of the fermionic states are discussed.

  18. On the regularized fermionic projector of the vacuum

    International Nuclear Information System (INIS)

    Finster, Felix

    2008-01-01

    We construct families of fermionic projectors with spherically symmetric regularization, which satisfy the condition of a distributional MP-product. The method is to analyze regularization tails with a power law or logarithmic scaling in composite expressions in the fermionic projector. The resulting regularizations break the Lorentz symmetry and give rise to a multilayer structure of the fermionic projector near the light cone. Furthermore, we construct regularizations which go beyond the distributional MP-product in that they yield additional distributional contributions supported at the origin. The remaining freedom for the regularization parameters and the consequences for the normalization of the fermionic states are discussed

  19. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
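
    The bounding idea for moments of interval data can be sketched directly: bounds on the mean follow from the interval endpoints, while bounds on the variance can be sought with a continuous optimizer over the box defined by the intervals. The toy intervals and the use of SciPy's L-BFGS-B are assumptions of this illustration, and the variance maximum found this way is only a local estimate, unlike the paper's bounding algorithms.

```python
# Sketch: bounds on the first two moments of interval data [l_i, u_i].
# The mean bounds are exact; the variance bounds are sought by continuous
# optimization over the box of interval values (toy data, illustrative only;
# the maximization is non-concave, so the result is a local estimate).
import numpy as np
from scipy.optimize import minimize

intervals = np.array([[1.0, 2.0], [1.5, 3.0], [2.5, 2.8], [0.5, 1.2]])
lo, hi = intervals[:, 0], intervals[:, 1]

mean_bounds = (lo.mean(), hi.mean())              # exact bounds on the mean

def var(x):
    return np.var(x)

x0 = intervals.mean(axis=1)
bounds = list(zip(lo, hi))
vmin = minimize(var, x0, bounds=bounds, method="L-BFGS-B").fun
vmax = -minimize(lambda x: -var(x), x0, bounds=bounds, method="L-BFGS-B").fun
print("mean bounds:", mean_bounds)
print("variance bounds (approximate):", (vmin, vmax))
```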

  20. Why is the Motivation of Non-Regular Employees Not Low? From the Viewpoints of Equity Theory and Social Comparison Processes Theory

    OpenAIRE

    平野, 光俊; 笠谷, 千佳

    2017-01-01

    The aim of this study is to use perspectives from equity theory and social comparison to explain the reason why non-regular employees’ motivation is not low, despite working at relatively low pay compared to regular employees. To achieve this, the study conducted a questionnaire survey of regular (full-time) and part-time employees of a grocery store chain retail business. The results indicated the following: (1) part-time workers have greater motivation, affective commitment, and job satisfa...