WorldWideScience

Sample records for short computational time

  1. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

    Full Text Available The short-time linear canonical transform (STLCT), which maps a time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications remain limited because the selection of coefficients of the short-time linear canonical series (STLCS) is not unique: the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual-window solution. First, the nonorthogonality of the original window is resolved by imposing an orthogonality condition on a dual window. Then, based on the obtained condition, a dual-window computation approach for the Gabor transform (GT) is extended to the STLCS. In addition, simulations verify the validity of the proposed condition and solution. Furthermore, some possible directions for application are discussed.

  2. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

    Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
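
The LMSE of this record is obtained analytically from state-space parameters; for orientation, the standard nonparametric MSE pipeline it is compared against (coarse-graining followed by sample entropy, with the usual m = 2, r = 0.2·SD conventions) can be sketched roughly as follows. Function names and the synthetic data are illustrative assumptions, not the authors' code.

```python
import numpy as np

def coarse_grain(x, scale):
    """Standard MSE coarse-graining: average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) with an absolute tolerance r (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)

    def matches(mm):
        # embed the series in mm-dimensional delay vectors
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # count ordered pairs i != j with distance <= r
        return np.sum(d <= r) - len(emb)

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
white = rng.standard_normal(1500)
r = 0.2 * white.std()  # tolerance fixed from the original series, as in Costa's MSE
mse = [sample_entropy(coarse_grain(white, s), m=2, r=r) for s in (1, 2, 3, 4)]
# for white noise, the estimated entropy decreases with scale
```

The short-series unreliability the abstract mentions shows up here directly: at scale 4 only 375 coarse-grained samples remain, so the match counts behind SampEn become noisy.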

  3. Short-term effects of playing computer games on attention.

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-05-01

    The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and in normal controls. One hundred one children (aged between 9 and 12 years) were recruited for the study. All participants played a motor-racing game on the computer for 1 hour. The TBAG form of the Stroop task was administered to all participants twice: before playing and immediately after playing the game. Participants with improved posttest scores, compared to their pretest scores, used the computer on average 0.67 +/- 1.1 hr/day, while average daily computer use was 1.6 +/- 1.4 hr/day and 1.3 +/- 0.9 hr/day for participants with worsened or unaltered scores, respectively. According to the regression model, male gender, younger age, duration of daily computer use, and ADHD inattentive type were found to be independent risk factors for worsened posttest scores. Time spent playing computer games can exert a short-term effect on attention as measured by the Stroop test.

  4. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator

    Science.gov (United States)

    Ma, J.; Liu, Q.

    2018-02-01

    This paper presents an improved short circuit calculation method, based on a pre-computed surface, for determining the short circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short circuit current injected into the power grid by a DFIG is determined by its low voltage ride through (LVRT) control and protection under grid fault. However, existing methods are too complex for calculating the DFIG short circuit current in engineering practice. The proposed method develops the surface of short circuit current as a function of the calculating impedance and the open circuit voltage, and derives the short circuit currents taking into account the rotor excitation and the crowbar activation time. Finally, the pre-computed surfaces of short circuit current at different times were established, and a procedure for DFIG short circuit calculation considering LVRT was designed. The correctness of the proposed method was verified by simulation.
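
The core trade-off — one expensive offline computation of the current surface, then cheap online interpolation — can be illustrated with a toy stand-in. The function `detailed_model`, the per-unit grids, and the bilinear lookup below are illustrative assumptions; the paper's surface comes from full DFIG LVRT dynamics, not from this formula.

```python
import numpy as np

# Hypothetical stand-in for the offline simulation: in the paper, the surface
# I(Z, U) of short circuit current vs. calculating impedance and open circuit
# voltage is pre-computed from detailed LVRT dynamics.
def detailed_model(z, u):
    return u / np.sqrt(z**2 + 0.04)  # illustrative placeholder only

z_grid = np.linspace(0.1, 2.0, 40)   # calculating impedance (p.u.)
u_grid = np.linspace(0.5, 1.2, 30)   # open circuit voltage (p.u.)
surface = detailed_model(z_grid[:, None], u_grid[None, :])  # pre-computed once

def lookup(z, u):
    """Fast online step: bilinear interpolation on the pre-computed surface."""
    i = int(np.clip(np.searchsorted(z_grid, z) - 1, 0, len(z_grid) - 2))
    j = int(np.clip(np.searchsorted(u_grid, u) - 1, 0, len(u_grid) - 2))
    tz = (z - z_grid[i]) / (z_grid[i + 1] - z_grid[i])
    tu = (u - u_grid[j]) / (u_grid[j + 1] - u_grid[j])
    return ((1 - tz) * (1 - tu) * surface[i, j] + tz * (1 - tu) * surface[i + 1, j]
            + (1 - tz) * tu * surface[i, j + 1] + tz * tu * surface[i + 1, j + 1])

approx = lookup(0.73, 0.95)          # online query
exact = detailed_model(0.73, 0.95)   # what the expensive model would return
```

On a smooth surface the interpolation error is second order in the grid spacing, which is what makes the pre-computed table usable in place of the detailed model.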

  5. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...
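
All four methods in this record consume the same kind of input: lagged values of the load series mapped to the next value. As a minimal illustration of that setup (a plain linear autoregression on synthetic data — deliberately not one of the paper's four techniques), one might write:

```python
import numpy as np

def make_lagged(series, n_lags):
    """Turn a load series into (X, y) pairs: n_lags past values -> next value."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

# Synthetic hourly "load" with a daily cycle plus noise (illustrative data only).
rng = np.random.default_rng(1)
t = np.arange(24 * 60)
load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

X, y = make_lagged(load, n_lags=24)
X1 = np.hstack([X, np.ones((len(X), 1))])              # add an intercept column
w = np.linalg.lstsq(X1[:-24], y[:-24], rcond=None)[0]  # fit on all but the last day
pred = X1[-24:] @ w                                    # forecast the held-out day
mae = np.mean(np.abs(pred - y[-24:]))                  # mean absolute error
```

Random Forest, neural, SVM, or fuzzy models would simply replace the least-squares fit on the same (X, y) pairs.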

  6. Neural Computations in a Dynamical System with Multiple Time Scales.

    Science.gov (United States)

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what computational benefit the brain derives from such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered, which are persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  7. Computation of a long-time evolution in a Schroedinger system

    International Nuclear Information System (INIS)

    Girard, R.; Kroeger, H.; Labelle, P.; Bajzer, Z.

    1988-01-01

    We compare different techniques for the computation of a long-time evolution and the S matrix in a Schroedinger system. As an application we consider a two-nucleon system interacting via the Yamaguchi potential. We suggest computation of the time evolution for a very short time using Pade approximants, the long-time evolution being obtained by iterative squaring. Within the technique of strong approximation of Moller wave operators (SAM) we compare our calculation with computation of the time evolution in the eigenrepresentation of the Hamiltonian and with the standard Lippmann-Schwinger solution for the S matrix. We find numerical agreement between these alternative methods for time-evolution computation up to half the number of digits of internal machine precision, and fairly rapid convergence of both techniques towards the Lippmann-Schwinger solution.
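
The short-time-Padé-plus-iterative-squaring idea can be sketched for a finite-dimensional Hamiltonian: a Cayley step (the (1,1) Padé approximant of the exponential) approximates exp(−iHΔt), and repeated squaring promotes it to the full evolution, which can then be checked against the eigenrepresentation exactly as the abstract describes. The 4×4 random Hermitian H is an illustrative stand-in, not the Yamaguchi-potential system.

```python
import numpy as np

def short_time_propagator(H, dt):
    """Cayley / Pade (1,1) approximant of exp(-i H dt); exactly unitary for Hermitian H."""
    n = H.shape[0]
    A = 0.5j * dt * H
    return np.linalg.solve(np.eye(n) + A, np.eye(n) - A)

def long_time_propagator(H, t, n_squarings=20):
    """Approximate exp(-i H t): one short-time Pade step, then iterative squaring."""
    U = short_time_propagator(H, t / 2**n_squarings)
    for _ in range(n_squarings):
        U = U @ U
    return U

# Small Hermitian test Hamiltonian.
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (M + M.conj().T) / 2

U = long_time_propagator(H, t=5.0)
# compare against the eigenrepresentation, as in the abstract
w, V = np.linalg.eigh(H)
U_exact = V @ np.diag(np.exp(-1j * w * 5.0)) @ V.conj().T
err = np.abs(U - U_exact).max()
```

Because the Cayley step is unitary, the error does not grow under squaring beyond the accumulated local truncation error, which is what makes the scheme stable over long times.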

  8. The short time Fourier transform and local signals

    Science.gov (United States)

    Okumura, Shuhei

    In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform over a fixed-size moving window to the input series. We move the window by one time point at a time, so we have overlapping windows. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and their outputs in closed forms. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main point is that a white noise time series input results in the STFT output being a complex-valued stationary time series, and we can derive the time and time-frequency dependency structure such as the cross-covariance functions. Our primary focus is the detection of local periodic signals. I present a method to detect local signals by computing the probability that the squared-modulus STFT time series has a run of consecutive values exceeding some threshold, immediately after one exceeding observation that follows an observation below the threshold. We discuss a method to reduce the computation of such probabilities by the Box-Cox transformation and the delta method, and show that it works well in comparison to the Monte Carlo simulation method.
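
The hop-one overlapping-window STFT and the run-above-threshold detection rule can be sketched as follows. The window length, the threshold rule, and the run length of 5 are illustrative choices, not the thesis's calibrated procedure.

```python
import numpy as np

def stft_mod2(x, win_len):
    """Squared-modulus STFT with a rectangular window moved one sample at a time."""
    frames = np.array([x[i:i + win_len] for i in range(len(x) - win_len + 1)])
    return np.abs(np.fft.rfft(frames, axis=1))**2

rng = np.random.default_rng(3)
n, win = 1000, 64
x = rng.standard_normal(n)
# local periodic burst at bin-8 frequency, samples 400-499
x[400:500] += 3 * np.sin(2 * np.pi * 8 * np.arange(100) / win)

power = stft_mod2(x, win)[:, 8]            # squared-modulus series at the burst bin
thresh = np.quantile(power[:300], 0.99)    # threshold from a signal-free stretch
runs = power > thresh
# flag a detection when 5 consecutive windows exceed the threshold
detected = np.flatnonzero(np.convolve(runs, np.ones(5), mode="valid") == 5)
```

Because adjacent windows share all but one sample, exceedances are strongly correlated — which is exactly why the thesis computes run probabilities rather than treating exceedances as independent.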

  9. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.

  10. Significant decimal digits for energy representation on short-word computers

    International Nuclear Information System (INIS)

    Sartori, E.

    1989-01-01

    The general belief that single precision floating point numbers always have at least seven significant decimal digits on short-word computers, such as IBM machines, is erroneous. Seven significant digits are, however, required for representing the energy variable in nuclear cross-section data sets containing sharp p-wave resonances at 0 Kelvin. It is suggested either that the energy variable be stored in double precision or that cross-section resonances be reconstructed to room temperature or higher on short-word computers.
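
The roughly-seven-digit limit follows from the 24-bit IEEE binary32 significand: 24·log₁₀2 ≈ 7.2 decimal digits. A quick numeric check, using an arbitrary illustrative energy value (not one taken from a cross-section library):

```python
import numpy as np

e = 2345.6781          # an energy value needing 8 significant decimal digits
e32 = np.float32(e)    # single precision: 24-bit significand

rel_err = abs(float(e32) - e) / e
digits = -np.log10(rel_err)  # decimal digits actually preserved (~7.2 at best)
# double precision (53-bit significand) keeps ~15-16 digits, hence the
# suggestion to store the energy variable in double precision
```

Two neighbouring resonance-grid energies differing only in the eighth digit can therefore collapse to the same single-precision value.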

  11. Short-Term Effects of Playing Computer Games on Attention

    Science.gov (United States)

    Tahiroglu, Aysegul Yolga; Celik, Gonca Gul; Avci, Ayse; Seydaoglu, Gulsah; Uzel, Mehtap; Altunbas, Handan

    2010-01-01

    Objective: The main aim of the present study is to investigate the short-term cognitive effects of computer games in children with different psychiatric disorders and normal controls. Method: One hundred one children are recruited for the study (aged between 9 and 12 years). All participants played a motor-racing game on the computer for 1 hour.…

  12. Several problems of algorithmization in integrated computation programs on third generation computers for short circuit currents in complex power networks

    Energy Technology Data Exchange (ETDEWEB)

    Krylov, V.A.; Pisarenko, V.P.

    1982-01-01

    Methods of modeling complex power networks with short circuits in the networks are described. The methods are implemented in integrated computation programs for short circuit currents and equivalents in electrical networks with a large number of branch points (up to 1000) on a computer with a limited online memory capacity (M equals 4030 for the computer).

  13. Potential barrier classification by short-time measurement

    International Nuclear Information System (INIS)

    Granot, Er'el; Marchewka, Avi

    2006-01-01

    We investigate the short-time dynamics of a delta-function potential barrier on an initially confined wave packet. There are mainly two conclusions: (A) At short times the probability density of the first particles that passed through the barrier is unaffected by it. (B) When the barrier is absorptive (i.e., its potential is imaginary) it affects the transmitted wave function at shorter times than a real potential barrier. Therefore, it is possible to distinguish between an imaginary and a real potential barrier by measuring its effect at short times only on the transmitted wave function.

  14. Potential barrier classification by short-time measurement

    Science.gov (United States)

    Granot, Er'El; Marchewka, Avi

    2006-03-01

    We investigate the short-time dynamics of a delta-function potential barrier on an initially confined wave packet. There are mainly two conclusions: (A) At short times the probability density of the first particles that passed through the barrier is unaffected by it. (B) When the barrier is absorptive (i.e., its potential is imaginary) it affects the transmitted wave function at shorter times than a real potential barrier. Therefore, it is possible to distinguish between an imaginary and a real potential barrier by measuring its effect at short times only on the transmitted wave function.

  15. Quantum computer based on activated dielectric nanoparticles selectively interacting with short optical pulses

    International Nuclear Information System (INIS)

    Gadomskii, Oleg N; Kharitonov, Yu Ya

    2004-01-01

    The operation principle of a quantum computer is proposed, based on a system of dielectric nanoparticles activated with two-level atoms (qubits), in which electric dipole transitions are excited by short intense optical pulses. It is proved that the logical operation (logical operator) CNOT (controlled NOT) is performed by means of time-dependent transfer of quantum information over 'long' (of the order of 10⁴ nm) distances between spherical nanoparticles, owing to the delayed interaction between them in the optical radiation field. It is shown that the one-qubit and two-qubit logical operators required for quantum calculations can be realised by selectively exciting dielectric particles with short optical pulses.

  16. Computing three-point functions for short operators

    International Nuclear Information System (INIS)

    Bargheer, Till; Institute for Advanced Study, Princeton, NJ; Minahan, Joseph A.; Pereira, Raul

    2013-11-01

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  17. Computing three-point functions for short operators

    Energy Technology Data Exchange (ETDEWEB)

    Bargheer, Till [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Institute for Advanced Study, Princeton, NJ (United States). School of Natural Sciences; Minahan, Joseph A.; Pereira, Raul [Uppsala Univ. (Sweden). Dept. of Physics and Astronomy

    2013-11-15

    We compute the three-point structure constants for short primary operators of N=4 super Yang-Mills theory to leading order in 1/√(λ) by mapping the problem to a flat-space string theory calculation. We check the validity of our procedure by comparing to known results for three chiral primaries. We then compute the three-point functions for any combination of chiral and non-chiral primaries, with the non-chiral primaries all dual to string states at the first massive level. Along the way we find many cancellations that leave us with simple expressions, suggesting that integrability is playing an important role.

  18. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  19. Short time ahead wind power production forecast

    International Nuclear Information System (INIS)

    Sapronova, Alla; Meissner, Catherine; Mana, Matteo

    2016-01-01

    An accurate prediction of wind power output is crucial for the efficient coordination of cooperative energy production from different sources. Long-time-ahead prediction (from 6 to 24 hours) of wind power for onshore parks can be achieved with a coupled model that bridges mesoscale weather prediction data and computational fluid dynamics. When a forecast for a shorter time horizon (less than one hour ahead) is required, the accuracy of a predictive model that relies on hourly weather data decreases, because the higher-frequency fluctuations of the wind speed are lost when the data are averaged over an hour. Since the wind speed can vary by up to 50% in magnitude over a period of 5 minutes, these higher-frequency variations of wind speed and direction have to be taken into account for an accurate short-term-ahead energy production forecast. In this work a new model for wind power production forecasts 5 to 30 minutes ahead is presented. The model is based on machine learning techniques and a categorization approach, and uses the historical park production time series together with the hourly numerical weather forecast. (paper)

  20. Short time ahead wind power production forecast

    Science.gov (United States)

    Sapronova, Alla; Meissner, Catherine; Mana, Matteo

    2016-09-01

    An accurate prediction of wind power output is crucial for the efficient coordination of cooperative energy production from different sources. Long-time-ahead prediction (from 6 to 24 hours) of wind power for onshore parks can be achieved with a coupled model that bridges mesoscale weather prediction data and computational fluid dynamics. When a forecast for a shorter time horizon (less than one hour ahead) is required, the accuracy of a predictive model that relies on hourly weather data decreases, because the higher-frequency fluctuations of the wind speed are lost when the data are averaged over an hour. Since the wind speed can vary by up to 50% in magnitude over a period of 5 minutes, these higher-frequency variations of wind speed and direction have to be taken into account for an accurate short-term-ahead energy production forecast. In this work a new model for wind power production forecasts 5 to 30 minutes ahead is presented. The model is based on machine learning techniques and a categorization approach, and uses the historical park production time series together with the hourly numerical weather forecast.

  1. Short-time quantum dynamics of sharp boundaries potentials

    Energy Technology Data Exchange (ETDEWEB)

    Granot, Er' el, E-mail: erel@ariel.ac.il; Marchewka, Avi

    2015-02-15

    Despite the high prevalence of singular potentials in general, and rectangular potentials in particular, in applied scattering models, little is known to date about their short-time effects. The reason is that singular potentials cause a mixture of complicated local as well as non-local effects. The object of this work is to derive a generic method for calculating analytically the short-time impact of any singular potential. In this paper it is shown that the scattering of a smooth wavefunction on a singular potential is, in the short-time regime, totally equivalent to the free propagation of a singular wavefunction. The latter problem was fully addressed analytically in Ref. [7], so this equivalence can be utilized to solve analytically the short-time dynamics of any smooth wavefunction in the presence of a singular potential. In particular, with this method the short-time dynamics of any problem in which a sharp-boundary potential (e.g., a rectangular barrier) is turned on instantaneously can easily be solved analytically.

  2. Short-time quantum dynamics of sharp boundaries potentials

    Science.gov (United States)

    Granot, Er'el; Marchewka, Avi

    2015-02-01

    Despite the high prevalence of singular potentials in general, and rectangular potentials in particular, in applied scattering models, little is known to date about their short-time effects. The reason is that singular potentials cause a mixture of complicated local as well as non-local effects. The object of this work is to derive a generic method for calculating analytically the short-time impact of any singular potential. In this paper it is shown that the scattering of a smooth wavefunction on a singular potential is, in the short-time regime, totally equivalent to the free propagation of a singular wavefunction. The latter problem was fully addressed analytically in Ref. [7], so this equivalence can be utilized to solve analytically the short-time dynamics of any smooth wavefunction in the presence of a singular potential. In particular, with this method the short-time dynamics of any problem in which a sharp-boundary potential (e.g., a rectangular barrier) is turned on instantaneously can easily be solved analytically.

  3. Short-time quantum dynamics of sharp boundaries potentials

    International Nuclear Information System (INIS)

    Granot, Er'el; Marchewka, Avi

    2015-01-01

    Despite the high prevalence of singular potentials in general, and rectangular potentials in particular, in applied scattering models, little is known to date about their short-time effects. The reason is that singular potentials cause a mixture of complicated local as well as non-local effects. The object of this work is to derive a generic method for calculating analytically the short-time impact of any singular potential. In this paper it is shown that the scattering of a smooth wavefunction on a singular potential is, in the short-time regime, totally equivalent to the free propagation of a singular wavefunction. The latter problem was fully addressed analytically in Ref. [7], so this equivalence can be utilized to solve analytically the short-time dynamics of any smooth wavefunction in the presence of a singular potential. In particular, with this method the short-time dynamics of any problem in which a sharp-boundary potential (e.g., a rectangular barrier) is turned on instantaneously can easily be solved analytically.

  4. Neural Computations in a Dynamical System with Multiple Time Scales

    Directory of Open Access Journals (Sweden)

    Yuanyuan Mi

    2016-09-01

    Full Text Available Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at single neurons, and short-term facilitation (STF) and depression (STD) at neuronal synapses. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what the computational benefit for the brain of such variability in short-term dynamics is. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered, which are persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  5. Adaptive synchrosqueezing based on a quilted short-time Fourier transform

    Science.gov (United States)

    Berrian, Alexander; Saito, Naoki

    2017-08-01

    In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.

  6. Short generators without quantum computers : the case of multiquadratics

    NARCIS (Netherlands)

    Bauch, J.; Bernstein, D.J.; de Valence, H.; Lange, T.; van Vredendaal, C.; Coron, J.-S.; Nielsen, J.B.

    2017-01-01

    Finding a short element g of a number field, given the ideal generated by g, is a classic problem in computational algebraic number theory. Solving this problem recovers the private key in cryptosystems introduced by Gentry, Smart–Vercauteren, Gentry–Halevi, Garg–Gentry–Halevi, et al. Work over the

  7. Decision time horizon for music genre classification using short time features

    DEFF Research Database (Denmark)

    Ahrendt, Peter; Meng, Anders; Larsen, Jan

    2004-01-01

    In this paper music genre classification has been explored with special emphasis on the decision time horizon and ranking of tapped-delay-line short-time features. Late information fusion, e.g. majority voting, is compared with techniques of early information fusion such as dynamic PCA (DPCA). The most frequently suggested features in the literature were employed, including mel-frequency cepstral coefficients (MFCC), linear prediction coefficients (LPC), zero-crossing rate (ZCR), and MPEG-7 features. To rank the importance of the short time features, consensus sensitivity analysis is applied...

  8. Computational modeling of ultra-short-pulse ablation of enamel

    Energy Technology Data Exchange (ETDEWEB)

    London, R.A.; Bailey, D.S.; Young, D.A. [and others]

    1996-02-29

    A computational model for the ablation of tooth enamel by ultra-short laser pulses is presented. The role of simulations using this model in designing and understanding laser drilling systems is discussed. Pulses of duration 300 fs and intensity greater than 10¹² W/cm² are considered. Laser absorption proceeds via a multi-photon-initiated plasma mechanism. The hydrodynamic response is calculated with a finite difference method, using an equation of state constructed from thermodynamic functions including electronic, ion motion, and chemical binding terms. Results for the ablation efficiency are presented. An analytic model describing the ablation threshold and ablation depth is presented. Thermal coupling to the remaining tissue and long-time thermal conduction are calculated. Simulation results are compared to experimental measurements of the ablation efficiency. Desired improvements in the model are presented.

  9. Computations of concentration of radon and its decay products against time. Computer program; Obliczanie koncentracji radonu i jego produktow rozpadu w funkcji czasu. Program komputerowy

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    1996-12-31

    This research is aimed at developing a device for continuous monitoring of radon in the air by measuring the alpha activity of radon and its short-lived decay products. The variation of the alpha activity of radon and its daughters influences the measured results, so this variation with time must be known. A computer program performing the computations from the known decay relations was therefore developed in the Turbo Pascal language, adapted for IBM PC computers. The program enables computation of the activity of ²²²Rn and its daughter products ²¹⁸Po, ²¹⁴Pb, ²¹⁴Bi and ²¹⁴Po every 1 min within the period of 0-255 min, for any state of radiation equilibrium between the radon and its daughter products. The program also permits computation of the alpha activity of ²²²Rn + ²¹⁸Po + ²¹⁴Po against time, and of the total alpha activity over a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and can be turned into diagrams by a graphics program, e.g. DrawPerfect. Equations employed for the computation of the alpha activity of radon and its decay products, as well as a description of the program functions, are given. (author). 2 refs, 4 figs.
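
The "known decay relations" for a pure-parent chain are the Bateman equations. A compact sketch, using approximate textbook half-lives and treating Po-214 (164 µs half-life) as in secular equilibrium with Bi-214 — assumptions of this illustration, not details taken from the COMB-style program:

```python
import numpy as np

# Decay constants (1/min) from approximate half-lives:
# Rn-222: 3.8235 d, Po-218: 3.05 min, Pb-214: 26.8 min, Bi-214: 19.9 min.
half_lives = np.array([3.8235 * 24 * 60, 3.05, 26.8, 19.9])
lam = np.log(2) / half_lives

def bateman_activity(t, lam, n1_0=1.0):
    """Activities lam_i * N_i(t) of a chain starting from pure parent (Bateman solution)."""
    acts = []
    for n in range(1, len(lam) + 1):
        coef = n1_0 * np.prod(lam[:n - 1])
        total = 0.0
        for i in range(n):
            denom = np.prod([lam[j] - lam[i] for j in range(n) if j != i])
            total += np.exp(-lam[i] * t) / denom
        acts.append(lam[n - 1] * coef * total)
    return np.array(acts)

# Starting from pure radon, the daughters grow in toward equilibrium:
a_start = bateman_activity(0.0, lam)    # only the parent is active at t = 0
a_3h = bateman_activity(180.0, lam)     # after 3 h the chain is near equilibrium
```

Evaluating this every minute over 0-255 min reproduces the kind of table the program writes to disk.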

  10. Short-time quantum propagator and Bohmian trajectories

    Science.gov (United States)

    de Gosson, Maurice; Hiley, Basil

    2013-12-01

    We begin by giving correct expressions for the short-time action following the work of Makri and Miller. We use these estimates to derive an accurate expression modulo Δt² for the quantum propagator and we show that the quantum potential is negligible modulo Δt² for a point source, thus justifying an unfortunately largely ignored observation of Holland made twenty years ago. We finally prove that this implies that the quantum motion is classical for very short times.
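
The "correct expressions" are, in the Makri-Miller form, built from the potential averaged along the straight-line path between the endpoints rather than sampled at one endpoint; schematically, for one degree of freedom of mass m (a standard rendering of that result, not a formula quoted from this paper):

```latex
S(x, x'; \Delta t) \;\approx\; \frac{m\,(x - x')^{2}}{2\,\Delta t}
\;-\; \Delta t \int_{0}^{1} V\!\bigl(x' + s\,(x - x')\bigr)\,\mathrm{d}s
```

Inserting this averaged-potential action into the semiclassical propagator is what yields accuracy modulo Δt², the order at which the abstract states the quantum potential is negligible for a point source.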

  11. Short-time quantum propagator and Bohmian trajectories

    International Nuclear Information System (INIS)

    Gosson, Maurice de; Hiley, Basil

    2013-01-01

We begin by giving correct expressions for the short-time action following the work of Makri and Miller. We use these estimates to derive an accurate expression modulo Δt² for the quantum propagator and we show that the quantum potential is negligible modulo Δt² for a point source, thus justifying an unfortunately largely ignored observation made by Holland twenty years ago. We finally prove that this implies that the quantum motion is classical for very short times.

  12. Short-time quantum propagator and Bohmian trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Gosson, Maurice de, E-mail: maurice.degosson@gmail.com [Universität Wien, Fakultät für Mathematik, NuHAG, Wien 1090 (Austria); Hiley, Basil [University of London, Birkbeck College, Theoretical Physics Unit, London WC1E 7HX (United Kingdom)

    2013-12-06

We begin by giving correct expressions for the short-time action following the work of Makri and Miller. We use these estimates to derive an accurate expression modulo Δt{sup 2} for the quantum propagator and we show that the quantum potential is negligible modulo Δt{sup 2} for a point source, thus justifying an unfortunately largely ignored observation made by Holland twenty years ago. We finally prove that this implies that the quantum motion is classical for very short times.

  13. Optimization of irradiation decay and counting times in nuclear activation analysis using short-lived nuclides

    International Nuclear Information System (INIS)

    Bjoernstad, T.

This work describes a method and outlines a procedure for optimization of an activation analysis with respect to the experimental times: irradiation time t_i, decay time, and counting time. The method is based on the 'minimum relative standard deviation criterion' and is specially designed for use with short-lived nuclides. A computer program, COMB1, written in the BASIC language, makes the calculations easier and faster. It is intended to be understandable and easily applicable on a computer of modest size. Time and cost are important factors, especially for routine analysis on a service basis. In such cases one can often allow a controlled reduction in the analysis quality (through a higher relative standard deviation). The procedure outlined can therefore help find acceptable conditions by calculating the 'best practical' (or reasonable) experimental time values and the minimum number of accumulation cycles necessary to fulfil the given requirements. (Auth.)

  14. Performance evaluation of the short-time objective intelligibility measure with different band importance functions

    DEFF Research Database (Denmark)

    Heidemann Andersen, Asger; de Haan, Jan Mark; Tan, Zheng-Hua

...... with a filter bank, 2) envelopes are extracted from each band, 3) the temporal correlation between clean and degraded envelopes is computed in short time segments, and 4) the correlation is averaged across time and frequency bands to obtain the final output. An unusual choice in the design of the STOI measure...... performance measures: root-mean-squared-error, Pearson correlation, and Kendall rank correlation. The results show substantially improved performance when fitting and evaluating on the same dataset. However, this advantage does not necessarily subsist when fitting and evaluating on different datasets. When......
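Steps 3 and 4 of the summarised procedure reduce to averaging short-segment envelope correlations. A simplified stand-in, not the full STOI measure (the one-third-octave filter bank, clipping and normalisation are omitted; array shapes are band × frame and the function name is illustrative):

```python
import numpy as np

def envelope_correlation_score(clean_env, degraded_env, seg_len=30):
    """Average short-segment Pearson correlation between clean and
    degraded band envelopes (band x frame arrays)."""
    scores = []
    for band in range(clean_env.shape[0]):
        for s in range(0, clean_env.shape[1] - seg_len + 1, seg_len):
            x = clean_env[band, s:s + seg_len]
            y = degraded_env[band, s:s + seg_len]
            x = x - x.mean()
            y = y - y.mean()
            denom = np.linalg.norm(x) * np.linalg.norm(y)
            scores.append(float(x @ y / denom) if denom > 0 else 0.0)
    return float(np.mean(scores))
```

Identical envelopes score 1; uncorrelated degradation pulls the average toward 0.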

  15. Short irradiation time characteristics of the inverter type X-ray generator

    International Nuclear Information System (INIS)

    Miyazaki, Shigeru; Hara, Takamitu; Matutani, Kazuo; Saito, Kazuhiko.

    1994-01-01

    The linearity of the X-ray output is an important factor in radiography. It is a composite of the linearities of the X-ray tube voltage, the X-ray tube current, and the exposure time. This paper focuses on the linearity of exposure time. Non-linearity of the X-ray output for short-time exposure became a problem when the three-phase X-ray generator was introduced. This paper describes the inverter-type X-ray generator, which is expected to become predominant in the future. Previously, we investigated X-ray output linearity during short-time exposure using the technique of dynamic study. In this paper, we describe the application of a digital memory and a personal computer to further investigation. The non-linearity of the X-ray output was caused by irregular waveforms of the X-ray tube voltage found at the rise time and the fall time. When the rise time was about 0.6 ms, the non-linearity was about 2%, which is negligibly small. The non-linearity due to the fall time of the X-ray tube varied greatly according to the X-ray tube current. For the minimum irradiation time of 1 ms, 4% to 27% of the non-linearity was attributable to the fall time. The main cause was the stray capacitance of the X-ray high-voltage cables. When the X-ray tube current exceeded 400 mA, the rise time was almost equal to the fall time, and the problem did not occur. Consequently, the ideal generator should have a fall time which is equal to the rise time of the X-ray tube voltage. Strictly speaking, such a generator should have rectangular waveforms. (author)

  16. Polynomial Phase Estimation Based on Adaptive Short-Time Fourier Transform.

    Science.gov (United States)

    Jing, Fulong; Zhang, Chunjie; Si, Weijian; Wang, Yu; Jiao, Shuhong

    2018-02-13

Polynomial phase signals (PPSs) have numerous applications in many fields including radar, sonar, geophysics, and radio communication systems. Therefore, estimation of PPS coefficients is very important. In this paper, a novel approach for PPS parameter estimation based on an adaptive short-time Fourier transform (ASTFT), called the PPS-ASTFT estimator, is proposed. With the PPS-ASTFT estimator, the one-dimensional and multi-dimensional searches and the error propagation problems that are widespread in the PPS field are avoided. In the proposed algorithm, the instantaneous frequency (IF) is estimated by the S-transform (ST), which can preserve information on signal phase and provide a variable resolution similar to the wavelet transform (WT). The width of the ASTFT analysis window is equal to the local stationary length, which is measured by the instantaneous frequency gradient (IFG). The IFG is calculated by principal component analysis (PCA), which is robust to noise. Moreover, to improve estimation accuracy, a refinement strategy is presented to estimate signal parameters. Since the PPS-ASTFT avoids parameter search, the proposed algorithm can be computed in a reasonable amount of time. The estimation performance, computational cost, and implementation of the PPS-ASTFT are also analyzed. The conducted numerical simulations support our theoretical results and demonstrate an excellent statistical performance of the proposed algorithm.
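As a baseline for what the adaptive window improves on, a fixed-window STFT peak-picking IF estimator can be sketched as follows (illustrative only: the window length and hop are arbitrary choices, and the paper's S-transform/PCA machinery is not reproduced):

```python
import numpy as np

def stft_if_estimate(signal, fs, win_len):
    """Coarse instantaneous-frequency track: the peak of a short-time FFT
    in each fixed-length window."""
    hop = win_len // 2
    window = np.hanning(win_len)
    freqs = np.fft.rfftfreq(win_len, 1.0 / fs)
    track = []
    for start in range(0, len(signal) - win_len, hop):
        seg = signal[start:start + win_len] * window
        track.append(freqs[np.argmax(np.abs(np.fft.rfft(seg)))])
    return np.array(track)
```

On a 50→150 Hz linear chirp this track rises across the sweep, but its resolution is limited to fs/win_len for every window — exactly the trade-off that motivates adapting the window to the local stationary length.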

  17. Quantum Dynamics with Short-Time Trajectories and Minimal Adaptive Basis Sets.

    Science.gov (United States)

    Saller, Maximilian A C; Habershon, Scott

    2017-07-11

    Methods for solving the time-dependent Schrödinger equation via basis set expansion of the wave function can generally be categorized as having either static (time-independent) or dynamic (time-dependent) basis functions. We have recently introduced an alternative simulation approach which represents a middle road between these two extremes, employing dynamic (classical-like) trajectories to create a static basis set of Gaussian wavepackets in regions of phase-space relevant to future propagation of the wave function [J. Chem. Theory Comput., 11, 8 (2015)]. Here, we propose and test a modification of our methodology which aims to reduce the size of basis sets generated in our original scheme. In particular, we employ short-time classical trajectories to continuously generate new basis functions for short-time quantum propagation of the wave function; to avoid the continued growth of the basis set describing the time-dependent wave function, we employ Matching Pursuit to periodically minimize the number of basis functions required to accurately describe the wave function. Overall, this approach generates a basis set which is adapted to evolution of the wave function while also being as small as possible. In applications to challenging benchmark problems, namely a 4-dimensional model of photoexcited pyrazine and three different double-well tunnelling problems, we find that our new scheme enables accurate wave function propagation with basis sets which are around an order-of-magnitude smaller than our original trajectory-guided basis set methodology, highlighting the benefits of adaptive strategies for wave function propagation.
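The Matching Pursuit step used to prune the basis can be illustrated generically (a textbook sketch on plain vectors, not the authors' Gaussian-wavepacket implementation; dictionary columns are assumed unit-norm):

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy Matching Pursuit: repeatedly pick the dictionary column most
    correlated with the residual and subtract its contribution."""
    residual = np.asarray(signal, dtype=float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        corr = dictionary.T @ residual
        k = int(np.argmax(np.abs(corr)))
        coeffs[k] += corr[k]
        residual -= corr[k] * dictionary[:, k]
    return coeffs, residual
```

In the basis-pruning setting, the atoms selected before the residual norm drops below a tolerance form the minimal set retained for the next propagation step.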

  18. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1102.27 Section 1102.27 Banks... for Proceedings § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...
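The general counting rule translates directly to code. A minimal sketch (weekends only; the regulation's holiday provisions are omitted, and the function name is illustrative):

```python
from datetime import date, timedelta

def deadline(act_date, period_days):
    """Filing deadline under the general rule: the day of the act is
    excluded, the last day is included, and a last day falling on a
    Saturday or Sunday rolls forward to the next business day."""
    end = act_date + timedelta(days=period_days)
    while end.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        end += timedelta(days=1)
    return end
```

For example, a 7-day period starting Friday 2010-01-01 ends the following Friday, while an 8-day period lands on a Saturday and rolls to Monday.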

  19. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Computing time. 622.21 Section 622.21 Banks and... Formal Hearings § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...

  20. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric power system. The program uses the symmetrical components method to compute all phase and sequence quantities for any bus or branch of a given power network ...
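The symmetrical-components method the program relies on is the Fortescue transform. A minimal sketch of recovering sequence quantities from phase phasors (illustrative only; the full program additionally builds network matrices and fault models):

```python
import numpy as np

# Fortescue operator: unit phasor at 120 degrees
a = np.exp(2j * np.pi / 3)

# Synthesis matrix: [Va, Vb, Vc]^T = A @ [V0, V1, V2]^T
A = np.array([[1, 1,      1],
              [1, a**2,   a],
              [1, a,      a**2]], dtype=complex)

def sequence_components(va, vb, vc):
    """Zero-, positive- and negative-sequence phasors from phase phasors."""
    return np.linalg.solve(A, np.array([va, vb, vc], dtype=complex))
```

A balanced positive-sequence set maps to a pure positive-sequence component, while unbalanced faults produce nonzero zero- and negative-sequence terms.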

  1. Improving Music Genre Classification by Short-Time Feature Integration

    DEFF Research Database (Denmark)

    Meng, Anders; Ahrendt, Peter; Larsen, Jan

    2005-01-01

Many different short-time features, using time windows in the size of 10-30 ms, have been proposed for music segmentation, retrieval and genre classification. However, often the available time frame of the music to make the actual decision or comparison (the decision time horizon) is in the range of seconds instead of milliseconds. The problem of making new features on the larger time scale from the short-time features (feature integration) has only received little attention. This paper investigates different methods for feature integration and late information fusion for music genre classification......
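The simplest form of feature integration is stacking per-window means and variances of the short-time features. A hypothetical sketch (the function name and array shapes are illustrative, not from the paper):

```python
import numpy as np

def mean_var_integration(features, frames_per_window):
    """Integrate short-time features (shape: dim x frames) into one
    mean+variance vector per longer window (shape: 2*dim x windows)."""
    d, n = features.shape
    n_windows = n // frames_per_window
    out = np.empty((2 * d, n_windows))
    for w in range(n_windows):
        block = features[:, w * frames_per_window:(w + 1) * frames_per_window]
        out[:d, w] = block.mean(axis=1)
        out[d:, w] = block.var(axis=1)
    return out
```

With 10 ms frames and one-second windows, `frames_per_window=100` turns millisecond-scale features into decision-horizon-scale ones.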

  2. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Anthony B., E-mail: acosta@northwestern.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Green, Jason R., E-mail: jason.green@umb.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Department of Chemistry, University of Massachusetts Boston, Boston, MA 02125 (United States)

    2013-08-01

Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N{sup 2} (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using ScaLAPACK. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.

  3. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    International Nuclear Information System (INIS)

    Costa, Anthony B.; Green, Jason R.

    2013-01-01

Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N 2 (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using ScaLAPACK. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.
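The Gram–Schmidt/QR procedure at the core of these computations can be shown on a toy system: a Benettin-style sketch for the 2-D Hénon map rather than a Lennard–Jones fluid (literature values of the exponents for a=1.4, b=0.3 are roughly +0.42 and -1.62):

```python
import numpy as np

def lyapunov_spectrum(n_iter=20000, discard=100):
    """QR-reorthonormalized tangent-space iteration for the Henon map:
    the log-diagonals of R accumulate the Lyapunov exponents."""
    a, b = 1.4, 0.3
    x, y = 0.1, 0.1
    Q = np.eye(2)
    sums = np.zeros(2)
    for i in range(n_iter + discard):
        J = np.array([[-2.0 * a * x, 1.0],
                      [b,            0.0]])   # Jacobian at the current point
        x, y = 1.0 - a * x * x + y, b * x     # advance the map
        Q, R = np.linalg.qr(J @ Q)            # re-orthonormalize tangent vectors
        if i >= discard:
            sums += np.log(np.abs(np.diag(R)))
    return sums / n_iter
```

Because |det J| = b at every step, the exponents must sum to ln b, which is a convenient consistency check on the QR bookkeeping.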

  4. Computation of short-time diffusion using the particle simulation method

    International Nuclear Information System (INIS)

    Janicke, L.

    1983-01-01

The method of particle simulation allows a correct description of turbulent diffusion even in areas near the source and the computation of overall average values (expected values). The model is suitable for dealing with complex situations. It is derived from the K-model, which describes the dispersion of noxious matter using the diffusion equation. (DG) [de
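The correspondence between particle simulation and the K-model can be demonstrated with a one-dimensional random-walk sketch (illustrative, with a constant diffusivity K; real dispersion applications use spatially varying K plus advection):

```python
import math
import random

def simulate(n=5000, steps=100, dt=0.01, K=1.0, seed=1):
    """1-D particle simulation of diffusion: each particle takes Gaussian
    steps of variance 2*K*dt, so the ensemble variance grows as 2*K*t,
    matching the diffusion (K-model) equation."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * K * dt)
    xs = [0.0] * n
    for _ in range(steps):
        xs = [x + rng.gauss(0.0, sigma) for x in xs]
    return xs
```

Near a point source the particle ensemble stays well defined even where gradient-based closures struggle, which is the advantage the abstract emphasizes.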

  5. A Distributed Web-based Solution for Ionospheric Model Real-time Management, Monitoring, and Short-term Prediction

    Science.gov (United States)

    Kulchitsky, A.; Maurits, S.; Watkins, B.

    2006-12-01

provide inputs for the next ionospheric model time step and are then stored in a MySQL database as the first part of the time-specific record. The RMM then performs synchronization of the input times with the current model time, prepares a decision on initialization for the next model time step, and monitors its execution. Then, as soon as the model completes computations for the next time step, the RMM visualizes the current model output into various short-term (about 1-2 hours) forecasting products and compares prior results with available ionospheric measurements. The RMM places prepared images into the MySQL database, which can be located on a different computer node, and then proceeds to the next time interval, continuing the time-loop. The upper-level interface of this real-time system is a PHP-based Web site (http://www.arsc.edu/SpaceWeather/new). This site provides general information about the Earth's polar and adjacent mid-latitude ionosphere, allows for monitoring of the current developments and short-term forecasts, and facilitates access to the comparisons archive stored in the database.

  6. Short recovery time NMR probe

    International Nuclear Information System (INIS)

    Ramia, M.E.; Martin, C.A.; Jeandrevin, S.

    2011-01-01

An NMR probe for low frequency and short recovery time is presented in this work. The probe contains the tuning circuit, diode expanders and quarter-wavelength networks to protect the receiver from both the amplifier noise and the coil ringing following the transmitter power pulse. It also possesses a coil damper activated by passive components. The probe performance shows a recovery time of about 15 μs, an appreciable reduction of the Q factor, and an increase of the signal-to-noise ratio of about 68% during reception at a working frequency of 2 MHz. (author)

  7. Job quality of short-time workers and perception and support from their managers

    OpenAIRE

    坂爪, 洋美

    2017-01-01

The purpose of this study was to clarify the relationship between the characteristics of the job quality of short-time workers and the perception and support of managers whose members have used a short-time working hour system. A total of 559 first-line managers who have a member using a short-time working hour system completed a web-based survey assessing the job quality of short-time workers, the risk of using a short-time working hour system, the career perspective of short-time workers, and the suppo...

  8. 12 CFR 908.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 908.27 Section 908.27 Banks and... PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.27 Computing time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event...

  9. Real-time energy resources scheduling considering short-term and very short-term wind forecast

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Marco; Sousa, Tiago; Morais, Hugo; Vale, Zita [Polytechnic of Porto (Portugal). GECAD - Knowledge Engineering and Decision Support Research Center

    2012-07-01

This paper proposes an energy resources management methodology based on three distinct time horizons: day-ahead scheduling, hour-ahead scheduling, and real-time scheduling. Each scheduling process uses updated generation and consumption operating conditions and the status of stationary and electric-vehicle storage. Besides the new operating conditions, the most accurate forecasts of wind generation and consumption, obtained with short-term and very short-term methods, are used. A case study considering a distribution network with intensive use of distributed generation and electric vehicles is presented. (orig.)

  10. Short echo time, fast gradient-echo imaging

    International Nuclear Information System (INIS)

    Haacke, E.M.; Lenz, G.W.

    1987-01-01

Present fast gradient-echo schemes can acquire volume data rapidly and are flexible in T1 or T1/T2 contrast behavior. However, sequences used to date employ echo time (TE) values of about 15 ± 5 ms and, because of in vivo field inhomogeneities (short T2), they suffer badly from signal loss near sinuses and tissue boundaries. The authors implemented sequences with TE = 4-6 ms and found significant improvement in image quality, especially at high fields. Examples with long TEs vs. short TEs are given in the knee, spine, head, and orbits. Further advantages include (1) faster repetition times (15 ms), (2) higher-quality spin-density or T1-weighted images, and (3) reduction of blood motion artifacts.

  11. Transient nanobubbles in short-time electrolysis

    NARCIS (Netherlands)

    Svetovoy, Vitaly; Sanders, Remco G.P.; Elwenspoek, Michael Curt

    2013-01-01

    Water electrolysis in a microsystem is observed and analyzed on a short-time scale of ∼10 μs. The very unusual properties of the process are stressed. An extremely high current density is observed because the process is not limited by the diffusion of electroactive species. The high current is

  12. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    Science.gov (United States)

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  13. Short-term effects of implemented high intensity shoulder elevation during computer work

    DEFF Research Database (Denmark)

    Larsen, Mette K.; Samani, Afshin; Madeleine, Pascal

    2009-01-01

BACKGROUND: Work-site strength training sessions are shown effective to prevent and reduce neck-shoulder pain in computer workers, but difficult to integrate in normal working routines. A solution for avoiding neck-shoulder pain during computer work may be to implement high intensity voluntary contractions during the computer work. However, it is unknown how this may influence productivity, rate of perceived exertion (RPE) as well as activity and rest of neck-shoulder muscles during computer work. The aim of this study was to investigate short-term effects of a high intensity contraction...... computer work to prevent neck-shoulder pain may be possible without affecting the working routines. However, the unexpected reduction in clavicular trapezius rest during a pause with preceding high intensity contraction requires further investigation before high intensity shoulder elevations can......

  14. 12 CFR 1780.11 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1780.11 Section 1780.11 Banks... time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event that commences the designated period of time is not included. The last day so...

  15. Temporal Prediction Errors Affect Short-Term Memory Scanning Response Time.

    Science.gov (United States)

    Limongi, Roberto; Silva, Angélica M

    2016-11-01

    The Sternberg short-term memory scanning task has been used to unveil cognitive operations involved in time perception. Participants produce time intervals during the task, and the researcher explores how task performance affects interval production - where time estimation error is the dependent variable of interest. The perspective of predictive behavior regards time estimation error as a temporal prediction error (PE), an independent variable that controls cognition, behavior, and learning. Based on this perspective, we investigated whether temporal PEs affect short-term memory scanning. Participants performed temporal predictions while they maintained information in memory. Model inference revealed that PEs affected memory scanning response time independently of the memory-set size effect. We discuss the results within the context of formal and mechanistic models of short-term memory scanning and predictive coding, a Bayes-based theory of brain function. We state the hypothesis that our finding could be associated with weak frontostriatal connections and weak striatal activity.

  16. 6 CFR 13.27 - Computation of time.

    Science.gov (United States)

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Computation of time. 13.27 Section 13.27 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY PROGRAM FRAUD CIVIL REMEDIES § 13.27 Computation of time. (a) In computing any period of time under this part or in an order issued...

  17. Short-time, high-dosage penicillin infusion therapy of syphilis

    DEFF Research Database (Denmark)

    Lomholt, Hans; Poulsen, Asmus; Brandrup, Flemming

    2003-01-01

The optimal dosage and duration of penicillin treatment for the various stages of syphilis are not known. We present data on 20 patients with syphilis (primary, secondary or latent) treated with high-dose, short-time penicillin infusion therapy. Patients were given 10 MIU of penicillin G intraven......

  18. Noise-constrained switching times for heteroclinic computing

    Science.gov (United States)

    Neves, Fabio Schittler; Voit, Maximilian; Timme, Marc

    2017-03-01

    Heteroclinic computing offers a novel paradigm for universal computation by collective system dynamics. In such a paradigm, input signals are encoded as complex periodic orbits approaching specific sequences of saddle states. Without inputs, the relevant states together with the heteroclinic connections between them form a network of states—the heteroclinic network. Systems of pulse-coupled oscillators or spiking neurons naturally exhibit such heteroclinic networks of saddles, thereby providing a substrate for general analog computations. Several challenges need to be resolved before it becomes possible to effectively realize heteroclinic computing in hardware. The time scales on which computations are performed crucially depend on the switching times between saddles, which in turn are jointly controlled by the system's intrinsic dynamics and the level of external and measurement noise. The nonlinear dynamics of pulse-coupled systems often strongly deviate from that of time-continuously coupled (e.g., phase-coupled) systems. The factors impacting switching times in pulse-coupled systems are still not well understood. Here we systematically investigate switching times in dependence of the levels of noise and intrinsic dissipation in the system. We specifically reveal how local responses to pulses coact with external noise. Our findings confirm that, like in time-continuous phase-coupled systems, piecewise-continuous pulse-coupled systems exhibit switching times that transiently increase exponentially with the number of switches up to some order of magnitude set by the noise level. Complementarily, we show that switching times may constitute a good predictor for the computation reliability, indicating how often an input signal must be reiterated. By characterizing switching times between two saddles in conjunction with the reliability of a computation, our results provide a first step beyond the coding of input signal identities toward a complementary coding for

  19. Effects of computing time delay on real-time control systems

    Science.gov (United States)

    Shin, Kang G.; Cui, Xianzhong

    1988-01-01

    The reliability of a real-time digital control system depends not only on the reliability of the hardware and software used, but also on the speed in executing control algorithms. The latter is due to the negative effects of computing time delay on control system performance. For a given sampling interval, the effects of computing time delay are classified into the delay problem and the loss problem. Analysis of these two problems is presented as a means of evaluating real-time control systems. As an example, both the self-tuning predicted (STP) control and Proportional-Integral-Derivative (PID) control are applied to the problem of tracking robot trajectories, and their respective effects of computing time delay on control performance are comparatively evaluated. For this example, the STP (PID) controller is shown to outperform the PID (STP) controller in coping with the delay (loss) problem.

  20. Real-time exposure fusion on a mobile computer

    CSIR Research Space (South Africa)

    Bachoo, AK

    2009-12-01

    Full Text Available information in these scenarios. An image captured using a short exposure time will not saturate bright image re- gions while an image captured with a long exposure time will show more detail in the dark regions. The pixel depth provided by most camera.... The auto exposure also creates strong blown-out highlights in the foreground (the grass patch). The short shutter time (Exposure 1) correctly exposes the grass while the long shutter time (Exposure 3) is able to correctly expose the camouflaged dummy...

  1. A short history of fractal-Cantorian space-time

    International Nuclear Information System (INIS)

    Marek-Crnjac, L.

    2009-01-01

The article attempts to give a short historical overview of the discovery of fractal-Cantorian space-time, starting from the 17th century up to the present. In the last 25 years a great number of scientists have worked on fractal space-time, notably Garnet Ord in Canada, Laurent Nottale in France and Mohamed El Naschie in England, who gave an exact mathematical procedure for the derivation of the dimensionality and curvature of the fractal space-time fuzzy manifold.

  2. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lent, Wineke A.M. van, E-mail: w.v.lent@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands); Deetman, Joost W., E-mail: j.deetman@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Teertstra, H. Jelle, E-mail: h.teertstra@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Muller, Sara H., E-mail: s.muller@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Hans, Erwin W., E-mail: e.w.hans@utwente.nl [University of Twente, School of Management and Governance, Dept. of Industrial Engineering and Business Intelligence Systems, Enschede (Netherlands); Harten, Wim H. van, E-mail: w.v.harten@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands)

    2012-11-15

Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case study hospital, three scenarios were evaluated by computer simulation on access time, overtime and idle time of the CT; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while the utilization remained equal at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  3. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    International Nuclear Information System (INIS)

    Lent, Wineke A.M. van; Deetman, Joost W.; Teertstra, H. Jelle; Muller, Sara H.; Hans, Erwin W.; Harten, Wim H. van

    2012-01-01

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (the diagnostic track), while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case-study hospital, three scenarios were evaluated by computer simulation on CT access time, overtime and idle time; after implementation, these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity allocated to the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After implementation of the changes, the average diagnostic track duration was 12.6 days, with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44% (a relative improvement of 52%), while utilization remained equal at 82%; idle time increased by 11% and overtime decreased by 82%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased awareness that optimizing capacity allocation can reduce access times.
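
    The capacity-allocation question studied above lends itself to a toy queueing simulation. The sketch below is not the authors' model: the arrival rate, slot counts and horizon are made-up illustration values, and the scanner is reduced to a single FIFO queue with a fixed number of slots per operating day.

    ```python
    import math
    import random

    def simulate_ct_queue(arrival_rate, daily_slots, days=2000, seed=1):
        """Toy discrete-event simulation of CT access time.

        Each day a Poisson number of requests arrives; the scanner serves
        up to `daily_slots` requests per day in FIFO order.  Access time
        is the number of days between request and scan.
        """
        rng = random.Random(seed)
        queue = []          # arrival days of pending requests, FIFO
        waits = []
        target = math.exp(-arrival_rate)
        for day in range(days):
            # Poisson arrivals via Knuth's multiplication method
            n, p = 0, 1.0
            while True:
                p *= rng.random()
                if p <= target:
                    break
                n += 1
            queue.extend([day] * n)
            for _ in range(min(daily_slots, len(queue))):
                waits.append(day - queue.pop(0))
        return sum(waits) / len(waits)

    slow = simulate_ct_queue(arrival_rate=9.5, daily_slots=10)
    fast = simulate_ct_queue(arrival_rate=9.5, daily_slots=12)
    print(slow, fast)  # extra daily capacity shortens the average access time
    ```

    Even this crude model reproduces the qualitative finding: near full utilization, a modest capacity change moves the access time disproportionately.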

  4. A short-time fading study of Al2O3:C

    International Nuclear Information System (INIS)

    Nascimento, L.F.; Vanhavere, F.; Silva, E.H.; Deene, Y. De

    2015-01-01

    This paper studies the short-time fading of Al2O3:C by measuring optically stimulated luminescence (OSL) signals (total OSL: T_OSL, and peak OSL: P_OSL) from droplets and Luxel™ pellets. The influence of various bleaching regimes (blue, green and white light) and light powers is compared. The fading effect is the decay of the OSL signal in the dark at room temperature. Al2O3:C detectors were subjected to various bleaching regimes, irradiated with a reference dose and read out after different time spans. Investigations were carried out using 2 mm droplet detectors, made of fine Al2O3:C powder mixed with a photocured polymer, and compared to Luxel™-type detectors (Landauer Inc.). Short-time post-irradiation fading is present in the OSL results (T_OSL and P_OSL) of droplets for time spans up to 200 s. The effect of short-time fading can be reduced or removed by treating the detectors with high-power and/or long bleaching regimes; this result was observed in both T_OSL and P_OSL from droplets and Luxel™. - Highlights: • Droplets composed of fine Al2O3:C powder were prepared using a photo-curable polymer. • Powder grain sizes ranged from 5 μm to 35 μm. • Short-time fading was measured for irradiated samples. • Various bleaching regimes and light powers were tested. • Droplets were compared to a commercial dosimeter, Luxel™

  5. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  6. Fast algorithms for computing phylogenetic divergence time.

    Science.gov (United States)

    Crosby, Ralph W; Williams, Tiffani L

    2017-12-06

    The inference of species divergence times is a key step in most phylogenetic studies. Methods to perform the inference have been available for the last ten years, but their performance does not yet scale well to studies with hundreds of taxa and thousands of DNA base pairs. For example, a study of 349 primate taxa was estimated to require over 9 months of processing time. In this work, we present a new algorithm, AncestralAge, that significantly improves the performance of the divergence time process. As part of AncestralAge, we demonstrate a new method for the computation of phylogenetic likelihood, and our experiments show a 90% improvement in likelihood computation time on the aforementioned dataset of 349 primate taxa with over 60,000 DNA base pairs. Additionally, we show that our new method for the computation of the Bayesian prior on node ages reduces the running time of this computation on the 349-taxa dataset by 99%. Through the use of these new algorithms we open up the ability to perform divergence time inference on large phylogenetic studies.
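
    The abstract does not reproduce AncestralAge's internals, but the phylogenetic likelihood it accelerates is conventionally computed with Felsenstein's pruning recursion. Below is a minimal sketch for a two-leaf tree under the Jukes-Cantor substitution model; the branch lengths are arbitrary illustration values.

    ```python
    import math

    def jc_prob(t):
        """Jukes-Cantor transition matrix P(t) as a 4x4 nested list."""
        e = math.exp(-4.0 * t / 3.0)
        same, diff = 0.25 + 0.75 * e, 0.25 - 0.25 * e
        return [[same if i == j else diff for j in range(4)] for i in range(4)]

    def site_likelihood(obs_a, obs_b, t_a, t_b):
        """Likelihood of one alignment site on a two-leaf tree.

        obs_a/obs_b are nucleotide indices (0..3) at the two leaves; the
        root sits between them with branch lengths t_a and t_b.  This is
        Felsenstein's pruning recursion collapsed to a single root node,
        averaged over the uniform stationary distribution pi = 1/4.
        """
        Pa, Pb = jc_prob(t_a), jc_prob(t_b)
        return sum(0.25 * Pa[x][obs_a] * Pb[x][obs_b] for x in range(4))

    # likelihoods over all 16 possible site patterns must sum to 1
    total = sum(site_likelihood(a, b, 0.1, 0.3)
                for a in range(4) for b in range(4))
    print(round(total, 10))  # 1.0
    ```

    Summing over every possible site pattern yielding exactly 1 is a standard sanity check for a likelihood implementation.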

  7. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol
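
    The on-wire calculation at the heart of NTP combines four timestamps into a clock-offset and round-trip-delay estimate. A minimal sketch (the timestamp values are illustrative, in seconds):

    ```python
    def ntp_offset_delay(t1, t2, t3, t4):
        """Standard NTP on-wire calculation.

        t1: client transmit, t2: server receive, t3: server transmit,
        t4: client receive.  Returns (offset, delay): the estimated
        client clock offset and the round-trip network delay.
        """
        offset = ((t2 - t1) + (t3 - t4)) / 2.0
        delay = (t4 - t1) - (t3 - t2)
        return offset, delay

    # client clock 5 s behind the server; 0.2 s propagation each way
    off, dly = ntp_offset_delay(t1=100.0, t2=105.2, t3=105.3, t4=100.5)
    print(off, dly)  # 5.0 0.4
    ```

    The symmetric form cancels the one-way propagation delay from the offset estimate, which is why NTP can synchronize clocks far more tightly than the raw round-trip time would suggest.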

  8. Optimum short-time polynomial regression for signal analysis

    Indian Academy of Sciences (India)

    A Sreenivasa Murthy

    the Proceedings of the European Signal Processing Conference (EUSIPCO) 2008. … In a seminal paper, Savitzky and Golay [4] showed that short-time polynomial modeling is … We next consider a linearly frequency-modulated chirp with an exponentially … http://www.physionet.org/physiotools/matlab/ECGwaveGen/
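
    The snippet above centres on Savitzky-Golay-style short-time polynomial modeling: fit a low-order polynomial by least squares in a sliding window and evaluate it at the window centre. A minimal sketch (the window length and polynomial order are arbitrary choices):

    ```python
    import numpy as np

    def short_time_poly_smooth(x, window=11, order=2):
        """Savitzky-Golay-style smoothing: least-squares polynomial fit
        in a sliding window, evaluated at the window centre."""
        half = window // 2
        t = np.arange(-half, half + 1)        # window-local time axis
        y = np.empty(len(x) - 2 * half)
        for i in range(half, len(x) - half):
            coeffs = np.polyfit(t, x[i - half:i + half + 1], order)
            y[i - half] = np.polyval(coeffs, 0)   # value at window centre
        return y

    n = np.arange(100, dtype=float)
    signal = 0.5 * n**2 - 3.0 * n + 7.0           # exactly quadratic
    smoothed = short_time_poly_smooth(signal, window=11, order=2)
    # a degree-2 fit reproduces a quadratic signal exactly (interior points)
    print(np.allclose(smoothed, signal[5:-5]))    # True
    ```

    That a polynomial of degree up to the fit order passes through the smoother unchanged is the defining property Savitzky and Golay exploited, and it makes a convenient self-test.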

  9. Short Sleep Times Predict Obesity in Internal Medicine Clinic Patients

    Science.gov (United States)

    Buscemi, Dolores; Kumar, Ashwani; Nugent, Rebecca; Nugent, Kenneth

    2007-01-01

    Study Objectives: Epidemiological studies have demonstrated an association between short sleep times and obesity as defined by body mass index (BMI). We wanted to determine whether this association occurs in patients with chronic medical diagnoses since the number of confounding factors is likely higher in patients than the general population. Methods: Two hundred patients attending internal medicine clinics completed a survey regarding sleep habits, lifestyle characteristics, and medical diagnoses. An independent surveyor collected the information on the questionnaires and reviewed the medical records. Height and weight were measured by clinic personnel. Data were analyzed with multivariate logistic regression. Results: Subjects with short sleep times (< 7 hours) had an increased likelihood of obesity as defined by a BMI ≥ 30 kg/m2 when compared to the reference group of (8, 9] hours (odds ratio 2.93; 95% confidence interval, 1.06–8.09). There was a U-shaped relationship between obesity and sleep time in women but not in men. Young age (18 to 49 years), not smoking, drinking alcohol, hypertension, diabetes, and sleep apnea were also associated with obesity in the overall model. Conclusions: This study demonstrates an association between short sleep times and obesity in undifferentiated patients attending an internal medicine clinic using models adjusting for age, lifestyle characteristics, and some medical diagnoses. The U-shaped relationship in women suggests that sleep patterns may have gender specific associations. These observations provide the background for therapeutic trials in weight loss in patients with established medical problems. Citation: Buscemi D; Kumar A; Nugent R; Nugent K. Short sleep times predict obesity in internal medicine clinic patients. J Clin Sleep Med 2007;3(7):681–688. PMID:18198800

  10. Difference-based clustering of short time-course microarray data with replicates

    Directory of Open Access Journals (Sweden)

    Kim Jihoon

    2007-07-01

    Full Text Available Abstract Background There are some limitations associated with conventional clustering methods for short time-course gene expression data. The current algorithms require prior domain knowledge and do not incorporate information from replicates. Moreover, the results are not always easy to interpret biologically. Results We propose a novel algorithm for identifying a subset of genes sharing a significant temporal expression pattern when replicates are used. Our algorithm requires no prior knowledge, instead relying on an observed statistic based on the first- and second-order differences between adjacent time points. Here, a pattern is predefined as the sequence of symbols indicating the direction and rate of change between time points, and each gene is assigned to a cluster whose members share a similar pattern. We compared the performance of our algorithm to those of the K-means, Self-Organizing Map and Short Time-series Expression Miner methods. Conclusions Assessments using simulated and real data show that our method outperformed the aforementioned algorithms. Our approach is an appropriate solution for clustering short time-course microarray data with replicates.
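
    The exact statistic of the proposed algorithm is not given in the abstract, but its core idea, symbolizing differences between adjacent time points and grouping genes by the resulting pattern, can be sketched as follows. The threshold, the symbol alphabet and the toy expression values are illustrative assumptions, and only first-order differences are covered.

    ```python
    import numpy as np
    from collections import defaultdict

    def difference_pattern(profile, thresh=0.5):
        """Map a time-course profile to a symbol string using first-order
        differences: 'U' (up), 'D' (down) or 'F' (flat) per transition."""
        return ''.join('U' if v > thresh else 'D' if v < -thresh else 'F'
                       for v in np.diff(profile))

    def cluster_by_pattern(expr, thresh=0.5):
        """expr: dict gene -> (replicates x timepoints) array-like.
        Replicates are averaged before symbolization; genes sharing a
        pattern string fall into the same cluster."""
        clusters = defaultdict(list)
        for gene, reps in expr.items():
            mean_profile = np.asarray(reps, dtype=float).mean(axis=0)
            clusters[difference_pattern(mean_profile, thresh)].append(gene)
        return dict(clusters)

    expr = {
        "gA": [[0, 2, 4, 4], [0.2, 1.8, 4.1, 3.9]],   # up, up, flat
        "gB": [[0, 2, 4, 4], [0, 2.2, 3.9, 4.0]],
        "gC": [[4, 2, 0, 0], [3.8, 2.1, 0.2, 0.1]],   # down, down, flat
    }
    clusters = cluster_by_pattern(expr)
    print(clusters)  # {'UUF': ['gA', 'gB'], 'DDF': ['gC']}
    ```

    Because the pattern string is built from symbolized rates of change, the resulting clusters read directly as biological statements ("up twice, then flat"), which is the interpretability advantage the authors emphasize.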

  11. Real-time mobile customer short message system design and implementation

    Science.gov (United States)

    Han, Qirui; Sun, Fang

    To expand the current mobile phone short message service, and to make contact between schools, teachers and parents, and feedback in the modern school office system, more timely and convenient, we designed and developed a Short Message System based on the Linux platform. The state-of-the-art principles and design proposals of the Short Message System based on the Linux platform are introduced. Finally, we propose an optimized secure access authentication method. At present, many schools, businesses and research institutions are gradually adopting and promoting the messaging system, which has shown benign market prospects.

  12. 5 CFR 890.101 - Definitions; time computations.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Definitions; time computations. 890.101....101 Definitions; time computations. (a) In this part, the terms annuitant, carrier, employee, employee... in section 8901 of title 5, United States Code, and supplement the following definitions: Appropriate...

  13. X-ray testing for short-time dynamic applications

    International Nuclear Information System (INIS)

    Kurfiss, Malte; Moser, Stefan; Popko, Gregor; Nau, Siegfried

    2017-01-01

    Short-time dynamic processes pose new challenges for nondestructive testing. The application of X-ray flash tubes and modern high-speed cameras allows the observation of the opening of airbags or the energy absorption of compressed tubes as occurs during a vehicle crash. Special algorithms designed for computed tomography analyses allow 3D reconstruction at individual time points of the dynamic process. Possibilities and limitations of the current techniques are discussed.

  14. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed in a network connecting a parallel computing server and a client terminal. Using the system, a user at a client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation while the computation is actually running on the server. Using a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. The amount of data sent from the parallel computer to the client is therefore so much smaller than without compression that the user can comfortably enjoy swift image updates. Parallelization of image data generation is based on the Owner Computes Rule. The GUI on the client is built on a Java applet, so real-time visualization is possible on any client PC on which a Web browser is implemented. (author)
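
    The compress-before-sending design can be sketched with a generic codec. zlib stands in here for whatever compression the actual system used, and the frame content is synthetic; the point is only that a rendered frame with large uniform regions shrinks dramatically before crossing the network.

    ```python
    import zlib
    import numpy as np

    # server side: render a synthetic frame and compress it before sending
    frame = np.zeros((480, 640), dtype=np.uint8)   # mostly uniform image
    frame[100:200, 150:300] = 255                  # one bright region
    payload = zlib.compress(frame.tobytes(), level=6)

    # client side: decompress and restore the image for display
    restored = np.frombuffer(zlib.decompress(payload), dtype=np.uint8)
    restored = restored.reshape(frame.shape)

    ratio = len(payload) / frame.nbytes
    print(np.array_equal(frame, restored), ratio < 0.1)  # True True
    ```

    Rendering on the server and shipping compressed pixels, rather than raw field data, is what keeps the client-side bandwidth requirement small enough for interactive use.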

  15. 29 CFR 1921.22 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Computation of time. 1921.22 Section 1921.22 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... WORKERS' COMPENSATION ACT Miscellaneous § 1921.22 Computation of time. Sundays and holidays shall be...

  16. Extracting biologically significant patterns from short time series gene expression data

    Directory of Open Access Journals (Sweden)

    McGinnis Thomas

    2009-08-01

    Full Text Available Abstract Background Time series gene expression data analysis is used widely to study the dynamics of various cell processes. Most of the time series data available today consist of only a few time points, making the application of standard clustering techniques difficult. Results We developed two new algorithms that are capable of extracting biological patterns from short time-point series gene expression data. The two algorithms, ASTRO and MiMeSR, are inspired by the rank-order-preserving framework and the minimum mean squared residue approach, respectively. However, ASTRO and MiMeSR differ from previous approaches in that they take advantage of the relatively small number of time points in order to reduce the problem from NP-hard to linear. Tested on well-defined short time expression data, we found that our approaches are robust to noise, as well as to random patterns, and that they can correctly detect the temporal expression profile of relevant functional categories. Evaluation of our methods was performed using Gene Ontology (GO) annotations and chromatin immunoprecipitation (ChIP-chip) data. Conclusion Our approaches generally outperform both standard clustering algorithms and algorithms designed specifically for clustering of short time series gene expression data. Both algorithms are available at http://www.benoslab.pitt.edu/astro/.

  17. General purpose computers in real time

    International Nuclear Information System (INIS)

    Biel, J.R.

    1989-01-01

    I see three main trends in the use of general purpose computers in real time. The first is more processing power. The second is the use of higher speed interconnects between computers (allowing more data to be delivered to the processors). The third is the use of larger programs running in the computers. Although there is still work that needs to be done, I believe that all indications are that general purpose computers should be able to meet the online needs of the SSC and LHC machines. 2 figs

  18. Road Short-Term Travel Time Prediction Method Based on Flow Spatial Distribution and the Relations

    Directory of Open Access Journals (Sweden)

    Mingjun Deng

    2016-01-01

    Full Text Available There are many studies of short-term road travel time forecasting based on time series; however, road travel time depends not only on the historical travel time series but also on the historical flow of the road and its adjacent sections, and few studies have considered that. Based on the correlation between the spatial distribution of flow and the road travel time series, this paper applies nearest neighbor and nonparametric regression methods to build a forecasting model. For the spatial nearest neighbor search, three different space distances are defined. In addition, two forecasting functions are introduced: one combines the forecast values by mean weight, and the other uses the reciprocal of the nearest-neighbor distance as the combination weight. The three distances are applied in the nearest neighbor search for both forecasting functions. For the travel time series, nearest neighbor and nonparametric regression are applied as well. Minimizing the forecast error variance is then used as the objective to establish the combination model. The empirical results show that the combination model improves the forecast performance considerably. Besides, the experimental evaluation of the computational complexity shows that the proposed method can satisfy the real-time requirement.
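
    The second forecasting function, combining nearest neighbours' outcomes with reciprocal-distance weights, can be sketched as follows. The state vector, the toy history values and the plain Euclidean distance are illustrative assumptions; the paper itself defines three different space distances.

    ```python
    import numpy as np

    def knn_forecast(history, current, k=3, eps=1e-9):
        """Nearest-neighbour travel-time forecast.

        history: (n, d+1) array; each row is a past state vector (d
        features, e.g. travel time and adjacent-section flow) followed by
        the travel time observed one step later.  current: length-d state
        vector.  The forecast combines the k nearest neighbours' outcomes
        weighted by reciprocal distance.
        """
        X, y = history[:, :-1], history[:, -1]
        dist = np.linalg.norm(X - current, axis=1)
        idx = np.argsort(dist)[:k]
        w = 1.0 / (dist[idx] + eps)      # reciprocal-distance weights
        return float(np.dot(w, y[idx]) / w.sum())

    history = np.array([
        # [travel_time_t, upstream_flow_t, next_travel_time]
        [10.0, 300.0, 11.0],
        [12.0, 340.0, 13.0],
        [20.0, 800.0, 25.0],
        [21.0, 820.0, 26.0],
        [11.0, 310.0, 12.0],
    ])
    pred = knn_forecast(history, np.array([10.5, 305.0]), k=3)
    print(round(pred, 2))  # close to the low-flow outcomes (11-13 min)
    ```

    The reciprocal weighting lets the closest historical states dominate the forecast while still damping the noise of any single neighbour, which is the rationale for combining it with the mean-weight function in the paper's combination model.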

  19. Computational complexity of algorithms for sequence comparison, short-read assembly and genome alignment.

    Science.gov (United States)

    Baichoo, Shakuntala; Ouzounis, Christos A

    A multitude of algorithms for sequence comparison, short-read assembly and whole-genome alignment have been developed in the general context of molecular biology, to support technology development for high-throughput sequencing, numerous applications in genome biology and fundamental research on comparative genomics. The computational complexity of these algorithms has been previously reported in original research papers, yet this often neglected property has not been reviewed previously in a systematic manner and for a wider audience. We provide a review of space and time complexity of key sequence analysis algorithms and highlight their properties in a comprehensive manner, in order to identify potential opportunities for further research in algorithm or data structure optimization. The complexity aspect is poised to become pivotal as we will be facing challenges related to the continuous increase of genomic data on unprecedented scales and complexity in the foreseeable future, when robust biological simulation at the cell level and above becomes a reality. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Time-Predictable Computer Architecture

    Directory of Open Access Journals (Sweden)

    Schoeberl Martin

    2009-01-01

    Full Text Available Today's general-purpose processors are optimized for maximum throughput. Real-time systems need a processor with both a reasonable and a known worst-case execution time (WCET). Features such as pipelines with instruction dependencies, caches, branch prediction, and out-of-order execution complicate WCET analysis and lead to very conservative estimates. In this paper, we evaluate the issues of current architectures with respect to WCET analysis. Then, we propose solutions for a time-predictable computer architecture. The proposed architecture is evaluated by implementing some of its features in a Java processor. The resulting processor is a good target for WCET analysis and still performs well in the average case.

  1. Recent achievements in real-time computational seismology in Taiwan

    Science.gov (United States)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

    Real-time computational seismology is now achievable; it requires a tight connection between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and the moment tensor inversion (CMT) technique. A real-time online earthquake simulation service (ROS) is also ready to open to researchers and for public earthquake science education. Combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes after an earthquake occurs (RMT obtains the point-source information; ROS completes a 3D simulation). For more information, welcome to visit the real-time computational seismology earthquake report webpage (RCS).

  2. 7 CFR 1.603 - How are time periods computed?

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false How are time periods computed? 1.603 Section 1.603... Licenses General Provisions § 1.603 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2...

  3. Experience from transportation of irradiated WWER-440 fuel assemblies at Kozloduy NPP site after a short cooling time

    International Nuclear Information System (INIS)

    Stoyanova, I.; Kamenov, A.; Byrzev, L.; Christoskov, I.

    2003-01-01

    The presented results from the computation and analysis of the radiation characteristics of the irradiated fuel assemblies at the date of their transportation, according to the selected loading patterns of the VSPOT cask and following the modified transportation technology (i.e., without replacement of the pool solution by pure condensate), together with the corresponding experimental results, confirm the applicability of the newly introduced safety criterion for selecting a cask loading pattern for irradiated fuel assemblies after a short cooling time. The comparison between measured and computed surface dose rates shows that during the transfer of irradiated fuel assemblies from the pools of Units 1 and 2 to the pools of Kozloduy NPP Units 3 and 4, all safety limits, including the radiation protection requirements, were met.

  4. Exploiting short-term memory in soft body dynamics as a computational resource.

    Science.gov (United States)

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
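
    The physical soft silicone arm cannot be reproduced in code, but the computational principle, reservoir computing with a linear readout trained to recall past inputs, can be illustrated with a simulated echo state network standing in for the body dynamics. All sizes and constants below are arbitrary illustration values, not parameters from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N, T, delay = 100, 1200, 2

    # random reservoir, rescaled to spectral radius 0.9 (echo state property)
    W = rng.standard_normal((N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
    w_in = rng.uniform(-1, 1, N)

    u = rng.uniform(-0.5, 0.5, T)                # random input stream
    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])         # reservoir update
        states[t] = x

    # linear readout trained by ridge regression to recall u(t - delay):
    # success means the reservoir's state retains a short-term memory
    washout = 100
    S, y = states[washout:], u[washout - delay:T - delay]
    w_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
    mse = np.mean((S @ w_out - y) ** 2)
    print(mse < 0.05 * np.var(y))
    ```

    The readout is purely linear and static; any memory in the input-output map must live in the (here simulated) body dynamics, which is exactly the argument the paper makes for the physical arm.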

  5. Implementing Molecular Dynamics for Hybrid High Performance Computers - 1. Short Range Forces

    International Nuclear Information System (INIS)

    Brown, W. Michael; Wang, Peng; Plimpton, Steven J.; Tharrington, Arnold N.

    2011-01-01

    The use of accelerators such as general-purpose graphics processing units (GPGPUs) have become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high performance computers, machines with more than one type of floating-point processor, are now becoming more prevalent due to these advantages. In this work, we discuss several important issues in porting a large molecular dynamics code for use on parallel hybrid machines - (1) choosing a hybrid parallel decomposition that works on central processing units (CPUs) with distributed memory and accelerator cores with shared memory, (2) minimizing the amount of code that must be ported for efficient acceleration, (3) utilizing the available processing power from both many-core CPUs and accelerators, and (4) choosing a programming model for acceleration. We present our solution to each of these issues for short-range force calculation in the molecular dynamics package LAMMPS. We describe algorithms for efficient short range force calculation on hybrid high performance machines. We describe a new approach for dynamic load balancing of work between CPU and accelerator cores. We describe the Geryon library that allows a single code to compile with both CUDA and OpenCL for use on a variety of accelerators. Finally, we present results on a parallel test cluster containing 32 Fermi GPGPUs and 180 CPU cores.
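
    A reference version of the short-range force kernel that such ports accelerate can be sketched in a few lines. This is a plain O(N^2) Lennard-Jones loop with a cutoff and minimum-image periodic boundaries, not LAMMPS code; production kernels use neighbour lists, cell decomposition and the hybrid decompositions discussed above.

    ```python
    import numpy as np

    def lj_forces(pos, box, rc=2.5, eps=1.0, sigma=1.0):
        """Short-range Lennard-Jones forces with cutoff and minimum-image
        periodic boundaries (reference O(N^2) loop, reduced units)."""
        n = len(pos)
        f = np.zeros_like(pos)
        for i in range(n):
            for j in range(i + 1, n):
                rij = pos[i] - pos[j]
                rij -= box * np.round(rij / box)    # minimum image
                r2 = float(rij @ rij)
                if r2 > rc * rc:
                    continue                        # outside cutoff
                sr6 = (sigma * sigma / r2) ** 3
                fmag = 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r2
                f[i] += fmag * rij                  # Newton's 3rd law:
                f[j] -= fmag * rij                  # equal and opposite
        return f

    rng = np.random.default_rng(7)
    box = np.array([8.0, 8.0, 8.0])
    g = (np.arange(3) + 0.5) * (8.0 / 3)            # 3x3x3 lattice sites
    pos = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1).reshape(-1, 3)
    pos += rng.uniform(-0.3, 0.3, pos.shape)        # jitter off the lattice
    forces = lj_forces(pos, box)
    # momentum conservation: pairwise contributions cancel in the total
    print(np.allclose(forces.sum(axis=0), 0.0))  # True
    ```

    The pair symmetry exploited on the last two lines of the inner loop is one of the things that makes the hybrid CPU/GPU decomposition nontrivial: accumulating both halves of a pair force from different threads requires care on shared-memory accelerators.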

  6. Instruction timing for the CDC 7600 computer

    International Nuclear Information System (INIS)

    Lipps, H.

    1975-01-01

    This report provides timing information for all instructions of the Control Data 7600 computer, except for instructions of type 01X, to enable the optimization of 7600 programs. The timing rules serve as background information for timing charts which are produced by a program (TIME76) of the CERN Program Library. The rules that co-ordinate the different sections of the CPU are stated in as much detail as is necessary to time the flow of instructions for a given sequence of code. Instruction fetch, instruction issue, and access to small core memory are treated at length, since details are not available from the computer manuals. Annotated timing charts are given for 24 examples, chosen to display the full range of timing considerations. (Author)

  7. Quality of Standard Reference Materials for Short Time Activation Analysis

    International Nuclear Information System (INIS)

    Ismail, S.S.; Oberleitner, W.

    2003-01-01

    Some environmental reference materials (CFA-1633b, IAEA-SL-1, SARM-1, BCR-176, Coal-1635, IAEA-SL-3, BCR-146 and SARM-5) were analysed by short-time activation analysis. The results show that these materials can be classified into three groups according to their activities after irradiation. The obtained results were compared in order to create a quality index for the determination of short-lived nuclides at high count rates. It was found that CFA is not a suitable standard for determining very short-lived nuclides (half-lives < 1 min) because the activity it produces is 15-fold higher than that of SL-3. Biological reference materials, such as SRM-1571, SRM-1573, SRM-1575, SRM-1577, IAEA-392 and IAEA-393, were also investigated with a higher counting-efficiency system. The quality of this system and its well-type detector for investigating short-lived nuclides is discussed.

  8. 50 CFR 221.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false How are time periods computed? 221.3... Provisions § 221.3 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2) The last day of the...
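
    The counting rule quoted in these regulatory records (the day of the act is not included; the last day of the period is included) can be sketched with the standard library. This is a simplified illustration: the weekend and holiday extensions that the regulations also provide are deliberately omitted.

    ```python
    from datetime import date, timedelta

    def period_end(event_day: date, days: int) -> date:
        """Last day of a regulatory time period under the quoted rule:
        the day of the act is not counted, and the last day of the
        period is included.  (Weekend/holiday extensions omitted.)"""
        # counting starts the day after the event, so day `days` of the
        # period is simply event_day + days
        return event_day + timedelta(days=days)

    # a 30-day period triggered on March 1 runs through March 31
    print(period_end(date(2010, 3, 1), 30))  # 2010-03-31
    ```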

  9. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept, called correlation-dependent balanced estimation of diffusion entropy, is developed to evaluate scale invariance in very short time series, with lengths ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that, using the standard central moving average de-trending procedure, this method can evaluate the scaling exponents of short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximately oval path of a specified length, we observe that although the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  10. Evaluation of scaling invariance embedded in short time series.

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept, called correlation-dependent balanced estimation of diffusion entropy, is developed to evaluate scale invariance in very short time series, with lengths ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that, using the standard central moving average de-trending procedure, this method can evaluate the scaling exponents of short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximately oval path of a specified length, we observe that although the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.
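
    The balanced diffusion-entropy estimator itself is not reproduced here. As a plain illustration of what the scaling exponent of a series means, the classical aggregated-fluctuation check below recovers H ≈ 0.5 for uncorrelated noise; unlike the paper's method, it needs a long series to be stable, which is exactly the limitation the paper addresses.

    ```python
    import numpy as np

    def hurst_aggregated(x, block_sizes=(4, 8, 16, 32, 64)):
        """Crude scaling-exponent estimate: the standard deviation of
        block sums of a series grows as m**H with block size m, and
        H = 0.5 for an uncorrelated series.  (A classical illustration,
        not the balanced diffusion-entropy estimator of the paper.)"""
        logs_m, logs_s = [], []
        for m in block_sizes:
            nblocks = len(x) // m
            sums = x[:nblocks * m].reshape(nblocks, m).sum(axis=1)
            logs_m.append(np.log(m))
            logs_s.append(np.log(sums.std()))
        slope, _ = np.polyfit(logs_m, logs_s, 1)
        return slope

    rng = np.random.default_rng(3)
    noise = rng.standard_normal(100_000)
    H = hurst_aggregated(noise)
    print(round(H, 1))  # approximately 0.5 for uncorrelated noise
    ```

    With only ~10^2 samples, the same estimator fluctuates wildly, which motivates the bias-balanced short-series estimator the paper proposes.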

  11. Are anomalously short tunnelling times measurable?

    International Nuclear Information System (INIS)

    Delgado, V.; Muga, J.G.

    1996-01-01

    Low and Mende have analyzed the conditions that would make an actual measurement of an anomalously short traversal time through a potential barrier possible, concluding that such a measurement cannot be made because it is not possible to describe the tunnelling of a wave packet initially close to the barrier by the "usual wave packet space-time analysis". We complement this work in several ways: it is argued that the described failure of the usual formalism occurs under a set of too-restrictive conditions, some of them not physically motivated, so it does not necessarily imply the impossibility of such a measurement. However, by retaining only the conditions well motivated on physical grounds, we have performed a systematic numerical check which shows that the conclusion by Low and Mende is indeed generally valid. It is shown that, as speculated by Low and Mende, the process is dominated by over-the-barrier transmission. Copyright © 1996 Academic Press, Inc.

  12. Directional short-time Fourier transform of distributions

    Directory of Open Access Journals (Sweden)

    Katerina Hadzi-Velkova Saneva

    2016-04-01

    Full Text Available Abstract In this paper we consider the directional short-time Fourier transform (DSTFT) that was introduced and investigated in (Giv in J. Math. Anal. Appl. 399:100-107, 2013). We analyze the DSTFT and its transpose on the test function spaces $\mathcal{S}(\mathbb{R}^{n})$ and $\mathcal{S}(\mathbb{Y}^{2n})$, respectively, and prove the continuity theorems on these spaces. Then the obtained results are used to extend the DSTFT to spaces of distributions.

  13. Response probability and response time: a straight line, the Tagging/Retagging interpretation of short term memory, an operational definition of meaningfulness and short term memory time decay and search time.

    Science.gov (United States)

    Tarnow, Eugen

    2008-12-01

    The functional relationship between correct response probability and response time is investigated in data sets from Rubin, Hinton and Wenzel, J Exp Psychol Learn Mem Cogn 25:1161-1176, 1999 and Anderson, J Exp Psychol [Hum Learn] 7:326-343, 1981. The two measures are linearly related through stimulus presentation lags from 0 to 594 s in the former experiment and for repeated learning of words in the latter. The Tagging/Retagging interpretation of short term memory is introduced to explain this linear relationship. At stimulus presentation the words are tagged. This tagging level drops slowly with time. When a probe word is reintroduced, the tagging level has to increase for the word to be properly identified, leading to a delay in response time. The tagging time is related to the meaningfulness of the words used: the more meaningful the word, the longer the tagging time. After stimulus presentation the tagging level drops in a logarithmic fashion to 50% after 10 s and to 20% after 240 s. The incorrect recall and recognition times saturate in the Rubin et al. data set (they are not linear for large time lags), suggesting a limited time to search the short term memory structure: the search time for recall of unusual words is 1.7 s. For recognition of nonsense words the corresponding time is about 0.4 s, similar to the 0.243 s found in Cavanagh (1972).
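
    The quoted decay figures (50% after 10 s, 20% after 240 s) are consistent with a two-parameter logarithmic law, level(t) = a - b*ln(t). The sketch below solves for a and b from those two points and checks that both are reproduced; this is an illustration of the stated numbers, not the authors' fit.

    ```python
    import math

    # level(t) = a - b*ln(t); two quoted points pin down a and b
    t1, L1 = 10.0, 50.0     # 50% tagging level after 10 s
    t2, L2 = 240.0, 20.0    # 20% tagging level after 240 s

    b = (L1 - L2) / (math.log(t2) - math.log(t1))
    a = L1 + b * math.log(t1)

    # both quoted points are reproduced by the fitted law
    print(round(a - b * math.log(t1), 6), round(a - b * math.log(t2), 6))
    # 50.0 20.0
    ```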

  14. HIGH TEMPERATURE SHORT TIME (HTST) PASTEURIZATION OF MILK AGAINST Listeria monocytogenes UNDER REFRIGERATED STORAGE

    OpenAIRE

    SABIL, SYAHRIANA

    2015-01-01

    2015 SYAHRIANA SABIL (I 111 11 273). High Temperature Short Time (HTST) Pasteurization of Milk against Listeria monocytogenes under Refrigerated Storage. Supervised by RATMAWATI MALAKA and FARIDA NUR YULIATI. High Temperature Short Time (HTST) pasteurization is a process of heating milk below the boiling point that is expected to kill Listeria monocytogenes (L. monocytogenes), a pathogen causing listeriosis, a zoonotic disease. ...

  15. Time-dependent Bragg diffraction and short-pulse reflection by one-dimensional photonic crystals

    International Nuclear Information System (INIS)

    André, Jean-Michel; Jonnard, Philippe

    2015-01-01

    The time dependence of Bragg diffraction by one-dimensional photonic crystals and its influence on short-pulse reflection are studied in the framework of coupled-wave theory. The indicial response of the photonic crystal is calculated and is found to present a time-delay effect, with a transient time governed by the extinction length. A numerical simulation is presented for a Bragg mirror in the x-ray domain and a pulse envelope modelled by a sine-squared shape. The potential consequences of the time-delay effect in the time-dependent optics of short pulses are emphasized. (paper)

  16. Real-time computational photon-counting LiDAR

    Science.gov (United States)

    Edgar, Matthew; Johnson, Steven; Phillips, David; Padgett, Miles

    2018-03-01

    The availability of compact, low-cost, and high-speed MEMS-based spatial light modulators has generated widespread interest in alternative sampling strategies for imaging systems utilizing single-pixel detectors. The development of compressed sensing schemes for real-time computational imaging may have promising commercial applications for high-performance detectors, where the availability of focal plane arrays is expensive or otherwise limited. We discuss the research and development of a prototype light detection and ranging (LiDAR) system via direct time of flight, which utilizes a single high-sensitivity photon-counting detector and fast-timing electronics to recover millimeter accuracy three-dimensional images in real time. The development of low-cost real time computational LiDAR systems could have importance for applications in security, defense, and autonomous vehicles.
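
Direct time of flight converts a photon round-trip delay into range as d = c·t/2, so millimeter accuracy implies picosecond-scale timing. A small sketch with illustrative numbers (not the prototype's actual specifications):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_to_range(round_trip_s):
    """Convert a photon round-trip time (seconds) to target range (meters)."""
    return C * round_trip_s / 2.0

# Timing precision needed for 1 mm of range: ~6.7 ps round trip.
timing_for_1mm = 2 * 1e-3 / C
print(f"{timing_for_1mm * 1e12:.2f} ps")
```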

  17. Short-time perturbation theory and nonrelativistic duality

    International Nuclear Information System (INIS)

    Whitenton, J.B.; Durand, B.; Durand, L.

    1983-01-01

    We give a simple proof of the nonrelativistic duality relation ⟨W²σ_bound⟩ ≈ ⟨W²σ_free⟩ for appropriate energy averages of the cross sections for e⁺e⁻ → (qq̄ bound states) and e⁺e⁻ → (free qq̄ pair), and calculate the corrections to the relation by relating W²σ to the Fourier transform of the Feynman propagation function and developing a short-time perturbation series for that function. We illustrate our results in detail for simple power-law potentials and for potentials that involve combinations of powers.

  18. Equivalence between short-time biphasic and incompressible elastic material responses.

    Science.gov (United States)

    Ateshian, Gerard A; Ellis, Benjamin J; Weiss, Jeffrey A

    2007-06-01

    Porous-permeable tissues have often been modeled using porous media theories such as the biphasic theory. This study examines the equivalence of the short-time biphasic and incompressible elastic responses for arbitrary deformations and constitutive relations from first principles. This equivalence is illustrated in problems of unconfined compression of a disk, and of articular contact under finite deformation, using two different constitutive relations for the solid matrix of cartilage, one of which accounts for the large disparity observed between the tensile and compressive moduli in this tissue. Demonstrating this equivalence under general conditions provides a rationale for using available finite element codes for incompressible elastic materials as a practical substitute for biphasic analyses, so long as only the short-time biphasic response is sought. In practice, an incompressible elastic analysis is representative of a biphasic analysis over the short-term response, for times much shorter than the characteristic diffusion time set by the elasticity tensor and by K, the hydraulic permeability tensor of the solid matrix. Certain notes of caution are provided with regard to implementation issues, particularly when finite element formulations of incompressible elasticity employ an uncoupled strain energy function consisting of additive deviatoric and volumetric components.

  19. Determination of Permissible Short-Time Emergency Overloading of Turbo-Generators and Synchronous Compensators

    Directory of Open Access Journals (Sweden)

    V. A. Anischenko

    2011-01-01

    Full Text Available The paper shows that failure to take into account the variable ratio of short-time emergency overloading of turbo-generators (synchronous compensators) can lead to underestimation of the overloading capacity or to impermissible insulation overheating. A method has been developed for determining the permissible duration of short-time emergency overloading that takes into account changes of the overloading ratio in case of a failure.

  20. Flow characteristics of a pilot-scale high temperature, short time pasteurizer.

    Science.gov (United States)

    Tomasula, P M; Kozempel, M F

    2004-09-01

    In this study, we present a method for determining the fastest moving particle (FMP) and residence time distribution (RTD) in a pilot-scale high temperature, short time (HTST) pasteurizer to ensure that laboratory or pilot-scale HTST apparatus meets the Pasteurized Milk Ordinance standards for pasteurization of milk and can be used for obtaining thermal inactivation data. The overall dimensions of the plate in the pasteurizer were 75 x 115 mm, with a thickness of 0.5 mm and effective diameter of 3.0 mm. The pasteurizer was equipped with nominal 21.5- and 52.2-s hold tubes, and flow capacity was variable from 0 to 20 L/h. Tracer studies were used to determine FMP times and RTD data to establish flow characteristics. Using brine milk as tracer, the FMP time for the short holding section was 18.6 s and for the long holding section was 36 s at 72 degrees C, compared with the nominal times of 21.5 and 52.2 s, respectively. The RTD study indicates that the short hold section was 45% back mixed and 55% plug flow for whole milk at 72 degrees C. The long hold section was 91% plug and 9% back mixed for whole milk at 72 degrees C. This study demonstrates that continuous laboratory and pilot-scale pasteurizers may be used to study inactivation of microorganisms only if the flow conditions in the holding tube are established for comparison with commercial HTST systems.
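
A hold section that is part plug flow and part back-mixed is commonly idealized as a pure delay in series with a continuous stirred tank. A sketch using the abstract's 55%/45% split for the short hold section (the series delay + CSTR decomposition is a modeling assumption, not necessarily the authors' model):

```python
import numpy as np

tau_total = 21.5           # nominal hold time of the short section, s
plug_frac = 0.55           # plug-flow fraction from the RTD study
t_plug = plug_frac * tau_total          # pure transport delay, s
tau_mix = (1 - plug_frac) * tau_total   # stirred-tank time constant, s

# Residence-time distribution of a plug-flow delay in series with a CSTR:
# E(t) = exp(-(t - t_plug)/tau_mix) / tau_mix for t >= t_plug, else 0.
dt = 0.001
t = np.arange(0, 200, dt)
E = np.where(t >= t_plug, np.exp(-(t - t_plug) / tau_mix) / tau_mix, 0.0)

mean_rt = (t * E).sum() * dt   # mean residence time; recovers tau_total
fmp = t_plug                   # fastest particle exits right after the delay
print(round(mean_rt, 2), round(fmp, 2))
```

The mean residence time of the composite model recovers the nominal hold time, while the fastest-moving particle exits after only the plug-flow delay, mirroring the FMP times shorter than nominal reported above.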

  1. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings .01 second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy within the resolution of the computer clock as any official atomic time source.

  2. Short-time regularity assessment of fibrillatory waves from the surface ECG in atrial fibrillation

    International Nuclear Information System (INIS)

    Alcaraz, Raúl; Martínez, Arturo; Hornero, Fernando; Rieta, José J

    2012-01-01

    This paper proposes the first non-invasive method for direct and short-time regularity quantification of atrial fibrillatory (f) waves from the surface ECG in atrial fibrillation (AF). Regularity is estimated by computing individual morphological variations among f waves, which are delineated and extracted from the atrial activity (AA) signal, making use of an adaptive signed correlation index. The algorithm was tested on real AF surface recordings in order to discriminate atrial signals with different organization degrees, providing a notably higher global accuracy (90.3%) than the two non-invasive AF organization estimates defined to date: the dominant atrial frequency (70.5%) and sample entropy (76.1%). Furthermore, due to its ability to assess AA regularity wave to wave, the proposed method is also able to pursue AF organization time course more precisely than the aforementioned indices. As a consequence, this work opens a new perspective in the non-invasive analysis of AF, such as the individualized study of each f wave, that could improve the understanding of AF mechanisms and become useful for its clinical treatment. (paper)
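
A signed correlation index scores sample pairs by sign agreement after ternarizing the waveforms against a tolerance band. A simplified, non-adaptive sketch (the threshold and scoring here are illustrative; the paper's adaptive version differs):

```python
import numpy as np

def signed_correlation(x, y, tol=0.1):
    """Simplified signed correlation of two equal-length waveforms.

    Each sample pair scores +1 if both samples fall on the same side of
    the tolerance band (or both inside it), -1 otherwise; the index is
    the mean score, in [-1, 1].
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sx = np.where(x > tol, 1, np.where(x < -tol, -1, 0))
    sy = np.where(y > tol, 1, np.where(y < -tol, -1, 0))
    return np.mean(np.where(sx == sy, 1.0, -1.0))

t = np.linspace(0, 1, 200)
w = np.sin(2 * np.pi * 5 * t)
print(signed_correlation(w, w))             # identical waves -> 1.0
print(round(signed_correlation(w, -w), 2))  # inverted waves -> near -1
```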

  3. TV time but not computer time is associated with cardiometabolic risk in Dutch young adults.

    Science.gov (United States)

    Altenburg, Teatske M; de Kroon, Marlou L A; Renders, Carry M; Hirasing, Remy; Chinapaw, Mai J M

    2013-01-01

    TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the association of TV time and computer time separately with cardiometabolic biomarkers in young adults. Additionally, the mediating role of waist circumference (WC) is studied. Data of 634 Dutch young adults (18-28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose, triglycerides and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC to mediate the association of TV time or computer time with cardiometabolic biomarkers. We found a significantly positive association of TV time with cardiometabolic biomarkers. In addition, we found no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time.
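
The product-of-coefficient method multiplies the exposure-to-mediator slope (a) by the mediator-to-outcome slope adjusted for exposure (b). A sketch on synthetic data (all effect sizes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
tv = rng.normal(2.0, 1.0, n)                    # hours of TV per day
wc = 80 + 2.0 * tv + rng.normal(0, 3, n)        # waist circumference (mediator)
trig = 1.0 + 0.05 * wc + rng.normal(0, 0.2, n)  # triglycerides (outcome)

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(tv, wc)[1]                            # exposure -> mediator slope
b = ols(np.column_stack([tv, wc]), trig)[2]   # mediator -> outcome, adjusted
indirect = a * b                              # mediated (indirect) effect
print(round(indirect, 3))                     # close to 2.0 * 0.05 = 0.1
```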

  4. Time-of-Flight Cameras in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2010-01-01

    Computer Graphics, Computer Vision and Human Machine Interaction (HMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real-time geometry...

  5. 29 CFR 4245.8 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Computation of time. 4245.8 Section 4245.8 Labor Regulations Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION INSOLVENCY, REORGANIZATION, TERMINATION, AND OTHER RULES APPLICABLE TO MULTIEMPLOYER PLANS NOTICE OF INSOLVENCY § 4245.8 Computation of...

  6. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    Science.gov (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications that are based on position time series of permanent GPS stations. In this paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data spans that vary from 2 to 14 years and baseline lengths that vary from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs was chosen with significant differences in height (> 5 m) or in the type of the monument. For comparison, we also processed an approximately zero baseline. White plus band-pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially for station pairs with large elevation differences.
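
Annual amplitudes such as those quoted are typically estimated by least-squares fitting a yearly sinusoid, plus offset and trend, to the position series. A minimal sketch on synthetic daily data (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 365 * 4) / 365.25          # 4 years of daily epochs, in years
true_amp, phase = 1.2, 0.7                  # mm and rad, synthetic values
pos = true_amp * np.sin(2 * np.pi * t + phase) + rng.normal(0, 0.5, t.size)

# Design matrix: offset, linear trend, annual sine and cosine terms.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, pos, rcond=None)
amp = np.hypot(coef[2], coef[3])            # recovered annual amplitude
print(round(amp, 2))
```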

  7. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

    Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, given response-time guarantees from the servers, to decide which tasks should be offloaded to get the results in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error of the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
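
The feasibility question can be illustrated with a toy brute-force search (exponential, unlike the paper's linear-time and pseudo-polynomial algorithms), under simple assumptions: the CPU transmits offloaded tasks first, then runs local tasks, and server work overlaps with local execution:

```python
from itertools import combinations

def feasible_offloading(local, send, response, deadline):
    """Brute-force search over offloading choices (toy, exponential).

    local[i]    -- local execution time of task i
    send[i]     -- CPU time to transmit task i to the server
    response[i] -- guaranteed server response time for task i
    Returns a set of offloaded task indices, or None if infeasible.
    """
    n = len(local)
    for k in range(n + 1):
        for off in combinations(range(n), k):
            off = set(off)
            t = 0.0          # CPU time line
            finish = 0.0     # latest result arrival
            for i in sorted(off):           # transmissions happen first
                t += send[i]
                finish = max(finish, t + response[i])
            t += sum(local[i] for i in range(n) if i not in off)
            finish = max(finish, t)
            if finish <= deadline:
                return off
    return None

# Three tasks, frame deadline 10: running all locally takes 12 time units,
# so at least one task must be offloaded to meet the deadline.
print(feasible_offloading([5, 4, 3], [1, 1, 1], [6, 6, 6], 10.0))
```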

  8. Innovation: study of 'ultra-short' time reactions

    International Nuclear Information System (INIS)

    Anon.

    2001-01-01

    This short article presents the new Elyse facility of the Orsay-Paris 11 university for the study of ultra-short chemical and biochemical phenomena. Elyse uses the 'pump-probe' technique, which consists of two perfectly synchronized electron and photon pulses. It comprises a 3 to 9 MeV electron accelerator with an HF gun photo-triggered by a laser. Elyse can initiate reactions using ultra-short electron pulses (radiolysis) or ultra-short photon pulses (photolysis). (J.S.)

  9. Short time propagation of a singular wave function: Some surprising results

    Science.gov (United States)

    Marchewka, A.; Granot, E.; Schuss, Z.

    2007-08-01

    The Schrödinger evolution of an initially singular wave function was investigated. First it was shown that a wide range of physical problems can be described by an initially singular wave function. Then it was demonstrated that outside the support of the initial wave function the time evolution is governed to leading order by the values of the wave function and its derivatives at the singular points. A short-time universality appears where the evolution depends only on a single parameter—the value at the singular point (not even on its derivatives). It was also demonstrated that the short-time evolution in the presence of an absorptive potential differs from that in the presence of a nonabsorptive one. This dynamics can therefore be harnessed to determine whether a potential is absorptive, simply by measuring only the transmitted particle density.

  10. The Simulation Computer Based Learning (SCBL) for Short Circuit Multi Machine Power System Analysis

    Science.gov (United States)

    Rahmaniar; Putri, Maharani

    2018-03-01

    Strengthening the competitiveness of human resources is part of the mandate of universities as providers of formal higher education. The Electrical Engineering Program of UNPAB (Prodi TE UNPAB), as a department managing electrical engineering expertise, plays a very important part in preparing the human resources required of its graduates, who are expected to compete globally, especially in connection with the implementation of the ASEAN Economic Community (AEC), which requires graduates with competitive competence and quality. The formation of competitive human resources is pursued through strategies contained in the Seven Higher Education Standards, one part of which is the teaching and learning process in electrical system analysis with short-circuit analysis (SCA). This core course provides the basis for the competencies of other subjects in later semesters. A Computer Based Learning (CBL) model was developed for learning the analysis of multi-machine short-circuit faults, covering: (a) single-phase short-circuit faults, (b) two-phase short-circuit faults, (c) ground short-circuit faults, and (d) single-phase-to-ground short-circuit faults. The CBL model for the Electrical System Analysis course gives students room to be more active in learning to solve complex problems, supporting flexible and active problem solving in short-circuit analysis and active student participation in learning (Student Centered Learning) in the electric power system analysis course.
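
The fault categories listed above reduce, in the classical symmetrical-components treatment, to simple formulas over the sequence impedances. A sketch with invented per-unit values:

```python
# Classical symmetrical-components fault formulas (per-unit).
# The sequence impedances below are invented for illustration.
Z1, Z2, Z0 = 0.25j, 0.25j, 0.40j   # positive-, negative-, zero-sequence
E = 1.0 + 0j                        # prefault voltage, 1.0 pu

I_three_phase = E / Z1                    # balanced three-phase fault
I_slg = 3 * E / (Z1 + Z2 + Z0)            # single line-to-ground fault
I_ll = -1j * 3 ** 0.5 * E / (Z1 + Z2)     # line-to-line fault (phase b)

print(abs(I_three_phase))    # 4.0 pu
print(round(abs(I_slg), 3))  # 3.333 pu
```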

  11. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performances in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs that are constructed out of the sampling of the solution of a time-delay differential equation and show their good performance in the forecasting of the conditional covariances associated to multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type as well as in the prediction of factual daily market realized volatilities computed with intraday quotes, using as training input daily log-return series of moderate size. We tackle some problems associated to the lack of task-universality for individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.
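
A time-delay reservoir is one hardware realization of reservoir computing; a generic software echo-state sketch (not the paper's delay-differential setup) trains only the linear readout:

```python
import numpy as np

rng = np.random.default_rng(42)

# Reservoir: fixed random recurrent weights, scaled for stability.
N = 100
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9
w_in = rng.normal(0, 0.5, N)

def run_reservoir(u):
    """Drive the reservoir with input series u; return the state matrix."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for k, uk in enumerate(u):
        x = np.tanh(W @ x + w_in * uk)
        states[k] = x
    return states

# Task: one-step-ahead prediction of a noisy sine wave.
u = np.sin(np.arange(2000) * 0.1) + rng.normal(0, 0.05, 2000)
S = run_reservoir(u[:-1])
target = u[1:]
W_out, *_ = np.linalg.lstsq(S[200:], target[200:], rcond=None)  # skip warm-up
pred = S[200:] @ W_out
rmse = np.sqrt(np.mean((pred - target[200:]) ** 2))
print(round(rmse, 3))
```

Only `W_out` is trained; the reservoir weights stay fixed, which is what makes the paradigm attractive for hardware such as time-delay systems.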

  12. Active mode-locking of mid-infrared quantum cascade lasers with short gain recovery time.

    Science.gov (United States)

    Wang, Yongrui; Belyanin, Alexey

    2015-02-23

    We investigate the dynamics of actively modulated mid-infrared quantum cascade lasers (QCLs) using space- and time-domain simulations of coupled density matrix and Maxwell equations with resonant tunneling current taken into account. We show that it is possible to achieve active mode locking and stable generation of picosecond pulses in high performance QCLs with a vertical laser transition and a short gain recovery time by bias modulation of a short section of a monolithic Fabry-Perot cavity. In fact, active mode locking in QCLs with a short gain recovery time turns out to be more robust to the variation of parameters as compared to previously studied lasers with a long gain recovery time. We investigate the effects of spatial hole burning and phase locking on the laser output.

  13. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine reaction time when human subjects process information presented in the visual channel, under both a direct vision and a virtual rehabilitation environment, while walking. Visual stimuli consisted of eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment (CAREN)) and in a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and their reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program in a virtual training environment. Implications for rehabilitation: Eye tracking is a reliable tool that can be employed in rehabilitation virtual environments. Reaction time changes between direct vision and virtual environments.

  14. libgapmis: extending short-read alignments.

    Science.gov (United States)

    Alachiotis, Nikolaos; Berger, Simon; Flouri, Tomáš; Pissis, Solon P; Stamatakis, Alexandros

    2013-01-01

    A wide variety of short-read alignment programmes have been published recently to tackle the problem of mapping millions of short reads to a reference genome, focusing on different aspects of the procedure such as time and memory efficiency, sensitivity, and accuracy. These tools allow for a small number of mismatches in the alignment; however, their ability to allow for gaps varies greatly, with many performing poorly or not allowing them at all. The seed-and-extend strategy is applied in most short-read alignment programmes. After aligning a substring of the reference sequence against the high-quality prefix of a short read--the seed--an important problem is to find the best possible alignment between a substring of the reference sequence succeeding and the remaining suffix of low quality of the read--extend. The fact that the reads are rather short and that the gap occurrence frequency observed in various studies is rather low suggest that aligning (parts of) those reads with a single gap is in fact desirable. In this article, we present libgapmis, a library for extending pairwise short-read alignments. Apart from the standard CPU version, it includes ultrafast SSE- and GPU-based implementations. libgapmis is based on an algorithm computing a modified version of the traditional dynamic-programming matrix for sequence alignment. Extensive experimental results demonstrate that the functions of the CPU version provided in this library accelerate the computations by a factor of 20 compared to other programmes. The analogous SSE- and GPU-based implementations accelerate the computations by a factor of 6 and 11, respectively, compared to the CPU version. The library also provides the user the flexibility to split the read into fragments, based on the observed gap occurrence frequency and the length of the read, thereby allowing for a variable, but bounded, number of gaps in the alignment. 
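
The extend step amounts to aligning the remaining read suffix against a reference window while allowing at most one contiguous gap. A brute-force sketch of that objective (the library itself uses a modified dynamic-programming matrix and SSE/GPU kernels):

```python
def single_gap_mismatches(read, ref, max_gap=3):
    """Minimum mismatches when aligning `read` to a prefix of `ref`,
    allowing at most one contiguous gap (skipped reference bases) of
    length 0..max_gap. Brute force over gap position and length.
    """
    best = None
    for gap_len in range(max_gap + 1):
        if len(read) + gap_len > len(ref):
            break
        for gap_pos in range(len(read) + 1):
            # gap_len reference bases are skipped after read[:gap_pos]
            aligned = ref[:gap_pos] + ref[gap_pos + gap_len:
                                          len(read) + gap_len]
            mism = sum(a != b for a, b in zip(read, aligned))
            if best is None or mism < best:
                best = mism
    return best

# The read matches the reference exactly once a single 2-base gap
# (a deletion relative to the reference) is allowed.
print(single_gap_mismatches("ACGTACGT", "ACGTGGACGT"))  # -> 0
```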

  15. Corrections for the combined effects of decay and dead time in live-timed counting of short-lived radionuclides

    International Nuclear Information System (INIS)

    Fitzgerald, R.

    2016-01-01

    Studies and calibrations of short-lived radionuclides, for example ¹⁵O, are of particular interest in nuclear medicine. Yet counting experiments on such species are vulnerable to an error due to the combined effect of decay and dead time. Separate decay corrections and dead-time corrections do not account for this issue. Usually counting data are decay-corrected to the start time of the count period, or else instead of correcting the count rate, the mid-time of the measurement is used as the reference time. Correction factors are derived for both those methods, considering both extending and non-extending dead time. Series approximations are derived here and the accuracy of those approximations are discussed. - Highlights: • Derived combined effects of decay and dead time. • Derived for counting systems with extending or non-extending dead times. • Derived series expansions for both midpoint and decay-to-start-time methods. • Useful for counting experiments with short-lived radionuclides. • Examples given for ¹⁵O, used in PET scanning.
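
The combined effect is easy to see numerically: for a source decaying as R0·e^(-λt) counted with non-extending dead time τ, the measured counts are the integral of R/(1 + Rτ), and a separate dead-time correction evaluated at the start-of-count rate overcorrects. A sketch with illustrative rates (the ¹⁵O half-life of 122.24 s is real; all other numbers are invented):

```python
import math

lam = math.log(2) / 122.24   # decay constant of 15-O, 1/s
R0 = 5e4                     # true count rate at start of count, 1/s (invented)
tau = 5e-6                   # non-extending dead time, s (invented)
T = 300.0                    # counting time, s

# Numerically integrate true and measured counts over the count period.
n = 30_000
dt = T / n
measured = true_counts = 0.0
for k in range(n):
    R = R0 * math.exp(-lam * (k + 0.5) * dt)   # true rate at interval midpoint
    true_counts += R * dt
    measured += R / (1 + R * tau) * dt         # non-extending dead-time loss

combined = true_counts / measured   # exact combined correction factor
naive = 1 + R0 * tau                # dead-time factor at the start rate only
print(round(combined, 4), round(naive, 4))
```

Because the rate falls during the count, the dead-time loss shrinks with time; the naive start-of-count factor is therefore larger than the exact combined correction.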

  16. Ultra-short time sciences. From the atto-second to the peta-watts

    International Nuclear Information System (INIS)

    2000-01-01

    This book presents the recent advances in the scientific and technical domains linked with ultra-short time physics. It deals first with the conceptual and technological aspects of ultra-intense and ultra-brief lasers. Then, it describes the different domains of research (atoms, molecules and aggregates; gaseous phase dynamics using the pump-probe technique; femto-chemistry in dense phase; condensed matter; plasma physics; consistent control; aerosols; functional femto-biology) and the different domains of application (medical diagnosis; ophthalmology; telecommunications; technological and industrial developments). A last part is devoted to the teaching of ultra-short time sciences. (J.S.)

  17. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

    Highlights: • Application of the PSO algorithm can improve the accuracy of the baseline model. • The M&V cloud platform automatically calculates energy performance. • The M&V cloud platform can be applied to all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • The M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, because the M&V period typically lasts as long as several months or up to a year, abnormal energy performance may not be detected immediately, which not only degrades energy performance but also prevents timely correction and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculation of the energy performance and access to long-term, cumulative information about the energy performance are provided via a feature that allows direct uploads of the energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested in a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real-time
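
The regression core of M&V is a baseline model fitted to pre-retrofit data and then evaluated on post-retrofit conditions, with savings taken as baseline prediction minus measured use. A minimal sketch (single temperature driver, PSO tuning omitted, all numbers synthetic):

```python
import numpy as np

rng = np.random.default_rng(7)

# Pre-retrofit (baseline) period: energy use driven by temperature.
temp_pre = rng.uniform(15, 35, 200)
energy_pre = 50 + 8.0 * temp_pre + rng.normal(0, 10, 200)

# Fit the baseline model energy = b0 + b1 * temp.
X = np.column_stack([np.ones_like(temp_pre), temp_pre])
coef, *_ = np.linalg.lstsq(X, energy_pre, rcond=None)

# Post-retrofit period: same weather response, 15% lower consumption.
temp_post = rng.uniform(15, 35, 100)
energy_post = 0.85 * (50 + 8.0 * temp_post) + rng.normal(0, 10, 100)

baseline_pred = coef[0] + coef[1] * temp_post
savings = np.sum(baseline_pred - energy_post)
frac_saved = savings / np.sum(baseline_pred)
print(f"{frac_saved:.1%}")    # close to the synthetic 15%
```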

  18. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    Science.gov (United States)

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
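
The two offloaded steps, spatial filtering as a matrix product and per-channel autoregressive spectral estimation, can be sketched on the CPU with numpy; Yule-Walker estimation stands in for the paper's AR method, and the common-average-reference filter is an arbitrary example of a spatial filter:

```python
import numpy as np

def ar_psd(x, order=6, nfreq=128):
    """Yule-Walker AR power spectral density of a 1-D signal."""
    x = x - x.mean()
    n = len(x)
    # Biased autocorrelation up to `order` lags.
    r = np.array([x[:n - k] @ x[k:] / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])          # AR coefficients
    sigma2 = r[0] - a @ r[1:]              # driving-noise variance
    freqs = np.linspace(0, 0.5, nfreq)     # cycles per sample
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)))
    return freqs, sigma2 / np.abs(1 - z @ a) ** 2

rng = np.random.default_rng(3)
n_ch, n_s = 8, 1000
data = rng.normal(0, 1, (n_ch, n_s))
data[0] += 3 * np.sin(2 * np.pi * 0.1 * np.arange(n_s))  # tone on channel 0

# Spatial filter as one matrix-matrix product (common average reference).
W_spatial = np.eye(n_ch) - np.ones((n_ch, n_ch)) / n_ch
filtered = W_spatial @ data

freqs, psd = ar_psd(filtered[0])
peak = freqs[np.argmax(psd)]
print(round(peak, 3))   # near the 0.1 cycles/sample tone
```

On a GPU both steps map naturally onto batched matrix products, which is why offloading them yields the large speedups reported.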

  19. Massively parallel signal processing using the graphics processing unit for real-time brain-computer interface feature extraction

    Directory of Open Access Journals (Sweden)

    J. Adam Wilson

    2009-07-01

    Full Text Available The clock speeds of modern computer processors have nearly plateaued in the past five years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card (GPU was developed for real-time neural signal processing of a brain-computer interface (BCI. The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter, followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally-intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a CPU-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.

  20. Viruses as groundwater tracers: using ecohydrology to characterize short travel times in aquifers

    Science.gov (United States)

    Hunt, Randall J.; Borchardt, Mark A.; Bradbury, Kenneth R.

    2014-01-01

    Viruses are attractive tracers of short groundwater travel times. The mix of virus strains shed by a population changes over time; therefore, the virus snapshot shed in the fecal wastes of an infected population at a specific point in time can serve as a marker for tracking virus and groundwater movement. The virus tracing approach and an example application are described to illustrate their ability to characterize travel times in high-groundwater-velocity settings and provide insight unavailable from standard hydrogeologic approaches. Although characterization of preferential flowpaths does not usually capture the majority of other travel times occurring in the groundwater system (e.g., center of plume mass; tail of the breakthrough curve), virus approaches can trace very short transport times, and thus can fill an important gap in the current hydrogeology toolbox.

  1. Psychometric properties of the Hebrew short version of the Zimbardo Time Perspective Inventory.

    Science.gov (United States)

    Orkibi, Hod

    2015-06-01

    The purpose of this study was to develop a short Hebrew version of the Zimbardo Time Perspective Inventory that can be easily administered by health professionals in research, therapy, and counseling. First, the empirical links of time perspective (TP) to subjective well-being and health protective and health risk behaviors are reviewed. Then, a brief account of the instrument's previous modifications is provided. Results of confirmatory factor analysis (N = 572) verified the five-factor structure of the short version and yielded acceptable internal consistency reliability for each factor. The correlation coefficients between the five subscales of the short (20 items) and the original (56 items) instruments were all above .79, indicating the suitability of the short version for assessing the five TP factors. Support for the discriminant and concurrent validity was also achieved, largely in agreement with previous findings. Finally, limitations and future directions are addressed, and potential applications in therapy and counseling are offered. © The Author(s) 2014.

  2. Time-resolved measurement of the quantum states of photons using two-photon interference with short-time reference pulses

    International Nuclear Information System (INIS)

    Ren Changliang; Hofmann, Holger F.

    2011-01-01

    To fully utilize the energy-time degree of freedom of photons for optical quantum-information processes, it is necessary to control and characterize the temporal quantum states of the photons at extremely short time scales. For measurements of the temporal coherence of the quantum states beyond the time resolution of available detectors, two-photon interference with a photon in a short-time reference pulse may be a viable alternative. In this paper, we derive the temporal measurement operators for the bunching statistics of a single-photon input state with a photon from a weak coherent reference pulse. It is shown that the effects of the pulse shape of the reference pulse can be expressed in terms of a spectral filter selecting the bandwidth within which the measurement can be treated as an ideal projection on eigenstates of time. For full quantum tomography, temporal coherence can be determined by using superpositions of reference pulses at two different times. Moreover, energy-time entanglement can be evaluated based on the two-by-two entanglement observed in the coherences between pairs of detection times.

  3. Imprecise results: Utilizing partial computations in real-time systems

    Science.gov (United States)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.

  4. Ultra-Wideband, Short Pulse Electromagnetics 9

    CERN Document Server

    Rachidi, Farhad; Kaelin, Armin; Sabath, Frank; UWB SP 9

    2010-01-01

    Ultra-wideband (UWB), short-pulse (SP) electromagnetics are now being used for an increasingly wide variety of applications, including collision avoidance radar, concealed object detection, and communications. Notable progress in UWB and SP technologies has been achieved by investigations of their theoretical bases and improvements in solid-state manufacturing, computers, and digitizers. UWB radar systems are also being used for mine clearing, oil pipeline inspections, archeology, geology, and electronic effects testing. Ultra-wideband Short-Pulse Electromagnetics 9 presents selected papers of deep technical content and high scientific quality from the UWB-SP9 Conference, which was held from July 21-25, 2008, in Lausanne, Switzerland. The wide-ranging coverage includes contributions on electromagnetic theory, time-domain computational techniques, modeling, antennas, pulsed-power, UWB interactions, radar systems, UWB communications, and broadband systems and components. This book serves as a state-of-the-art r...

  5. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it provides an unbiased estimation of probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools displays significant performance advantages over the other methods. The factorial-moment-based estimation can evaluate scaling behaviors correctly in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order can find various uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
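The factorial-moment estimator itself is the paper's contribution; the sketch below shows only the quantity being estimated, the order-q probability moment M(q) = Σᵢ pᵢ^q computed naively from a binned short series (the estimator whose short-series bias the proposed method corrects; bin count and series length are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
series = rng.standard_normal(500)       # a "short" series of a few hundred points

# Partition the range into boxes, estimate box probabilities p_i,
# then form the order-q probability moment M(q) = sum_i p_i**q.
counts, _ = np.histogram(series, bins=20)
p = counts / counts.sum()
p = p[p > 0]                            # only occupied boxes contribute

def moment(q):
    return np.sum(p ** q)

# Sanity checks: q = 1 gives 1 (normalization); q = 0 counts occupied boxes.
assert np.isclose(moment(1.0), 1.0)
assert np.isclose(moment(0.0), len(p))
```

For short series the naive `p = counts / counts.sum()` estimate makes `moment(q)` biased for non-integer q, which is the problem the factorial-moment approach addresses.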

  6. Application of short-time activation analysis in the sciences

    International Nuclear Information System (INIS)

    Grass, F.

    1991-01-01

    Short-time activation analysis has proved to be a valuable tool in nearly all fields of science. To take full advantage of this technique, it is favorable to use a fast transfer system, a high-resolution, high-rate gamma-spectroscopy system for short-lived gamma-emitters, and a Cherenkov detector for the determination of hard beta-emitters. It is then possible to utilize the sub-minute nuclides Li-8 (740 ms), B-12 (20 ms), F-20 (11.1 s), Y-89m (16 s), and Pb-207m (800 ms) for the determination of these elements. Besides these sub-minute nuclides, which constitute the only possibility for neutron activation analysis of these elements, there are a number of other elements which form longer-lived nuclides on short irradiation. The analysis of the halogenides F, Cl, Br, and I in waste water of a sewage incineration plant can be achieved with a single 20 s irradiation and two consecutive measurements of 20 and 600 s using Cl-38m, F-20, and Br-79m as well as the longer-lived Cl-38, Br-80, and I-128.

  7. The case of escape probability as linear in short time

    Science.gov (United States)

    Marchewka, A.; Schuss, Z.

    2018-02-01

    We derive rigorously the short-time escape probability of a quantum particle from its compactly supported initial state, which has a discontinuous derivative at the boundary of the support. We show that this probability is linear in time, which seems to be a new result. The novelty of our calculation is the inclusion of the boundary layer of the propagated wave function formed outside the initial support. This result has applications to the decay law of the particle, to the Zeno behaviour, quantum absorption, time of arrival, quantum measurements, and more.

  8. Real-Time Thevenin Impedance Computation

    DEFF Research Database (Denmark)

    Sommer, Stefan Horst; Jóhannsson, Hjörtur

    2013-01-01

    operating state, and strict time constraints are difficult to adhere to as the complexity of the grid increases. Several suggested approaches for real-time stability assessment require Thevenin impedances to be determined for the observed system conditions. By combining matrix factorization, graph reduction......, and parallelization, we develop an algorithm for computing Thevenin impedances an order of magnitude faster than previous approaches. We test the factor-and-solve algorithm with data from several power grids of varying complexity, and we show how the algorithm allows real-time stability assessment of complex power...
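The core linear-algebra step can be stated simply: given the nodal admittance matrix Y, the Thevenin impedance seen from bus k is the k-th diagonal entry of Y⁻¹, so a single factorization can serve every bus. A minimal NumPy sketch (the 4-bus admittance matrix is invented for illustration; the paper's factor-and-solve algorithm adds graph reduction and parallelization on top of this basic idea):

```python
import numpy as np

# Hypothetical 4-bus nodal admittance matrix (values invented,
# symmetric and diagonally dominant so it is well conditioned).
Y = np.array([[ 3.0, -1.0, -1.0,  0.0],
              [-1.0,  3.0,  0.0, -1.0],
              [-1.0,  0.0,  3.0, -1.0],
              [ 0.0, -1.0, -1.0,  3.0]])

# Solve Y Z = I once; the diagonal of Z = Y^-1 holds the Thevenin
# impedance seen from each bus (relative to the reference node).
n = Y.shape[0]
Z = np.linalg.solve(Y, np.eye(n))
z_thevenin = np.diag(Z)
```

In a production setting the factorization would be computed once and reused as measurements stream in, which is where the order-of-magnitude speedup over repeated full solves comes from.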

  9. COMPUTER VISION SYNDROME: A SHORT REVIEW.

    OpenAIRE

    Sameena; Mohd Inayatullah

    2012-01-01

    Computers are probably one of the biggest scientific inventions of the modern era, and since then they have become an integral part of our life. The increased usage of computers has led to a variety of ocular symptoms which includes eye strain, tired eyes, irritation, redness, blurred vision, and diplopia, collectively referred to as Computer Vision Syndrome (CVS). CVS may have a significant impact not only on visual comfort but also occupational productivit...

  10. Spying on real-time computers to improve performance

    International Nuclear Information System (INIS)

    Taff, L.M.

    1975-01-01

    The sampled program-counter histogram, an established technique for shortening the execution times of programs, is described for a real-time computer. The use of a real-time clock allows particularly easy implementation. (Auth.)
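The technique amounts to interrupting the program at clock ticks, recording the program counter, and histogramming the samples; the addresses that accumulate the most samples are where execution time is going. A toy sketch with invented addresses:

```python
from collections import Counter

# Hypothetical program-counter samples collected by a periodic
# real-time-clock interrupt (addresses are made up for illustration).
samples = [0x40, 0x40, 0x48, 0x40, 0x90, 0x48, 0x40, 0x40]

histogram = Counter(samples)
hottest, hits = histogram.most_common(1)[0]

# Most samples fell at 0x40: that code is the optimization target.
assert hottest == 0x40
```

The real-time clock mentioned in the abstract is what makes the sampling interrupt cheap to implement; the analysis side is exactly this histogram.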

  11. Chemistry, physics and time: the computer modelling of glassmaking.

    Science.gov (United States)

    Martlew, David

    2003-01-01

    A decade or so ago the remains of an early flat glass furnace were discovered in St Helens. Continuous glass production only became feasible after the Siemens Brothers demonstrated their continuous tank furnace at Dresden in 1870. One manufacturer of flat glass enthusiastically adopted the new technology and secretly explored many variations on this theme during the next fifteen years. Study of the surviving furnace remains using today's computer simulation techniques showed how, in 1887, that technology was adapted to the special demands of window glass making. Heterogeneous chemical reactions at high temperatures are required to convert the mixture of granular raw materials into the homogeneous glass needed for windows. Kinetics (and therefore the economics) of glassmaking is dominated by heat transfer and chemical diffusion as refractory grains are converted to highly viscous molten glass. Removal of gas bubbles in a sufficiently short period of time is vital for profitability, but the glassmaker must achieve this in a reaction vessel which is itself being dissolved by the molten glass. Design and operational studies of today's continuous tank furnaces need to take account of these factors, and good use is made of computer simulation techniques to shed light on the way furnaces behave and how improvements may be made. This paper seeks to show how those same techniques can be used to understand how the early Siemens continuous tank furnaces were designed and operated, and how the Victorian entrepreneurs succeeded in managing the thorny problems of what was, in effect, a vulnerable high temperature continuous chemical reactor.

  12. A Distributed Computing Network for Real-Time Systems.

    Science.gov (United States)

    1980-11-03

    Naval Underwater Systems Center, Newport, RI. Technical Document TD 5932: A Distributed Computing Network for Real-Time Systems.

  13. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
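The synaptic-integration dynamics described above can be sketched in a few lines: an input spike steps up a conductance that then decays with a synaptic time constant, so charge enters the membrane gradually rather than as an instantaneous jump. A minimal time-stepped sketch (all parameter values are illustrative assumptions, not the platform's):

```python
# Minimal sketch of input-driven conductance with a synaptic time constant.
dt = 0.1e-3                        # 0.1 ms time step
tau_syn, tau_m = 5e-3, 10e-3       # synaptic and membrane time constants
n_steps, spike_step = 500, 100     # 50 ms simulation, input spike at 10 ms
g, v = 0.0, 0.0
trace = []
for step in range(n_steps):
    if step == spike_step:
        g += 1.0                   # input spike: conductance jumps...
    g -= dt * g / tau_syn          # ...then decays exponentially
    v += dt * (-v / tau_m + g)     # charge is injected gradually
    trace.append(v)

peak = trace.index(max(trace))     # membrane potential peaks after the spike
```

It is exactly this per-step update, expensive for event-driven software, that the platform pipelines in hardware and replicates across processing units to compute multiple neurons in parallel.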

  14. 43 CFR 45.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false How are time periods computed? 45.3... IN FERC HYDROPOWER LICENSES General Provisions § 45.3 How are time periods computed? (a) General... run is not included. (2) The last day of the period is included. (i) If that day is a Saturday, Sunday...
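The counting rule quoted above is easy to mis-apply; a small sketch of the stated parts (the trigger day is excluded, the last day is included), with a weekend roll-forward that is an assumption here because the quoted text is cut off mid-sentence:

```python
import datetime

def end_of_period(trigger_date, days):
    """End of a time period under the quoted rule: the day the period
    begins to run is not counted, and the last day is included. Rolling
    a weekend end-date forward to Monday is an assumption, since the
    quoted text breaks off at 'Saturday, Sunday...'."""
    end = trigger_date + datetime.timedelta(days=days)
    while end.weekday() >= 5:              # 5 = Saturday, 6 = Sunday
        end += datetime.timedelta(days=1)
    return end

# A 10-day period triggered on Wed 2010-06-02 nominally ends Sat 2010-06-12,
# which rolls forward to Mon 2010-06-14 under the weekend assumption.
end = end_of_period(datetime.date(2010, 6, 2), 10)
```

Note that excluding the trigger day means a 1-day period triggered on a Wednesday ends on Thursday, not Wednesday.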

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  16. Short Paper and Poster Proceedings of the 22nd Annual Conference on Computer Animation and Social Agents

    NARCIS (Netherlands)

    Nijholt, Antinus; Egges, A.; van Welbergen, H.; Hondorp, G.H.W.

    2009-01-01

    These are the proceedings containing the short and poster papers of CASA 2009, the twenty second international conference on Computer Animation and Social Agents. CASA 2009 was organized in Amsterdam, the Netherlands from the 17th to the 19th of June 2009. CASA is organized under the auspices of the

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    Energy Technology Data Exchange (ETDEWEB)

    Bigu, J [Department of Energy, Mines and Resources, Elliot Lake, Ontario (Canada). Elliot Lake Lab.; Raz, R; Golden, K; Dominguez, P [Alpha-NUCLEAR, Toronto, Ontario (Canada)

    1984-08-15

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha-detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data recording functions, and computations, program storage and display functions. Programs and data are stored in the built-in cassette tape drive, and the computer-integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentation of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new ones, through BASIC language programming, or employ the computer in a totally unrelated, general-purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software.

  19. Design and development of a computer-based continuous monitor for the determination of the short-lived decay products of radon and thoron

    International Nuclear Information System (INIS)

    Bigu, J.

    1984-01-01

    A portable, rugged monitor has been designed and built for measuring the short-lived decay products of radon and thoron. The monitor is computer-based and employs a continuous filter strip which can be advanced at programmable time intervals to allow unattended continuous operation with automatic sampling, analysis and recording of radiation levels. Radionuclide analysis is carried out by two silicon diffused-junction alpha-detectors and electronic circuitry with multichannel spectral analysis capabilities. Standard gross α-count methods and α-spectroscopy methods can easily be implemented. The built-in computer performs a variety of operations via a specially designed interface module, including control and data recording functions, and computations, program storage and display functions. Programs and data are stored in the built-in cassette tape drive, and the computer-integrated CRT display and keyboard allow simple, prompted menu-type operation of standard software. Graphical presentation of α-spectra can be shown on the computer CRT and printed when required on the computer's built-in thermal printer. In addition to implementing the specially developed radionuclide analysis software, the operator can interact with and modify existing software, and program new ones, through BASIC language programming, or employ the computer in a totally unrelated, general-purpose mode. Although the monitor is ideally suited for environmental radon (thoron) daughter monitoring, it could also be used in the determination of other airborne radionuclides provided adequate analytical procedures are developed or included in the already existing computer software. (orig.)

  20. Relativistic Photoionization Computations with the Time Dependent Dirac Equation

    Science.gov (United States)

    2016-10-12

    Naval Research Laboratory, Washington, DC 20375-5320. NRL/MR/6795--16-9698: Relativistic Photoionization Computations with the Time Dependent Dirac Equation. Daniel F. Gordon and Bahman Hafizi, Naval Research Laboratory, 4555 Overlook Avenue, SW. Keywords: tunneling photoionization; ionization of inner-shell electrons by laser.

  1. The Computer Revolution and Physical Chemistry.

    Science.gov (United States)

    O'Brien, James F.

    1989-01-01

    Describes laboratory-oriented software programs that are short and time-saving, eliminate computational errors, and are not found in public-domain courseware. Program availability for IBM and Apple microcomputers is included. (RT)

  2. The role of short-time intensity and envelope power for speech intelligibility and psychoacoustic masking.

    Science.gov (United States)

    Biberger, Thomas; Ewert, Stephan D

    2017-08-01

    The generalized power spectrum model [GPSM; Biberger and Ewert (2016). J. Acoust. Soc. Am. 140, 1023-1038], combining the "classical" concept of the power-spectrum model (PSM) and the envelope power spectrum-model (EPSM), was demonstrated to account for several psychoacoustic and speech intelligibility (SI) experiments. The PSM path of the model uses long-time power signal-to-noise ratios (SNRs), while the EPSM path uses short-time envelope power SNRs. A systematic comparison of existing SI models for several spectro-temporal manipulations of speech maskers and gender combinations of target and masker speakers [Schubotz et al. (2016). J. Acoust. Soc. Am. 140, 524-540] showed the importance of short-time power features. Conversely, Jørgensen et al. [(2013). J. Acoust. Soc. Am. 134, 436-446] demonstrated a higher predictive power of short-time envelope power SNRs than power SNRs using reverberation and spectral subtraction. Here the GPSM was extended to utilize short-time power SNRs and was shown to account for all psychoacoustic and SI data of the three mentioned studies. The best processing strategy was to exclusively use either power or envelope-power SNRs, depending on the experimental task. By analyzing both domains, the suggested model might provide a useful tool for clarifying the contribution of amplitude modulation masking and energetic masking.
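The distinction between the two SNR domains can be made concrete: a long-time power SNR collapses the whole signal to one number, while short-time SNRs are computed per frame and expose the dips of a fluctuating masker. A minimal power-domain sketch (the signal construction, 20-ms frame length, and sampling rate are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 8000                                  # sampling rate (Hz)
t = np.arange(fs) / fs                     # 1 s of signal
# A crude fluctuating masker: noise with 4-Hz amplitude modulation.
target = rng.standard_normal(fs)
masker = (1 + np.sin(2 * np.pi * 4 * t)) * rng.standard_normal(fs)

# Long-time power SNR (the PSM path): one number for the whole second.
snr_long = 10 * np.log10(np.mean(target**2) / np.mean(masker**2))

# Short-time power SNRs: one value per 20-ms frame; frames falling in
# the masker's modulation dips show a far higher SNR than snr_long.
frame = int(0.02 * fs)                     # 160 samples per frame
ft = target[: (fs // frame) * frame].reshape(-1, frame)
fm = masker[: (fs // frame) * frame].reshape(-1, frame)
snr_short = 10 * np.log10(np.mean(ft**2, axis=1) / np.mean(fm**2, axis=1))
```

The GPSM's envelope-power path applies the same short-time idea to envelope power after modulation filtering; the sketch above shows only the power-domain half of the model.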

  3. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks, which can be trained to perform specific tasks to mainly solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to set a complete, scalable compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We show usability in solving real use cases from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.

  4. Computer-controlled neutron time-of-flight spectrometer. Part II

    International Nuclear Information System (INIS)

    Merriman, S.H.

    1979-12-01

    A time-of-flight spectrometer for neutron inelastic scattering research has been interfaced to a PDP-15/30 computer. The computer is used for experimental data acquisition and analysis and for apparatus control. This report was prepared to summarize the functions of the computer and to act as a users' guide to the software system

  5. Onboard Short Term Plan Viewer

    Science.gov (United States)

    Hall, Tim; LeBlanc, Troy; Ulman, Brian; McDonald, Aaron; Gramm, Paul; Chang, Li-Min; Keerthi, Suman; Kivlovitz, Dov; Hadlock, Jason

    2011-01-01

    Onboard Short Term Plan Viewer (OSTPV) is a computer program for electronic display of mission plans and timelines, both aboard the International Space Station (ISS) and in ISS ground control stations located in several countries. OSTPV was specifically designed both (1) for use within the limited ISS computing environment and (2) for compatibility with computers used in ground control stations. OSTPV supplants a prior system in which, aboard the ISS, timelines were printed on paper and incorporated into files that also contained other paper documents. Hence, the introduction of OSTPV has both reduced the consumption of resources and saved time in updating plans and timelines. OSTPV accepts, as input, the mission timeline output of a legacy, print-oriented, UNIX-based program called "Consolidated Planning System" and converts the timeline information for display in an interactive, dynamic, Windows Web-based graphical user interface that is used by both the ISS crew and ground control teams in real time. OSTPV enables the ISS crew to electronically indicate execution of timeline steps, launch electronic procedures, and efficiently report to ground control teams on the statuses of ISS activities, all by use of laptop computers aboard the ISS.

  6. Modular High Voltage Pulse Converter for Short Rise and Decay Times

    NARCIS (Netherlands)

    Mao, S.

    2018-01-01

    This thesis explores a modular HV pulse converter technology with short rise and decay times. A systematic methodology to derive and classify HV architectures based on a modularization level of power building blocks of the HV pulse converter is developed to summarize existing architectures and

  7. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  8. Heterogeneous real-time computing in radio astronomy

    Science.gov (United States)

    Ford, John M.; Demorest, Paul; Ransom, Scott

    2010-07-01

    Modern computer architectures suited for general purpose computing are often not the best choice for either I/O-bound or compute-bound problems. Sometimes the best choice is not to choose a single architecture, but to take advantage of the best characteristics of different computer architectures to solve your problems. This paper examines the tradeoffs between using computer systems based on the ubiquitous X86 Central Processing Units (CPU's), Field Programmable Gate Array (FPGA) based signal processors, and Graphical Processing Units (GPU's). We will show how a heterogeneous system can be produced that blends the best of each of these technologies into a real-time signal processing system. FPGA's tightly coupled to analog-to-digital converters connect the instrument to the telescope and supply the first level of computing to the system. These FPGA's are coupled to other FPGA's to continue to provide highly efficient processing power. Data is then packaged up and shipped over fast networks to a cluster of general purpose computers equipped with GPU's, which are used for floating-point intensive computation. Finally, the data is handled by the CPU and written to disk, or further processed. Each of the elements in the system has been chosen for its specific characteristics and the role it can play in creating a system that does the most for the least, in terms of power, space, and money.

  9. Ubiquitous computing technology for just-in-time motivation of behavior change.

    Science.gov (United States)

    Intille, Stephen S

    2004-01-01

    This paper describes a vision of health care where "just-in-time" user interfaces are used to transform people from passive to active consumers of health care. Systems that use computational pattern recognition to detect points of decision, behavior, or consequences automatically can present motivational messages to encourage healthy behavior at just the right time. Further, new ubiquitous computing and mobile computing devices permit information to be conveyed to users at just the right place. In combination, computer systems that present messages at the right time and place can be developed to motivate physical activity and healthy eating. Computational sensing technologies can also be used to measure the impact of the motivational technology on behavior.

  10. Long, partial-short, and special conformal fields

    Energy Technology Data Exchange (ETDEWEB)

    Metsaev, R.R. [Department of Theoretical Physics, P.N. Lebedev Physical Institute,Leninsky prospect 53, Moscow 119991 (Russian Federation)

    2016-05-17

    In the framework of the metric-like approach, totally symmetric arbitrary-spin bosonic conformal fields propagating in flat space-time are studied. Depending on the values of the conformal dimension, spin, and dimension of space-time, we classify all conformal fields as long, partial-short, short, and special conformal fields. An ordinary-derivative (second-derivative) Lagrangian formulation for such conformal fields is obtained. The ordinary-derivative Lagrangian formulation is realized by using double-traceless gauge fields, Stueckelberg fields, and auxiliary fields. A gauge-fixed Lagrangian invariant under global BRST transformations is obtained. The gauge-fixed BRST Lagrangian is used for the computation of partition functions for all conformal fields. Using the results for the partition functions, the numbers of propagating degrees of freedom for the conformal fields are also found.

  11. Eulerian short-time statistics of turbulent flow at large Reynolds number

    NARCIS (Netherlands)

    Brouwers, J.J.H.

    2004-01-01

    An asymptotic analysis is presented of the short-time behavior of second-order temporal velocity structure functions and Eulerian acceleration correlations in a frame that moves with the local mean velocity of the turbulent flow field. Expressions in closed-form are derived which cover the viscous

  12. Rotor-System Log-Decrement Identification Using Short-Time Fourier-Transform Filter

    OpenAIRE

    Li, Qihang; Wang, Weimin; Chen, Lifang; Sun, Dan

    2015-01-01

    With the increase of the centrifugal compressor capability, such as large scale LNG and CO2 reinjection, the stability margin evaluation is crucial to assure the compressor work in the designed operating conditions in field. Improving the precision of parameter identification of stability is essential and necessary as well. Based on the time-varying characteristics of response vibration during the sine-swept process, a short-time Fourier transform (STFT) filter was introduced to increase the ...

  13. Continuous-Time Symmetric Hopfield Nets are Computationally Universal

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Roč. 15, č. 3 (2003), s. 693-733 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords : continuous-time Hopfield network * Liapunov function * analog computation * computational power * Turing universality Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  14. Multiscale Space-Time Computational Methods for Fluid-Structure Interactions

    Science.gov (United States)

    2015-09-13

    ST-SI computational analysis of a vertical-axis wind turbine; thermo-fluid analysis of a ground vehicle and its tires; multiscale compressible-flow computation with particle tracking; space-time VMS computation of wind-turbine rotor and tower aerodynamics (Tezduyar, Spenser McIntyre, Nikolay Kostov, Ryan Kolesar, Casey Habluetzel)

  15. Using Computer Techniques To Predict OPEC Oil Prices For Period 2000 To 2015 By Time-Series Methods

    Directory of Open Access Journals (Sweden)

    Mohammad Esmail Ahmad

    2015-08-01

    Full Text Available The instability in world and OPEC oil prices results from many factors acting over a long time. The problem can be summarized as follows: oil exports not only constitute a large share of national income (N.I.), but also make up most of the savings of the oil states. Oil prices affect these markets through the interaction of the supply and demand forces for oil. The research hypothesis states that movements in oil prices have caused shocks, crises, and economic problems. Because these shocks arise from changes in oil prices, predictions need to be made within the framework of short-run economic planning, using computer techniques based on time-series models, in order to avoid further shocks.

  16. Dimension reduction of frequency-based direct Granger causality measures on short time series.

    Science.gov (United States)

    Siggiridou, Elsa; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris

    2017-09-01

    The mainstream in the estimation of effective brain connectivity relies on Granger causality measures in the frequency domain. If the measure is meant to capture direct causal effects, accounting for the presence of other observed variables, as in multi-channel electroencephalograms (EEG), typically the fit of a vector autoregressive (VAR) model on the multivariate time series is required. For short time series of many variables, the estimation of the VAR may not be stable, requiring dimension reduction that results in restricted or sparse VAR models. The restricted VAR obtained by the modified backward-in-time selection method (mBTS) is adapted to the generalized partial directed coherence (GPDC), termed restricted GPDC (RGPDC). Dimension reduction on other frequency-based measures, such as the direct directed transfer function (dDTF), is straightforward. First, a simulation study using linear stochastic multivariate systems is conducted and RGPDC is favorably compared to GPDC on short time series in terms of sensitivity and specificity. Then the two measures are tested for their ability to detect changes in brain connectivity during an epileptiform discharge (ED) from multi-channel scalp EEG. It is shown that RGPDC identifies the connectivity structure of the simulated systems, as well as changes in brain connectivity, better than GPDC, and is less dependent on the free parameter of the VAR order. The proposed dimension reduction in frequency measures based on VAR constitutes an appropriate strategy to estimate brain networks reliably within short time windows. Copyright © 2017 Elsevier B.V. All rights reserved.
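    The record above concerns frequency-domain measures (GPDC, dDTF) built on a fitted VAR model. As a much simpler time-domain analogue of Granger causality (an illustrative sketch, not the paper's RGPDC; all signals and parameters below are assumptions), one can fit restricted and full autoregressions by least squares and compare residual variances:

```python
import numpy as np

def granger_xy(x, y, p=2):
    """Log-ratio of residual sums of squares for predicting x with vs.
    without lagged values of y; values well above zero suggest that y
    Granger-causes x. Time-domain analogue only, not the paper's RGPDC."""
    n = len(x)
    def lagmat(s):
        # columns: s[t-1], ..., s[t-p], aligned with the target s[p:]
        return np.column_stack([s[p - k:n - k] for k in range(1, p + 1)])
    target = x[p:]
    def rss(A):
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        resid = target - A @ coef
        return resid @ resid
    restricted = rss(lagmat(x))                    # own lags only
    full = rss(np.hstack([lagmat(x), lagmat(y)]))  # own lags + lags of y
    return np.log(restricted / full)

# Toy system in which y drives x but not vice versa.
rng = np.random.default_rng(0)
n = 2000
y = np.zeros(n)
x = np.zeros(n)
e1 = rng.standard_normal(n)
e2 = rng.standard_normal(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + e1[t]
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + e2[t]

g_forward = granger_xy(x, y)   # y -> x: clearly positive
g_reverse = granger_xy(y, x)   # x -> y: near zero
```

Dimension reduction as in the paper would correspond to dropping lag columns from the full regressor matrix before the comparison.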

  17. Quantifying complexity of financial short-term time series by composite multiscale entropy measure

    Science.gov (United States)

    Niu, Hongli; Wang, Jun

    2015-05-01

    It is significant to study the complexity of financial time series, since the financial market is a complex, evolving dynamic system. Multiscale entropy is a prevailing method used to quantify the complexity of a time series. Because its entropy estimates are less reliable for short-term time series at large time scales, a modified method, the composite multiscale entropy (CMSE), is applied to the financial market. To verify its effectiveness, its applications to synthetic white noise and 1/f noise with different data lengths are reproduced first in the present paper. Then it is introduced for the first time for a reliability test with two Chinese stock indices. When applied to short-time return series, the CMSE method shows advantages in reducing deviations of the entropy estimation and yields more stable and reliable results than the conventional MSE algorithm. Finally, the composite multiscale entropy of six important stock indices from the world financial markets is investigated, and some useful and interesting empirical results are obtained.
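    The coarse-graining idea behind CMSE can be sketched as follows (a minimal illustration, not the authors' exact implementation; the embedding dimension, tolerance, and scales are assumed defaults). At each scale, sample entropy is averaged over all possible coarse-graining offsets, which is what distinguishes CMSE from plain MSE:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D series; the tolerance is r times std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)
    def matches(length):
        # count ordered template pairs within tolerance (Chebyshev distance)
        templ = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(templ) - 1):
            dist = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            total += np.sum(dist < tol)
        return total
    b, a = matches(m), matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

def composite_mse(x, scales=(1, 2, 3), m=2, r=0.2):
    """At each scale tau, average sample entropy over the tau possible
    coarse-graining offsets (the 'composite' part of CMSE)."""
    x = np.asarray(x, dtype=float)
    result = []
    for tau in scales:
        vals = []
        for offset in range(tau):
            y = x[offset:]
            nwin = len(y) // tau
            # non-overlapping window means of the shifted series
            coarse = y[:nwin * tau].reshape(nwin, tau).mean(axis=1)
            vals.append(sample_entropy(coarse, m, r))
        result.append(np.mean(vals))
    return result

cmse = composite_mse(np.random.default_rng(0).standard_normal(1000))
```

Averaging over offsets reduces the variance of the entropy estimate at large scales, which is exactly the deviation-reduction effect the abstract reports for short series.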

  18. Short-term memory loss over time without retroactive stimulus interference.

    Science.gov (United States)

    Cowan, Nelson; AuBuchon, Angela M

    2008-02-01

    A key question in cognitive psychology is whether information in short-term memory is lost as a function of time. Lewandowsky, Duncan, and Brown (2004) argued against such memory loss because forgetting in serial recall occurred to the same extent across serial positions regardless of the rate of recall. However, we believe Lewandowsky et al. (2004) only prevented one of two types of rehearsal; they did not prevent nonarticulatory rehearsal via attention. To prevent articulatory and nonarticulatory rehearsal without introducing interference, we presented unevenly timed stimuli for serial recall and, on some trials, required that the timing of stimuli be reproduced in the response. In those trials only, evidence of memory loss over time emerged. Further research is needed to identify whether this memory loss is decay or lost distinctiveness.

  19. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  20. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  1. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

    Many computer networks have been studied, with different trends regarding network architecture and the various protocols that govern data transfers and guarantee reliable communication among all nodes. A hierarchical network structure is proposed here to provide a simple and inexpensive way to realize a reliable real-time computer network. In this architecture, all computers in the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of the network can be considered a local area computer network (LACN) suitable for a nuclear power plant control system, since such a plant has geographically dispersed subsystems. Network expansion would be straightforward: the common channel is tapped for each added computer (HOST). All the nodes are designed around a microprocessor chip to provide the required intelligence. Each node can be divided into two sections, namely a common section that interfaces with the serial data channel and a private section that interfaces with the host computer. The latter naturally tends to have some variations in hardware details to match the requirements of individual host computers. 7 figs

  2. Thermonuclear Bursts with Short Recurrence Times from Neutron Stars Explained by Opacity-driven Convection

    Energy Technology Data Exchange (ETDEWEB)

    Keek, L. [X-ray Astrophysics Laboratory, Astrophysics Science Division, NASA/GSFC, Greenbelt, MD 20771 (United States); Heger, A., E-mail: laurens.keek@nasa.gov [Monash Center for Astrophysics, School of Physics and Astronomy, Monash University, Victoria, 3800 (Australia)

    2017-06-20

    Thermonuclear flashes of hydrogen and helium accreted onto neutron stars produce the frequently observed Type I X-ray bursts. It is the current paradigm that almost all material burns in a burst, after which it takes hours to accumulate fresh fuel for the next burst. In rare cases, however, bursts are observed with recurrence times as short as minutes. We present the first one-dimensional multi-zone simulations that reproduce this phenomenon. Bursts that ignite in a relatively hot neutron star envelope leave a substantial fraction of the fuel unburned at shallow depths. In the wake of the burst, convective mixing events driven by opacity bring this fuel down to the ignition depth on the observed timescale of minutes. There, unburned hydrogen mixes with the metal-rich ashes, igniting to produce a subsequent burst. We find burst pairs and triplets, similar to the observed instances. Our simulations reproduce the observed fraction of bursts with short waiting times of ∼30%, and demonstrate that short recurrence time bursts are typically less bright and of shorter duration.

  3. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty-five years ago Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in a fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  4. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and a spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on the source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
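    For contrast with the synchrony-based model above, the classical baseline it moves beyond, estimating a simple interaural time difference (ITD) by cross-correlating the two monaural signals, can be sketched as follows (the signals and delay here are synthetic assumptions, not data from the paper):

```python
import numpy as np

# Synthetic two-"ear" signals: the right channel is the left channel
# delayed by a known number of samples.
rng = np.random.default_rng(0)
n = 4096
delay = 20                                # samples (~0.45 ms at 44.1 kHz)
left = rng.standard_normal(n)
right = np.concatenate([np.zeros(delay), left])[:n]

# Cross-correlate and locate the peak: its lag is the ITD estimate.
corr = np.correlate(right, left, mode="full")
lag = int(np.argmax(corr)) - (n - 1)      # positive lag: right lags left
```

Unlike the spiking model in the abstract, this estimate degrades when the two signals are filtered differently by the head, which is precisely the regime the location-specific synchrony patterns address.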

  5. Real-time Tsunami Inundation Prediction Using High Performance Computers

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2014-12-01

    Recently, off-shore tsunami observation stations based on cabled ocean-bottom pressure gauges are actively being deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines, for disaster mitigation purposes. To realize the full benefit of these observations, real-time analysis techniques that make effective use of these data are necessary. A representative study was made by Tsushima et al. (2009), who proposed a method to provide instant tsunami source prediction based on acquired tsunami waveform data. As time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green functions of the linear long-wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation, improving the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although the computational burden of solving the non-linear shallow-water equations for inundation predictions is large, it has become tractable through recent developments in high-performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite-difference method with nested staggered grids whose resolution ranges from 405 m to 5 m. The resolution ratio of each nested domain was 1/3. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction up to 2 hours after the
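    The nested-grid non-linear solver described above is far beyond a short sketch, but the basic leap-frog staggered-grid update for the linear long-wave equations can be illustrated in one dimension. Uniform depth, closed boundaries, and all parameter values below are simplifying assumptions of this sketch, not the study's configuration:

```python
import numpy as np

# 1-D linear long-wave (shallow-water) equations on a staggered grid,
# advanced with a leap-frog-style update.
g = 9.81                 # gravity, m/s^2
depth = 100.0            # water depth, m (assumed uniform)
dx = 405.0               # grid spacing, m (the coarsest grid in the abstract)
nx = 200
dt = 0.5 * dx / np.sqrt(g * depth)   # respects the CFL stability limit

xgrid = (np.arange(nx) - nx / 2) * dx
eta = np.exp(-(xgrid / 5e3) ** 2)    # initial sea-surface hump
u = np.zeros(nx + 1)                 # velocities staggered between eta points
mass0 = eta.sum()                    # total displaced volume (per unit area)

for _ in range(200):
    # momentum update at interior staggered points (closed ends: u = 0)
    u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
    # continuity update using the freshly updated velocities
    eta -= depth * dt / dx * (u[1:] - u[:-1])
```

With closed boundaries the flux divergence telescopes to zero, so the scheme conserves total mass to rounding error; the production solver adds non-linear advection, bottom friction, and wetting/drying for inundation.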

  6. Short-time existence of solutions for mean-field games with congestion

    KAUST Repository

    Gomes, Diogo A.

    2015-11-20

    We consider time-dependent mean-field games with congestion that are given by a Hamilton–Jacobi equation coupled with a Fokker–Planck equation. These models are motivated by crowd dynamics in which agents have difficulty moving in high-density areas. The congestion effects make the Hamilton–Jacobi equation singular. The uniqueness of solutions for this problem is well understood; however, the existence of classical solutions was only known in very special cases, stationary problems with quadratic Hamiltonians and some time-dependent explicit examples. Here, we demonstrate the short-time existence of C∞ solutions for sub-quadratic Hamiltonians.

  7. Time-frequency analysis of fusion plasma signals beyond the short-time Fourier transform paradigm: An overview

    International Nuclear Information System (INIS)

    Bizarro, Joao P.S.; Figueiredo, Antonio C.A.

    2008-01-01

    Performing a time-frequency (t-f) analysis on actual magnetic pick-up coil data from the JET tokamak, a comparison is presented between the spectrogram and the Wigner and Choi-Williams distributions. Whereas the former, which stems from the short-time Fourier transform and has been the workhorse for t-f signal processing, implies an unavoidable trade-off between time and frequency resolutions, the latter two belong to a later generation of distributions that yield better, if not optimal, joint t-f localization. Topics addressed include signal representation in the t-f plane, frequency identification and evolution, instantaneous-frequency estimation, and amplitude tracking
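    The spectrogram and its resolution trade-off can be sketched in a few lines (an illustrative sketch; the window length, hop size, and test signal are assumptions, not JET data). A longer window sharpens frequency resolution at the cost of time resolution, and vice versa:

```python
import numpy as np

def spectrogram(x, win_len=128, hop=32):
    """Squared-magnitude short-time Fourier transform of a 1-D signal.
    Rows index time frames; columns index frequency bins."""
    w = np.hanning(win_len)
    frames = np.array([x[i:i + win_len] * w
                       for i in range(0, len(x) - win_len + 1, hop)])
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

# A chirp whose instantaneous frequency rises with time.
fs = 4096
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * (50 + 200 * t) * t)
S = spectrogram(x)
```

The Wigner and Choi-Williams distributions discussed above avoid the fixed-window trade-off but introduce cross-terms that the spectrogram does not have.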

  8. 5 CFR 831.703 - Computation of annuities for part-time service.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Computation of annuities for part-time... part-time service. (a) Purpose. The computational method in this section shall be used to determine the annuity for an employee who has part-time service on or after April 7, 1986. (b) Definitions. In this...

  9. Real-time data acquisition and feedback control using Linux Intel computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Piglowski, D.A.; Johnson, R.D.; Walker, M.L.

    2006-01-01

    This paper describes the experiences of the DIII-D programming staff in adapting Linux based Intel computing hardware for use in real-time data acquisition and feedback control systems. Due to the highly dynamic and unstable nature of magnetically confined plasmas in tokamak fusion experiments, real-time data acquisition and feedback control systems are in routine use with all major tokamaks. At DIII-D, plasmas are created and sustained using a real-time application known as the digital plasma control system (PCS). During each experiment, the PCS periodically samples data from hundreds of diagnostic signals and provides these data to control algorithms implemented in software. These algorithms compute the necessary commands to send to various actuators that affect plasma performance. The PCS consists of a group of rack mounted Intel Xeon computer systems running an in-house customized version of the Linux operating system tailored specifically to meet the real-time performance needs of the plasma experiments. This paper provides a more detailed description of the real-time computing hardware and custom developed software, including recent work to utilize dual Intel Xeon equipped computers within the PCS

  10. Computing Wigner distributions and time correlation functions using the quantum thermal bath method: application to proton transfer spectroscopy.

    Science.gov (United States)

    Basire, Marie; Borgis, Daniel; Vuilleumier, Rodolphe

    2013-08-14

    Langevin dynamics coupled to a quantum thermal bath (QTB) allows for the inclusion of vibrational quantum effects in molecular dynamics simulations at virtually no additional computer cost. We investigate here the ability of the QTB method to reproduce the quantum Wigner distribution of a variety of model potentials, designed to assess the performances and limits of the method. We further compute the infrared spectrum of a multidimensional model of proton transfer in the gas phase and in solution, using classical trajectories sampled initially from the Wigner distribution. It is shown that for this type of system involving large anharmonicities and strong nonlinear coupling to the environment, the quantum thermal bath is able to sample the Wigner distribution satisfactorily and to account for both zero point energy and tunneling effects. It leads to quantum time correlation functions having the correct short-time behavior, and the correct associated spectral frequencies, but that are slightly too overdamped. This is attributed to the classical propagation approximation rather than the generation of the quantized initial conditions themselves.

  11. [Clinical characteristics of short tear film breakup time (BUT) -type dry eye].

    Science.gov (United States)

    Yamamoto, Yuji; Yokoi, Norihiko; Higashihara, Hisayo; Inagaki, Kayoko; Sonomura, Yukiko; Komuro, Aoi; Kinoshita, Shigeru

    2012-12-01

    To evaluate the clinical characteristics and management of short tear film breakup time (BUT) -type dry eye. Clinical background and post-treatment changes of symptoms in 77 patients with short BUT -type dry eye were investigated. Treatment consisted of artificial-tear eye-drop instillation and, if necessary, the addition of a low-density-level steroid, hyaluronic acid, a low-density-level cyclopentolate prepared by ourselves and punctal plugs inserted into the upper and lower lacrimal puncta. There were three times more women than men among the patients, and the peak age of occurrence was in the twenties in the men and in the sixties in the women. Our findings show that visual display terminal (VDT) work, contact lens (CL) wear, and changes in the sex hormones may initiate subjective symptoms. Some patients had simultaneous conjunctivochalasis, allergic conjunctivitis, and meibomian gland dysfunction. Nineteen patients (24.7%) were effectively treated with eye-drop instillation alone. Thirty-seven patients (48.1%) required punctal-plug insertion, which was completely effective in only 8 of them (21.6%). Mainly young men and menopausal women contract short BUT -type dry eye. Changes in sex hormones, VDT work and CL wear may be causal, and the disease cannot be controlled by eyedrop and punctal-plug treatment alone.

  12. 7th conference on ultra-wideband, short-pulse electromagnetics

    CERN Document Server

    Schenk, Uwe; Nitsch, Daniel; Sabath, Frank; Ultra-Wideband, Short-Pulse Electromagnetics 7; UWBSP7

    2007-01-01

    Ultra-wideband (UWB), short-pulse (SP) electromagnetics are now being used for an increasingly wide variety of applications, including collision avoidance radar, concealed object detection, and communications. Notable progress in UWB and SP technologies has been achieved by investigations of their theoretical bases and improvements in solid-state manufacturing, computers, and digitizers. UWB radar systems are also being used for mine clearing, oil pipeline inspections, archeology, geology, and electronic effects testing. Ultra-Wideband Short-Pulse Electromagnetics 7 presents selected papers of deep technical content and high scientific quality from the UWB-SP7 Conference, including wide-ranging contributions on electromagnetic theory, scattering, UWB antennas, UWB systems, ground penetrating radar (GPR), UWB communications, pulsed-power generation, time-domain computational electromagnetics, UWB compatibility, target detection and discrimination, propagation through dispersive media, and wavelet and multi-res...

  13. GPU-based cone beam computed tomography.

    Science.gov (United States)

    Noël, Peter B; Walczak, Alan M; Xu, Jinhui; Corso, Jason J; Hoffmann, Kenneth R; Schafer, Sebastian

    2010-06-01

    The use of cone beam computed tomography (CBCT) is growing in the clinical arena due to its ability to provide 3D information during interventions, its high diagnostic quality (sub-millimeter resolution), and its short scanning times (60 s). In many situations, the short scanning time of CBCT is followed by a time-consuming 3D reconstruction. The standard reconstruction algorithm for CBCT data is the filtered backprojection, which for a volume of size 256³ takes up to 25 min on a standard system. Recent developments in the area of Graphics Processing Units (GPUs) make it possible to have access to high-performance computing solutions at a low cost, allowing their use in many scientific problems. We have implemented an algorithm for 3D reconstruction of CBCT data using the Compute Unified Device Architecture (CUDA) provided by NVIDIA (NVIDIA Corporation, Santa Clara, California), which was executed on a NVIDIA GeForce GTX 280. Our implementation results in improved reconstruction times from minutes, and perhaps hours, to a matter of seconds, while also giving the clinician the ability to view 3D volumetric data at higher resolutions. We evaluated our implementation on ten clinical data sets and one phantom data set to observe if differences occur between CPU- and GPU-based reconstructions. By using our approach, the computation time for a 256³ volume is reduced from 25 min on the CPU to 3.2 s on the GPU. The GPU reconstruction time for 512³ volumes is 8.5 s. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  14. EVOLVING TO TYPE Ia SUPERNOVAE WITH SHORT DELAY TIMES

    International Nuclear Information System (INIS)

    Wang Bo; Chen Xuefei; Han Zhanwen; Meng Xiangcun

    2009-01-01

    The single-degenerate model is currently a favorable progenitor model for Type Ia supernovae (SNe Ia). Recent investigations on the white dwarf (WD) + He star channel of the single-degenerate model imply that this channel is noteworthy for producing SNe Ia. In this paper, we studied SN Ia birthrates and delay times of this channel via a detailed binary population synthesis approach. We found that the Galactic SN Ia birthrate from the WD + He star channel is ∼0.3 × 10⁻³ yr⁻¹ according to our standard model, and that this channel can explain SNe Ia with short delay times (∼4.5 × 10⁷-1.4 × 10⁸ yr). Meanwhile, these WD + He star systems may be related to the young supersoft X-ray sources prior to SN Ia explosions.

  15. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history results of an aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated by coupling the estimated unsteady aerodynamic model with the known linear structural model. The critical dynamic pressure is computed and used in the subsequent simulation until it converges. The proposed method is applied to a benchmark cantilevered rectangular wing.
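    The abstract does not specify the identification technique; as a hedged stand-in, the core idea of estimating a discrete-time model from a time history can be sketched with a simple ARX fit by least squares (the model order, coefficients, and signals below are assumptions of this sketch, not the paper's aeroelastic model):

```python
import numpy as np

# Noise-free response of a hypothetical 2nd-order ARX model
#   q_k = a1*q_{k-1} + a2*q_{k-2} + b*u_{k-1}
rng = np.random.default_rng(1)
a1, a2, b = 1.6, -0.8, 0.5           # stable: poles inside the unit circle
n = 500
u = rng.standard_normal(n)           # broadband excitation input
q = np.zeros(n)                      # simulated generalized coordinate
for k in range(2, n):
    q[k] = a1 * q[k - 1] + a2 * q[k - 2] + b * u[k - 1]

# Least-squares identification from the recorded time history:
# regressors are the lagged outputs and input, target is q_k.
A = np.column_stack([q[1:-1], q[:-2], u[1:-1]])
coef, *_ = np.linalg.lstsq(A, q[2:], rcond=None)
```

In the flutter setting, the identified model would replace the unknown unsteady aerodynamics and be coupled to the structural model before recomputing the critical dynamic pressure.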

  16. NMR transmit-receive system with short recovery time and effective isolation

    Science.gov (United States)

    Jurga, K.; Reynhardt, E. C.; Jurga, S.

    A transmit-receive system with a short recovery time and excellent isolation has been developed. The system operates in conjunction with an ENI Model 3200L broadband amplifier and a spin-lock NMR pulse spectrometer. The system has been tested in the frequency range 5.5 to 52 MHz and seems not to generate any background noise.

  17. Determination of rail wear and short-time wear measurements of rails applying radioisotopes

    International Nuclear Information System (INIS)

    Grohmann, H.D.

    1981-01-01

    An energetic model has been developed for calculating rail wear. Short-time wear tests on rails, using surface activation and subsequent activity measurements, showed good agreement with the calculated values

  18. Short-term adaptations as a response to travel time increases: results of a stated adaptation experiment

    NARCIS (Netherlands)

    Psarra, I.; Arentze, T.A.; Timmermans, H.J.P.

    2016-01-01

    This study focused on short-term dynamics of activity-travel behavior as a response to travel time increases. It is assumed that short-term changes are triggered by stress, which is defined as the deviation between an individual’s aspirations and his or her daily experiences. When stress exceeds a

  19. Distributed computing for real-time petroleum reservoir monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Ayodele, O. R. [University of Alberta, Edmonton, AB (Canada)

    2004-05-01

    Computer software architecture is presented to illustrate how the concept of distributed computing can be applied to real-time reservoir monitoring processes, permitting continuous monitoring of the dynamic behaviour of petroleum reservoirs at much shorter intervals. The paper describes the fundamental technologies driving distributed computing, namely the Java 2 Platform, Enterprise Edition (J2EE) by Sun Microsystems and the Microsoft Dot-Net (Microsoft.Net) initiative, and explains the challenges involved in distributed computing. These are: (1) availability of permanently placed downhole equipment to acquire and transmit seismic data; (2) availability of high bandwidth to transmit the data; (3) security considerations; (4) adaptation of existing legacy codes to run on networks as downloads on demand; and (5) credibility issues concerning data security over the Internet. Other applications of distributed computing in the petroleum industry are also considered, specifically MWD, LWD and SWD (measurement-while-drilling, logging-while-drilling, and simulation-while-drilling), and drill-string vibration monitoring. 23 refs., 1 fig.

  20. Short-term wind power forecasting: probabilistic and space-time aspects

    DEFF Research Database (Denmark)

    Tastu, Julija

    This work deals with the proposal and evaluation of new mathematical models and forecasting methods for short-term wind power forecasting, accounting for space-time dynamics based on geographically distributed information. Different forms of power predictions are considered, starting from traditional point ... into the corresponding models are analysed. As a final step, emphasis is placed on generating space-time trajectories: this calls for the prediction of joint multivariate predictive densities describing wind power generation at a number of distributed locations and for a number of successive lead times. In addition ... Optimal integration of wind energy into power systems calls for high-quality wind power predictions. State-of-the-art forecasting systems typically provide forecasts for every location individually, without taking into account information coming from the neighbouring territories. It is however

  1. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Full Text Available Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D tree models because the tree model itself has a very complicated structure and many sophisticated factors need to be considered during the simulation. Though there are some works on simulating 3D trees and their motion, few of them are used in computer games due to the high demand for real-time performance. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees under wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the high requirement for real time in computer games.

  2. Short-Term Change Detection in Wetlands Using Sentinel-1 Time Series

    DEFF Research Database (Denmark)

    Muro, Javier; Canty, Morton; Conradsen, Knut

    2016-01-01

    Automated monitoring systems that can capture wetlands’ high spatial and temporal variability are essential for their management. SAR-based change detection approaches offer a great opportunity to enhance our understanding of complex and dynamic ecosystems. We test a recently-developed time serie...... certain landscape changes are detected only by either the Landsat-based or the S1-omnibus method. The S1-omnibus method shows a great potential for an automated monitoring of short time changes and accurate delineation of areas of high variability and of slow and gradual changes....

  3. Short-Term Memory Loss Over Time Without Retroactive Stimulus Interference

    OpenAIRE

    Cowan, Nelson; AuBuchon, Angela M.

    2008-01-01

    A key question in cognitive psychology is whether information in short-term memory is lost as a function of time. Lewandowsky, Duncan, and Brown (2004) argued against that memory loss because forgetting in serial recall occurred to the same extent across serial positions regardless of the rate of recall. However, we believe Lewandowsky et al. only prevented one of two types of rehearsal; they did not prevent non-articulatory rehearsal via attention. To prevent articulatory and non-articulator...

  4. Computer network time synchronization the network time protocol on earth and in space

    CERN Document Server

    Mills, David L

    2010-01-01

    Carefully coordinated, reliable, and accurate time synchronization is vital to a wide spectrum of fields-from air and ground traffic control, to buying and selling goods and services, to TV network programming. Ill-gotten time could even lead to the unimaginable and cause DNS caches to expire, leaving the entire Internet to implode on the root servers.Written by the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol on Earth and in Space, Second Edition addresses the technological infrastructure of time dissemination, distrib

  5. The reliable solution and computation time of variable parameters Logistic model

    OpenAIRE

    Pengfei, Wang; Xinnong, Pan

    2016-01-01

    The reliable computation time (RCT, denoted Tc) when applying a double-precision computation of a variable-parameters logistic map (VPLM) is studied. First, using the proposed method, the reliable solutions for the logistic map are obtained. Second, for a VPLM with time-dependent non-stationary parameters, 10000 samples of reliable experiments are constructed, and the mean Tc is then computed. The results indicate that for each different initial value, the Tcs of the VPLM are generally different...
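The notion of a reliable computation time can be illustrated with a minimal sketch (not the paper's method): iterate the constant-parameter logistic map in float64 and in 50-digit decimal arithmetic, and record the first step at which the two trajectories disagree beyond a tolerance. The parameter r = 4.0, the tolerance, and the initial value are illustrative assumptions.

```python
from decimal import Decimal, getcontext

def reliable_steps(x0=0.1, r=4.0, tol=1e-3, max_steps=500):
    """Iterate x_{n+1} = r*x_n*(1-x_n) in float64 and in 50-digit
    decimal precision; return the first step where they disagree by tol."""
    getcontext().prec = 50
    xf = x0                      # float64 trajectory
    xd = Decimal(repr(x0))      # high-precision reference trajectory
    rd = Decimal(repr(r))
    for n in range(1, max_steps + 1):
        xf = r * xf * (1.0 - xf)
        xd = rd * xd * (1 - xd)
        if abs(xf - float(xd)) > tol:
            return n
    return max_steps

print(reliable_steps())
```

Because r = 4.0 is fully chaotic, the rounding error of float64 roughly doubles each step, so the two trajectories separate after a few dozen iterations; the step count printed is one concrete realization of a Tc.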

  6. X-ray testing for short-time dynamic applications; Roentgenuntersuchungen fuer kurzzeitdynamische Anwendungen

    Energy Technology Data Exchange (ETDEWEB)

    Kurfiss, Malte; Moser, Stefan; Popko, Gregor; Nau, Siegfried [Fraunhofer-Institut fuer Kurzzeitdynamik, Efringen-Kirchen (Germany). Ernst-Mach-Inst. (EMI)

    2017-08-01

    Short-time dynamic processes pose new challenges for nondestructive testing. The use of X-ray flash tubes and modern high-speed cameras allows observation of the opening of airbags or the energy absorption of compressed tubes as occurs during a vehicle crash. Special algorithms designed for computed tomography analyses allow 3D reconstruction at individual time points of the dynamic process. Possibilities and limitations of the current techniques are discussed.

  7. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; D'Isidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. The cross-sectional slices were then processed with 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the remaining short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  8. FREQUENCY COMPONENT EXTRACTION OF HEARTBEAT CUES WITH SHORT TIME FOURIER TRANSFORM (STFT

    Directory of Open Access Journals (Sweden)

    Sumarna Sumarna

    2017-01-01

    An electro-acoustic human heartbeat detector has been built with the following main parts: (a) a stethoscope chest piece, (b) a condenser microphone, (c) a transistor amplifier, and (d) a cue analysis program written in MATLAB. The frequency components contained in heartbeat cues from 9 volunteers have also been extracted with the Short Time Fourier Transform (STFT). The results of the analysis showed that the heart rate appeared in every cue frequency spectrum together with its harmonics. The steps of the research included detector instrument design, testing and repair of the instrument, recording of heartbeat cues with the Sound Forge 10 program and storage in wav files, trimming of the cues at the start and the end, and extraction/analysis of the cues using MATLAB. The MATLAB program applied a bandpass filter with a bandwidth of 0.01 – 110 Hz, segmented the cues with a Hamming window, computed the Fourier transform of each segment (the STFT mechanism), and displayed the result as a frequency spectrum graph.   Keywords: frequency component extraction, heartbeat cues, Short Time Fourier Transform
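The windowed-FFT mechanism described above can be sketched in Python with NumPy as a stand-in for the MATLAB program; the 1.2 Hz synthetic tone (about 72 bpm) stands in for a recorded heartbeat, and the sampling rate and window size are assumed.

```python
import numpy as np

fs = 100.0                          # sampling rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)        # 20 s of synthetic "heartbeat"
sig = np.sin(2 * np.pi * 1.2 * t)   # 1.2 Hz fundamental (~72 bpm)

def stft_peak_freqs(x, fs, nwin=512, hop=256):
    """Hamming-windowed STFT; return the dominant frequency of each frame."""
    w = np.hamming(nwin)
    freqs = np.fft.rfftfreq(nwin, d=1 / fs)
    peaks = []
    for start in range(0, len(x) - nwin + 1, hop):
        frame = x[start:start + nwin] * w
        spec = np.abs(np.fft.rfft(frame))
        peaks.append(freqs[np.argmax(spec)])
    return np.array(peaks)

peaks = stft_peak_freqs(sig, fs)
print(peaks[:3])    # each frame's spectral peak should sit near 1.2 Hz
```

With a 512-sample window at 100 Hz the frequency resolution is about 0.2 Hz, so the fundamental lands within one bin of 1.2 Hz in every frame.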

  9. The loss of short-term visual representations over time: decay or temporal distinctiveness?

    Science.gov (United States)

    Mercer, Tom

    2014-12-01

    There has been much recent interest in the loss of visual short-term memories over the passage of time. According to decay theory, visual representations are gradually forgotten as time passes, reflecting a slow and steady distortion of the memory trace. However, this is controversial and decay effects can be explained in other ways. The present experiment aimed to reexamine the maintenance and loss of visual information over the short term. Decay and temporal distinctiveness models were tested using a delayed discrimination task, in which participants compared complex and novel objects over unfilled retention intervals of variable length. Experiment 1 found no significant change in the accuracy of visual memory from 2 to 6 s, but the gap separating trials reliably influenced task performance. Experiment 2 found evidence for information loss at a 10-s retention interval, but temporally separating trials restored the fidelity of visual memory, possibly because temporally isolated representations are distinct from older memory traces. In conclusion, visual representations lose accuracy at some point after 6 s, but only within temporally crowded contexts. These findings highlight the importance of temporal distinctiveness within visual short-term memory. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  10. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  11. Effective description of the short-time dynamics in open quantum systems

    Science.gov (United States)

    Rossi, Matteo A. C.; Foti, Caterina; Cuccoli, Alessandro; Trapani, Jacopo; Verrucchi, Paola; Paris, Matteo G. A.

    2017-09-01

    We address the dynamics of a bosonic system coupled to either a bosonic or a magnetic environment and derive a set of sufficient conditions that allow one to describe the dynamics in terms of the effective interaction with a classical fluctuating field. We find that for short interaction times the dynamics of the open system is described by a Gaussian noise map for several different interaction models and independently of the temperature of the environment. In order to go beyond a qualitative understanding of the origin and physical meaning of the above short-time constraint, we take a general viewpoint and, based on an algebraic approach, suggest that any quantum environment can be described by classical fields whenever global symmetries lead to the definition of environmental operators that remain well defined when increasing the size, i.e., the number of dynamical variables, of the environment. In the case of the bosonic environment this statement is exactly demonstrated via a constructive procedure that explicitly shows why a large number of environmental dynamical variables and, necessarily, global symmetries, entail the set of conditions derived in the first part of the work.

  12. Micro-Doppler Ambiguity Resolution Based on Short-Time Compressed Sensing

    Directory of Open Access Journals (Sweden)

    Jing-bo Zhuang

    2015-01-01

    Full Text Available When using a long range radar (LRR) to track a target with micromotion, the micro-Doppler embodied in the radar echoes may suffer from an ambiguity problem. In this paper, we propose a novel method based on compressed sensing (CS) to solve micro-Doppler ambiguity. According to the RIP requirement, a sparse probing pulse train with random transmission times is designed. After matched filtering, the slow-time echo signals of the micromotion target can be viewed as randomly sparse sampling of the Doppler spectrum. Several successive pulses are selected to form a short-time window, and the CS sensing matrix is built according to the time stamps of these pulses. Then, performing Orthogonal Matching Pursuit (OMP), the unambiguous micro-Doppler spectrum can be obtained. The proposed algorithm is verified using echo signals generated according to the theoretical model and signals with micro-Doppler signature produced using the commercial electromagnetic simulation software FEKO.
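The core recovery step can be sketched as follows, assuming an on-grid two-tone Doppler spectrum sampled at randomly timed pulses; all parameters (grid size, pulse count, tone positions and amplitudes) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse Doppler spectrum: 2 tones on a 200-bin frequency grid (assumed setup)
N, M, K = 200, 80, 2
true_bins = [30, 75]
t = np.sort(rng.uniform(0.0, 1.0, M))             # random pulse time stamps
grid = np.arange(N)                                # frequency grid, Hz
A = np.exp(2j * np.pi * t[:, None] * grid[None, :])
A /= np.linalg.norm(A, axis=0)                     # unit-norm dictionary atoms
y = A[:, true_bins] @ np.array([1.0, 0.7])         # slow-time echo samples

def omp(A, y, K):
    """Orthogonal Matching Pursuit: greedily pick the K best-matching atoms,
    re-fitting the coefficients by least squares after each selection."""
    support, r = [], y.copy()
    for _ in range(K):
        support.append(int(np.argmax(np.abs(A.conj().T @ r))))
        x, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x
    return sorted(support)

print(omp(A, y, K))
```

Because the tones lie exactly on the grid and the random time stamps make the dictionary columns nearly orthogonal, OMP recovers the two Doppler bins without ambiguity.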

  13. Soft Real-Time PID Control on a VME Computer

    Science.gov (United States)

    Karayan, Vahag; Sander, Stanley; Cageao, Richard

    2007-01-01

    microPID (uPID) is a computer program for real-time proportional + integral + derivative (PID) control of a translation stage in a Fourier-transform ultraviolet spectrometer. microPID implements a PID control loop over a position profile at a sampling rate of 8 kHz (sampling period 125 microseconds). The software runs in a stripped-down Linux operating system on a VersaModule Eurocard (VME) computer operating at real-time priority, using an embedded controller, a 16-bit digital-to-analog converter (D/A) board, and a laser-positioning board (LPB). microPID consists of three main parts: (1) VME device-driver routines, (2) software that administers a custom protocol for serial communication with a control computer, and (3) a loop section that obtains the current position from an LPB-driver routine, calculates the ideal position from the profile, and calculates a new voltage command by use of an embedded PID routine, all within each sampling period. The voltage command is sent to the D/A board to control the stage. microPID uses special kernel headers to obtain microsecond timing resolution. Inasmuch as microPID implements a single-threaded process and all other processes are disabled, the Linux operating system acts as a soft real-time system.
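The per-sample PID update can be sketched as below; the first-order plant model and the gains are hypothetical stand-ins for the actual translation stage, and only the 125-microsecond sampling period is taken from the record.

```python
def pid_step(state, setpoint, measurement, kp, ki, kd, dt):
    """One discrete PID update; state carries the integral and previous error."""
    integral, prev_err = state
    err = setpoint - measurement
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv
    return u, (integral, err)

# Hypothetical first-order plant standing in for the translation stage:
# dx/dt = -x + u, sampled every 125 us as in the record.
dt, x, state = 125e-6, 0.0, (0.0, 0.0)
for _ in range(40000):               # 5 s of simulated closed-loop control
    u, state = pid_step(state, 1.0, x, kp=2.0, ki=5.0, kd=0.0, dt=dt)
    x += dt * (-x + u)

print(round(x, 3))
```

With these (illustrative) gains the closed loop is stable and the integral term drives the steady-state error to zero, so the stage position settles at the commanded setpoint well within the simulated 5 s.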

  14. Delay-time distribution in the scattering of time-narrow wave packets (II)—quantum graphs

    Science.gov (United States)

    Smilansky, Uzy; Schanz, Holger

    2018-02-01

    We apply the framework developed in the preceding paper in this series (Smilansky 2017 J. Phys. A: Math. Theor. 50 215301) to compute the time-delay distribution in the scattering of ultra short radio frequency pulses on complex networks of transmission lines which are modeled by metric (quantum) graphs. We consider wave packets which are centered at high wave number and comprise many energy levels. In the limit of pulses of very short duration we compute upper and lower bounds to the actual time-delay distribution of the radiation emerging from the network using a simplified problem where time is replaced by the discrete count of vertex-scattering events. The classical limit of the time-delay distribution is also discussed and we show that for finite networks it decays exponentially, with a decay constant which depends on the graph connectivity and the distribution of its edge lengths. We illustrate and apply our theory to a simple model graph where an algebraic decay of the quantum time-delay distribution is established.

  15. A general dead-time correction method based on live-time stamping. Application to the measurement of short-lived radionuclides.

    Science.gov (United States)

    Chauvenet, B; Bobin, C; Bouchard, J

    2017-12-01

    Dead-time correction formulae are established in the general case of superimposed non-homogeneous Poisson processes. Based on the same principles as conventional live-timed counting, this method exploits the additional information made available using digital signal processing systems, and especially the possibility to store the time stamps of live-time intervals. No approximation needs to be made to obtain those formulae. Estimates of the variances of corrected rates are also presented. This method is applied to the activity measurement of short-lived radionuclides. Copyright © 2017 Elsevier Ltd. All rights reserved.
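For the simpler homogeneous-Poisson case (the paper treats the general non-homogeneous one), the live-time correction idea can be sketched as follows: simulate events, apply a non-extending dead time τ, and divide the accepted counts by the accumulated live time. The rate, duration, and dead time below are assumed values.

```python
import random

random.seed(1)
true_rate, T, tau = 10000.0, 10.0, 10e-6   # cps, s, s (assumed values)

# Simulate a Poisson process and apply a non-extending dead time tau.
t, accepted, last = 0.0, 0, -1.0
while True:
    t += random.expovariate(true_rate)     # exponential inter-arrival times
    if t > T:
        break
    if t - last >= tau:                    # event falls in a live interval
        accepted += 1
        last = t

live_time = T - accepted * tau             # total live time from the stamps
corrected = accepted / live_time           # live-time-corrected rate
print(accepted / T, corrected)             # raw rate vs corrected rate
```

The raw rate undercounts by roughly the factor 1/(1 + ρτ), while dividing by the live time recovers the true rate to within statistical uncertainty.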

  16. Computing return times or return periods with rare event algorithms

    Science.gov (United States)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events. So far those algorithms often compute probabilities rather than return times. The approach we propose provides a computationally extremely efficient way to estimate numerically the return times of rare events for a dynamical system, saving several orders of magnitude in computational cost. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
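A minimal version of the block-maxima estimator can be sketched on a simulated Ornstein–Uhlenbeck process; the thresholds, block length, and trajectory length are illustrative, and the estimator r(a) = -T_block / log P(block max ≤ a) is the basic form that the paper improves upon.

```python
import numpy as np

rng = np.random.default_rng(42)
dt, n_steps = 0.01, 400_000             # 4000 time units of OU dynamics
rho = np.exp(-dt)                        # exact AR(1) update for dX = -X dt + sqrt(2) dW
sigma = np.sqrt(1 - rho**2)              # preserves unit stationary variance
x = np.empty(n_steps)
x[0] = 0.0
noise = rng.standard_normal(n_steps)
for i in range(1, n_steps):
    x[i] = rho * x[i - 1] + sigma * noise[i]

def return_time(traj, a, block_len, dt):
    """Block-maxima estimator: r(a) = -T_block / log P(block max <= a)."""
    n_blocks = len(traj) // block_len
    maxima = traj[:n_blocks * block_len].reshape(n_blocks, block_len).max(axis=1)
    p_below = np.mean(maxima <= a)
    return -block_len * dt / np.log(p_below)

r1 = return_time(x, 1.0, 1000, dt)
r2 = return_time(x, 2.5, 1000, dt)
print(r1, r2)    # the rarer level has a much longer return time
```

The higher threshold is exceeded far less often, so its estimated return time is orders of magnitude longer; for genuinely rare events the paper replaces the direct simulation with rare-event sampling algorithms.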

  17. Near real-time digital holographic microscope based on GPU parallel computing

    Science.gov (United States)

    Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan

    2018-01-01

    A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which parallel computing technology based on the compute unified device architecture (CUDA) and digital holographic microscopy are combined. Compared to other holographic microscopes, which have to perform reconstruction in multiple focal planes and are therefore time-consuming, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved with CUDA-based parallel computing, so it is especially suitable for measurements of particle fields at micrometer and nanometer scale. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at micrometer scale, with an average velocity error lower than 10%. With the graphics processing unit (GPU), the computing time for 100 reconstruction planes (512×512 grids) is less than 120 ms, while it is 4.9 s using the traditional CPU-based reconstruction method. The reconstruction speed has thus been raised by a factor of 40. In other words, the system can handle holograms at 8.3 frames per second, and near real-time measurement and display of the particle velocity field are realized. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved by further optimization of software and hardware. Keywords: digital holographic microscope,
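A single reconstruction plane of the kind batched per-plane on the GPU can be sketched with FFT-based angular-spectrum propagation; NumPy stands in for the CUDA kernels, and the wavelength, pixel pitch, and grid size below are assumptions, not values from the paper.

```python
import numpy as np

# Angular-spectrum propagation: one FFT-based refocusing step.
wl, dx, n = 0.633e-6, 10e-6, 256          # wavelength (m), pixel pitch (m), grid
fx = np.fft.fftfreq(n, d=dx)
FX, FY = np.meshgrid(fx, fx)
arg = 1.0 / wl**2 - FX**2 - FY**2
kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent part clamped to zero

def propagate(u, z):
    """Propagate complex field u by distance z via the angular spectrum."""
    return np.fft.ifft2(np.fft.fft2(u) * np.exp(1j * kz * z))

y, x = np.mgrid[:n, :n] - n / 2
u0 = np.exp(-(x**2 + y**2) / (2 * 20.0**2)).astype(complex)  # Gaussian test field
u1 = propagate(u0, 5e-3)             # numerical refocus to z = 5 mm
u_back = propagate(u1, -5e-3)        # propagate back: should recover u0
print(np.max(np.abs(u_back - u0)))
```

Each output plane requires only two FFTs and an element-wise multiply, which is why distributing the 100 planes over GPU threads yields the reported speed-up; propagating forward and back recovers the input field to machine precision, a standard sanity check.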

  18. Nonlinear response of vessel walls due to short-time thermomechanical loading

    International Nuclear Information System (INIS)

    Pfeiffer, P.A.; Kulak, R.F.

    1994-01-01

    Maintaining the structural integrity of the reactor pressure vessel (RPV) during a postulated core melt accident is an important safety consideration in the design of the vessel. This study addresses failure predictions for the vessel under the thermal and pressure loadings from molten core debris depositing on the lower head of the vessel. Different loading combinations were considered based on the dead load, yield stress assumptions, material response and internal pressurization. The analyses considered only short-term (quasi-static) failure modes; long-term failure modes were not considered. Short-term failure modes include plastic instabilities of the structure and failure due to exceeding the failure strain. Long-term failure modes would be caused by creep rupture leading to plastic instability of the structure. Due to the short time durations analyzed, creep was not considered in the analyses presented

  19. Does Occupational Exposure of Shahid Dastghieb International Airport Workers to Radiofrequency Radiation Affect Their Short Term Memory and Reaction Time?

    Directory of Open Access Journals (Sweden)

    Jarideh S.

    2015-05-01

    Full Text Available Background: Airport workers are continuously exposed to different levels of radiofrequency/microwave (RF/MW) radiation emitted by radar equipment. Radars are extensively used in the military and aviation industries. Over the past several years, our lab has focused on the health effects of exposure to different sources of electromagnetic fields such as cellular phones, mobile base stations, mobile phone jammers, laptop computers, radars, dentistry cavitrons and MRI. The main goal of this study was to investigate whether occupational exposure of Shahid Dastghieb international airport workers to radiofrequency radiation affects their short-term memory and reaction time. Methods: Thirty-two airport workers involved in duties at the control and approach tower (21 males and 11 females), with an age range of 27-67 years (mean age 37.38), participated voluntarily in this study. Twenty-nine workers (13 males and 16 females) whose offices were in the city and who had no history of exposure to radar systems also participated as the control group. The employees' reaction time and short-term memory were analyzed using standard visual reaction time (VRT) test software and the modified Wechsler memory scale test, respectively. Results: The mean±SD reaction times of the airport employees (N=32) and the control group (N=29) were 0.45±0.12 s and 0.46±0.17 s, respectively. Moreover, in the four subset tests, the following scores were obtained for the airport employees and the control group, respectively: (i) paired words test: 28.00±13.13 and 32.07±11.65; (ii) forward digit span: 8.38±1.40 and 9.03±1.32; (iii) backward digit span: 5.54±1.87 and 6.31±1.46; and (iv) word recognition: 5.73±2.36 and 6.50±1.93. These differences were not statistically significant. Conclusion: The occupational exposure of the employees to the RF radiation in Shahid

  20. Research on resistance characteristics of YBCO tape under short-time DC large current impact

    Science.gov (United States)

    Zhang, Zhifeng; Yang, Jiabin; Qiu, Qingquan; Zhang, Guomin; Lin, Liangzhen

    2017-06-01

    Research on the resistance characteristics of YBCO tape under short-time DC large-current impact is the foundation for developing the DC superconducting fault current limiter (SFCL) for voltage source converter-based high voltage direct current systems (VSC-HVDC), which are one of the valid approaches to solving the problems of renewable energy integration. An SFCL can limit DC short-circuit currents and enhance the interrupting capability of DC circuit breakers. In this paper, under short-time DC large-current impacts, the resistance features of naked YBCO tape are studied to find the resistance–temperature change rule and the maximum impact current. The influence of insulation on the resistance–temperature characteristics of YBCO tape is studied by comparison tests of naked tape and insulated tape at 77 K. The influence of operating temperature on the tape is also studied under subcooled liquid nitrogen conditions. Regarding the current-impact security of YBCO tape, the critical current degradation and peak temperature are analyzed and used as judgment standards. The test results are helpful in developing SFCLs for VSC-HVDC.

  1. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  2. Short version of the Zimbardo Time Perspective Inventory (ZTPI-short) with and without the Future-Negative scale, verified on nationally representative samples

    Czech Academy of Sciences Publication Activity Database

    Košťál, Jaroslav; Klicperová-Baker, Martina; Lukavská, K.; Lukavský, Jiří

    2016-01-01

    Roč. 25, č. 2 (2016), s. 169-192 ISSN 0961-463X Institutional support: RVO:68081740 Keywords : ZTPI * ZTPI-short * time perspective * temporal orientation * representative sample Subject RIV: AN - Psychology Impact factor: 1.206, year: 2016

  3. Real-Time Control of an Articulatory-Based Speech Synthesizer for Brain Computer Interfaces.

    Directory of Open Access Journals (Sweden)

    Florent Bocquelet

    2016-11-01

    Full Text Available Restoring natural speech in paralyzed and aphasic people could be achieved using a Brain-Computer Interface (BCI) controlling a speech synthesizer in real-time. To reach this goal, a prerequisite is to develop a speech synthesizer producing intelligible speech in real-time with a reasonable number of control parameters. We present here an articulatory-based speech synthesizer that can be controlled in real-time for future BCI applications. This synthesizer converts movements of the main speech articulators (tongue, jaw, velum, and lips) into intelligible speech. The articulatory-to-acoustic mapping is performed using a deep neural network (DNN) trained on electromagnetic articulography (EMA) data recorded on a reference speaker synchronously with the produced speech signal. This DNN is then used in both offline and online modes to map the positions of sensors glued on different speech articulators into acoustic parameters that are further converted into an audio signal using a vocoder. In offline mode, highly intelligible speech could be obtained, as assessed by a perceptual evaluation performed by 12 listeners. Then, to anticipate future BCI applications, we further assessed the real-time control of the synthesizer by both the reference speaker and new speakers, in a closed-loop paradigm using EMA data recorded in real time. A short calibration period was used to compensate for differences in sensor positions and articulatory differences between new speakers and the reference speaker. We found that real-time synthesis of vowels and consonants was possible with good intelligibility. In conclusion, these results open the way to future speech BCI applications using such an articulatory-based speech synthesizer.

  4. Forecast model of landslides in a short time

    International Nuclear Information System (INIS)

    Sanchez Lopez, Reinaldo

    2006-01-01

    IDEAM, in fulfillment of its functions as a member of the national technical committee for disaster prevention and response (SNPAD), carries out follow-up, monitoring and real-time forecasting of the environmental dynamics that, in extreme situations, constitute natural threats and risks. One of the most frequent of these dynamics, and among those of greatest impact, is landsliding, which persistently affects people's lives, infrastructure, socioeconomic activities and the balance of the environment. Landslides in Colombia and in the world are caused mainly by the effects of rain; for this reason, IDEAM has been developing a forecast model as an instrument for short-term risk management. This article presents aspects related to the model's structure, operation, spatio-temporal resolution, products, results, achievements and projections. Conceptually, the model is supported by the principle of the spatio-temporal dynamics of the processes that consolidate natural hazards, particularly in areas where man has been building up the risk. Structurally, the model is composed of two sub-models: a general terrain susceptibility model, and a critical rainfall model as the triggering factor that consolidates the hazard process. In real time, the model works within a GIS, permitting automatic zoning of the landslide hazard in order to issue public advisory warnings and to help decision makers manage the risk frequently caused by these events in the country.

  5. Computation of reactor control rod drop time under accident conditions

    International Nuclear Information System (INIS)

    Dou Yikang; Yao Weida; Yang Renan; Jiang Nanyan

    1998-01-01

    The computational method for reactor control rod drop time under accident conditions lies mainly in establishing forced vibration equations for the components of the control rod drive line under the action of outside forces, and a motion equation for the control rod moving in the vertical direction. The two kinds of equations are coupled by considering the impact effects between the control rod and the surrounding components. The finite difference method is adopted to discretize the vibration equations, and the Wilson-θ method is applied to solve the time-history problem. The non-linearity caused by impact is treated iteratively with the modified Newton method. Some experimental results are used to validate the validity and reliability of the computational method. Theoretical and experimental test problems show that the computer program based on this method is applicable and reliable. The program can act as an effective tool for design-by-analysis and safety analysis of the relevant components
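The Wilson-θ scheme mentioned above can be sketched for a linear single-degree-of-freedom system; the multi-component drive-line model and the impact iteration of the paper are beyond this sketch, and the free-vibration test case is an illustrative assumption.

```python
import numpy as np

def wilson_theta(m, c, k, f, x0, v0, dt, theta=1.4):
    """Wilson-theta time integration for m*x'' + c*x' + k*x = f(t).
    Assumes a linear SDOF system; impact/contact nonlinearity would be
    handled by iterating this step with a modified Newton scheme."""
    n = len(f)
    x, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    x[0], v[0] = x0, v0
    a[0] = (f[0] - c * v0 - k * x0) / m
    tau = theta * dt                       # extended time step
    k_hat = k + 3 * c / tau + 6 * m / tau**2
    for i in range(n - 1):
        f_tau = f[i] + theta * (f[i + 1] - f[i])
        rhs = (f_tau + m * (6 * x[i] / tau**2 + 6 * v[i] / tau + 2 * a[i])
                     + c * (3 * x[i] / tau + 2 * v[i] + tau * a[i] / 2))
        x_tau = rhs / k_hat
        a_tau = 6 * (x_tau - x[i]) / tau**2 - 6 * v[i] / tau - 2 * a[i]
        a[i + 1] = a[i] + (a_tau - a[i]) / theta
        v[i + 1] = v[i] + dt * (a[i + 1] + a[i]) / 2
        x[i + 1] = x[i] + dt * v[i] + dt**2 * (a[i + 1] + 2 * a[i]) / 6
    return x

# Free-vibration check: undamped oscillator released from x0 = 1
dt, steps = 1e-3, 1000
w = 2 * np.pi                              # natural frequency -> period of 1 s
x = wilson_theta(m=1.0, c=0.0, k=w**2, f=np.zeros(steps + 1),
                 x0=1.0, v0=0.0, dt=dt)
print(x[-1])   # after one full period the mass should be back near 1
```

With a time step a thousandth of the natural period, the algorithmic damping of the θ = 1.4 scheme is negligible and the computed displacement tracks the analytic cosine response.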

  6. Real-Time Accumulative Computation Motion Detectors

    Directory of Open Access Journals (Sweden)

    Saturnino Maldonado-Bascón

    2009-12-01

    Full Text Available The neurally inspired accumulative computation (AC) method and its application to motion detection have been introduced in the past years. This paper revisits the fact that many researchers have explored the relationship between neural networks and finite state machines. Indeed, finite state machines constitute the best characterized computational model, whereas artificial neural networks have become a very successful tool for modeling and problem solving. The article shows how to reach real-time performance after using a model described as a finite state machine. This paper introduces two steps towards that direction: (a) a simplification of the general AC method performed by formally transforming it into a finite state machine; (b) a hardware implementation in FPGA of such a designed AC module, as well as of an 8-AC motion detector, providing promising performance results. We also offer two case studies of the use of AC motion detectors in surveillance applications, namely infrared-based people segmentation and color-based people tracking, respectively.
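A software sketch of the AC charge/discharge rule is given below; the paper's contribution is its FSM formalization and FPGA implementation, and the threshold and charge constants here are illustrative assumptions.

```python
import numpy as np

def ac_update(charge, prev, frame, thr=10, inc=64, dec=16, q_max=255):
    """One accumulative-computation step: pixels whose brightness changed
    charge up toward q_max; unchanged pixels discharge toward zero."""
    motion = np.abs(frame.astype(int) - prev.astype(int)) > thr
    charge = np.where(motion,
                      np.minimum(charge + inc, q_max),
                      np.maximum(charge - dec, 0))
    return charge

# Toy sequence: a bright 2x2 blob moving right across a dark background
frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(5)]
for i, fr in enumerate(frames):
    fr[3:5, i:i + 2] = 200

charge = np.zeros((8, 8), dtype=int)
for prev, frame in zip(frames, frames[1:]):
    charge = ac_update(charge, prev, frame)

print(charge[3:5])     # trailing charge along the blob's path
print(charge[0, 0])    # static background holds zero charge
```

The charge map keeps a decaying trace of recent motion, which is what makes the method robust for segmenting and tracking moving people in the surveillance case studies.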

  7. Heating Augmentation for Short Hypersonic Protuberances

    Science.gov (United States)

    Mazaheri, Ali R.; Wood, William A.

    2008-01-01

    Computational aeroheating analyses of the Space Shuttle Orbiter plug repair models are validated against data collected in the Calspan University of Buffalo Research Center (CUBRC) 48 inch shock tunnel. The comparison shows that the average difference between computed heat transfer results and the data is about 9.5%. Using CFD and Wind Tunnel (WT) data, an empirical correlation for estimating heating augmentation on short hypersonic protuberances (k/delta less than 0.3) is proposed. This proposed correlation is compared with several computed flight simulation cases and good agreement is achieved. Accordingly, this correlation is proposed for further investigation on other short hypersonic protuberances for estimating heating augmentation.

  8. Drought analysis and short-term forecast in the Aison River Basin (Greece)

    OpenAIRE

    Kavalieratou, S.; Karpouzos, D. K.; Babajimopoulos, C.

    2012-01-01

    A combined regional drought analysis and forecast is elaborated and applied to the Aison River Basin (Greece). The historical frequency, duration and severity were estimated using the standardized precipitation index (SPI) computed on variable time scales, while short-term drought forecast was investigated by means of 3-D loglinear models. A quasi-association model with homogenous diagonal effect was proposed to fit the observed frequencies of class transitions of the SPI values computed on t...

  9. Television viewing, computer use and total screen time in Canadian youth.

    Science.gov (United States)

    Mark, Amy E; Boyce, William F; Janssen, Ian

    2006-11-01

    Research has linked excessive television viewing and computer use in children and adolescents to a variety of health and social problems. Current recommendations are that screen time in children and adolescents should be limited to no more than 2 h per day. To determine the percentage of Canadian youth meeting the screen time guideline recommendations. The representative study sample consisted of 6942 Canadian youth in grades 6 to 10 who participated in the 2001/2002 World Health Organization Health Behaviour in School-Aged Children survey. Only 41% of girls and 34% of boys in grades 6 to 10 watched 2 h or less of television per day. Once the time of leisure computer use was included and total daily screen time was examined, only 18% of girls and 14% of boys met the guidelines. The prevalence of those meeting the screen time guidelines was higher in girls than boys. Fewer than 20% of Canadian youth in grades 6 to 10 met the total screen time guidelines, suggesting that increased public health interventions are needed to reduce the number of leisure time hours that Canadian youth spend watching television and using the computer.

  10. 12 CFR 516.10 - How does OTS compute time periods under this part?

    Science.gov (United States)

    2010-01-01

    12 CFR 516.10 (Banks and Banking, 2010 ed.): How does OTS compute time periods under this part? Section 516.10, OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY, APPLICATION PROCESSING PROCEDURES. In computing...

  11. Accessible high performance computing solutions for near real-time image processing for time critical applications

    Science.gov (United States)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming common place and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: 1. critical information can be provided faster and 2. more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs and (2) a CUDA enabled GPU workstation. The reference platform is a dual CPU-quad core workstation and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort is presented.
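
    The GLCM statistics underlying the PANTEX index can be illustrated in a few lines. This sketch computes co-occurrence probabilities for a single displacement and the contrast measure; the real workflow combines many displacements rotation-invariantly inside a moving window, which is what makes it expensive enough to motivate HPC.

```python
from collections import Counter

def glcm(image, dx, dy):
    """Normalized gray-level co-occurrence counts for one displacement (dx, dy)."""
    h, w = len(image), len(image[0])
    counts = Counter()
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                counts[(image[y][x], image[y2][x2])] += 1
    total = sum(counts.values())
    return {pair: n / total for pair, n in counts.items()}

def contrast(p):
    """GLCM contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    return sum((i - j) ** 2 * prob for (i, j), prob in p.items())

flat = [[5, 5], [5, 5]]        # homogeneous patch: zero contrast
checker = [[0, 9], [9, 0]]     # alternating gray levels: high contrast
```

    A homogeneous patch yields zero contrast while a checkerboard yields a large value, which is the texture separation the built-up-area mask exploits.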

  12. Forward and adjoint sensitivity computation of chaotic dynamical systems

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qiqi, E-mail: qiqi@mit.edu [Department of Aeronautics and Astronautics, MIT, 77 Mass Ave., Cambridge, MA 02139 (United States)

    2013-02-15

    This paper describes a forward algorithm and an adjoint algorithm for computing sensitivity derivatives in chaotic dynamical systems, such as the Lorenz attractor. The algorithms compute the derivative of long time averaged “statistical” quantities to infinitesimal perturbations of the system parameters. The algorithms are demonstrated on the Lorenz attractor. We show that sensitivity derivatives of statistical quantities can be accurately estimated using a single, short trajectory (over a time interval of 20) on the Lorenz attractor.
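
    The kind of long-time-averaged "statistical" quantity whose sensitivity is sought can be reproduced with a plain RK4 integration of the Lorenz system; the sketch below computes the time average of z at the classic parameters (the forward and adjoint sensitivity algorithms themselves are not shown, and the integration settings are illustrative).

```python
def lorenz_avg_z(rho, sigma=10.0, beta=8.0 / 3.0,
                 dt=0.01, t_total=60.0, t_burn=10.0):
    """Time-averaged z on the Lorenz attractor via RK4, after a burn-in."""
    def f(s):
        x, y, z = s
        return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

    def rk4(s):
        k1 = f(s)
        k2 = f(tuple(si + 0.5 * dt * ki for si, ki in zip(s, k1)))
        k3 = f(tuple(si + 0.5 * dt * ki for si, ki in zip(s, k2)))
        k4 = f(tuple(si + dt * ki for si, ki in zip(s, k3)))
        return tuple(si + dt / 6.0 * (a + 2 * b + 2 * c + d)
                     for si, a, b, c, d in zip(s, k1, k2, k3, k4))

    s = (1.0, 1.0, 1.0)
    total, n, t = 0.0, 0, 0.0
    while t < t_total:
        s = rk4(s)
        t += dt
        if t > t_burn:             # discard the transient before averaging
            total += s[2]
            n += 1
    return total / n
```

    Naive finite differencing of such averages is ill-conditioned because nearby chaotic trajectories diverge, which is precisely the difficulty the paper's forward/adjoint algorithms address.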

  13. Effect of the MCNP model definition on the computation time

    International Nuclear Information System (INIS)

    Šunka, Michal

    2017-01-01

    The presented work studies the influence of the method of defining the geometry in the MCNP transport code and its impact on the computational time, including the difficulty of preparing an input file describing the given geometry. Cases using different geometry definitions, including the use of basic 2-dimensional and 3-dimensional objects and their combinations, were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)

  14. Short-time beta grain growth kinetics for a conventional titanium alloy

    International Nuclear Information System (INIS)

    Semiatin, S.L.; Sukonnik, I.M.

    1996-01-01

    The kinetics of beta grain growth during short-time, supertransus heat treatment of Ti-5Al-4V were determined using a salt-pot technique. The finite-time, subtransus temperature transient during salt-pot heating was quantified through measurements of the heat transfer coefficient characterizing conduction across the salt-titanium interface and a simple heat conduction analysis which incorporated this heat transfer coefficient. Grain size versus time data, adjusted to account for the subtransus temperature transient, were successfully fit to the parabolic grain growth law d^n − d0^n = k·t·exp(−Q/RT) using an exponent n equal to 2.0. Comparison of the present results to rapid, continuous heat treatment data in the literature for a similar titanium alloy revealed a number of semi-quantitative similarities.
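
    The parabolic law is straightforward to evaluate. In the sketch below n = 2 as in the paper, while the rate constant k and activation energy Q are hypothetical placeholders, not the fitted values.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def grain_size(t, T, d0, n=2.0, k=5.0e6, Q=2.0e5):
    """Parabolic grain-growth law d^n - d0^n = k * t * exp(-Q / (R*T)).

    d0 and the returned size share the same length unit; t in s, T in K.
    k and Q here are hypothetical placeholders, not fitted values.
    """
    return (d0 ** n + k * t * math.exp(-Q / (R * T))) ** (1.0 / n)
```

    The law reduces to d = d0 at t = 0 and predicts faster growth at higher temperature through the Arrhenius factor.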

  15. Cerebral blood flow measurement using stable xenon CT with very short inhalation times

    Energy Technology Data Exchange (ETDEWEB)

    Touho, Hajime; Karasawa, Jun; Shishido, Hisashi; Yamada, Keisuke; Shibamoto, Keiji [Osaka Neurological Inst., Toyonaka (Japan)

    1991-02-01

    A noninvasive, simplified method using inhalation of stable xenon (Xe^s) and computed tomographic (CT) scanning to estimate regional cerebral blood flow (rCBF) and the regional partition coefficient (rλ) is described. Twenty-four patients with cerebrovascular occlusive disease and six volunteer controls inhaled 30% Xe^s and 70% oxygen for 180 seconds and exhaled for 144 seconds during serial CT scanning without denitrogenation. The end-tidal Xe^s concentration was continuously monitored with a thermoconductivity analyzer to determine the build-up range (A value) and build-up rate constant (K value) for arteries with the curve-fitting method. The time-CT number (Hounsfield unit) curve for cerebral tissue during the Xe^s washin and washout phases was used to calculate rλ and rCBF using least squares curve-fitting analysis. The resultant rλ and rCBF maps demonstrated a reliable distribution between the gray and white matter, and infarcted areas. rCBF was high in gray matter, low in white matter, and much lower in infarcted areas than in white matter. rλ was high in white matter, low in gray matter, and much lower in infarcted areas. The Xe^s CT-CBF study with a very short inhalation of 180 seconds is a clinically useful method for evaluation of rCBF in patients with cerebrovascular diseases. (author).

  16. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  17. Newnes short wave listening handbook

    CERN Document Server

    Pritchard, Joe

    2013-01-01

    Newnes Short Wave Listening Handbook is a guide for starting up in short wave listening (SWL). The book is comprised of 15 chapters that discuss the basics and fundamental concepts of short wave radio listening. The coverage of the text includes electrical principles; types of signals that can be heard in the radio spectrum; and using computers in SWL. The book also covers SWL equipment, such as receivers, converters, and circuits. The text will be of great use to individuals who want to get into short wave listening.

  18. The history of computed tomography

    International Nuclear Information System (INIS)

    Bull, J.

    1980-01-01

    New scientific discoveries are often made by synthesizing earlier discoveries. Computed tomography is such an example. The three necessary elements were: (1) the fact that certain simple crystals scintillate when exposed to X-rays, (2) the advent of electronics, and (3) that of computers. That X-rays cause crystals to scintillate was learnt very shortly after Roentgen's discovery; electronics and computers came very much later. To put all these together and apply them to diagnostic radiology, and at the same time dismiss the concept so firmly ingrained in everyone's mind that an X-ray picture must be produced on photographic film, required a genius. (orig./VJ)

  19. Role of short-time acoustic temporal fine structure cues in sentence recognition for normal-hearing listeners.

    Science.gov (United States)

    Hou, Limin; Xu, Li

    2018-02-01

    Short-time processing was employed to manipulate the amplitude, bandwidth, and temporal fine structure (TFS) in sentences. Fifty-two native-English-speaking, normal-hearing listeners participated in four sentence-recognition experiments. Results showed that recovered envelope (E) played an important role in speech recognition when the bandwidth was > 1 equivalent rectangular bandwidth. Removing TFS drastically reduced sentence recognition. Preserving TFS greatly improved sentence recognition when amplitude information was available at a rate ≥ 10 Hz (i.e., time segment ≤ 100 ms). Therefore, the short-time TFS facilitates speech perception together with the recovered E and works with the coarse amplitude cues to provide useful information for speech recognition.
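
    Envelope/TFS splits of this kind are typically done with a Hilbert transform; a cruder but self-contained sketch (all signal parameters illustrative, not those of the experiments) extracts the envelope as a moving RMS and the fine structure as the envelope-normalized remainder.

```python
import math

def envelope_tfs(signal, window):
    """Crude envelope/TFS split: envelope = moving RMS scaled to peak
    amplitude (valid for a sinusoidal carrier); TFS = signal / envelope."""
    half = window // 2
    env, tfs = [], []
    for i in range(len(signal)):
        seg = signal[max(0, i - half): i + half + 1]
        rms = math.sqrt(sum(s * s for s in seg) / len(seg))
        e = rms * math.sqrt(2.0)          # RMS -> amplitude for a sinusoid
        env.append(e)
        tfs.append(signal[i] / e if e > 1e-12 else 0.0)
    return env, tfs

fs = 8000                                  # Hz, sampling rate (illustrative)
t = [i / fs for i in range(fs)]            # 1 s of signal
true_env = [1.0 + 0.8 * math.sin(2 * math.pi * 4.0 * ti) for ti in t]  # 4 Hz envelope
sig = [a * math.cos(2 * math.pi * 100.0 * ti)                          # 100 Hz carrier (TFS)
       for a, ti in zip(true_env, t)]
env, tfs = envelope_tfs(sig, window=240)   # window spans 3 carrier cycles
```

    The recovered envelope tracks the slow 4 Hz modulation while the TFS retains the 100 Hz carrier, mirroring the E/TFS decomposition discussed above.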

  20. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  1. Detection of advance item knowledge using response times in computer adaptive testing

    NARCIS (Netherlands)

    Meijer, R.R.; Sotaridona, Leonardo

    2006-01-01

    We propose a new method for detecting item preknowledge in a CAT based on an estimate of “effective response time” for each item. Effective response time is defined as the time required for an individual examinee to answer an item correctly. An unusually short response time relative to the expected
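
    A minimal version of the idea, under an assumed lognormal response-time model, flags correct responses whose log time falls far below the item's mean; the data and the z cutoff below are illustrative, not the paper's estimator.

```python
import math

def flag_preknowledge(times, z_cut=-2.0):
    """Flag response times unusually short for an item.

    `times`: response times (s) of examinees who answered the item correctly.
    Returns indices whose log-time z-score falls below `z_cut`.
    """
    logs = [math.log(t) for t in times]
    mean = sum(logs) / len(logs)
    var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)
    sd = math.sqrt(var)
    return [i for i, x in enumerate(logs) if (x - mean) / sd < z_cut]

times = [30.0, 35.0, 28.0, 40.0, 33.0, 31.0, 2.0]   # last one suspiciously fast
```

    A correct answer given in 2 s when the item typically takes about 30 s is flagged, while ordinary variation is not.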

  2. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  3. Real-time brain computer interface using imaginary movements

    DEFF Research Database (Denmark)

    El-Madani, Ahmad; Sørensen, Helge Bjarup Dissing; Kjær, Troels W.

    2015-01-01

    Background: Brain Computer Interface (BCI) is the method of transforming mental thoughts and imagination into actions. A real-time BCI system can improve the quality of life of patients with severe neuromuscular disorders by enabling them to communicate with the outside world. In this paper...

  4. Photoluminescence decay dynamics in γ-Ga2O3 nanocrystals: The role of exclusion distance at short time scales

    Science.gov (United States)

    Fernandes, Brian; Hegde, Manu; Stanish, Paul C.; Mišković, Zoran L.; Radovanovic, Pavle V.

    2017-09-01

    We developed a comprehensive theoretical model describing the photoluminescence decay dynamics at short and long time scales based on the donor-acceptor defect interactions in γ-Ga2O3 nanocrystals, and quantitatively determined the importance of exclusion distance and spatial distribution of defects. We allowed for donors and acceptors to be adjacent to each other or separated by different exclusion distances. The optimal exclusion distance was found to be comparable to the donor Bohr radius and have a strong effect on the photoluminescence decay curve at short times. The importance of the exclusion distance at short time scales was confirmed by Monte Carlo simulations.

  5. The effect of long and short time oil shocks on economic growth in Iran

    OpenAIRE

    Sayyed Abdolmajid Jalae; Sanaz Mohammadi

    2012-01-01

    Oil is a strategic good, and its price fluctuations and shocks have major effects on growth and recession in economies dependent on oil revenues. This study investigates the effect of oil price shocks on economic growth in Iran over two horizons, short and long term, for the period 1974 to 2006. Oil price uncertainty is quantified with a GARCH model, and the effects of oil price shocks on economic growth in Iran are determined during a short...
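
    A GARCH(1,1) volatility recursion of the kind used to quantify price uncertainty can be simulated in a few lines; the parameters below are illustrative, not estimates from the study.

```python
import math
import random

def simulate_garch(omega, alpha, beta, n, seed=0):
    """Simulate returns from a GARCH(1,1) process:
    sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}."""
    rng = random.Random(seed)
    sigma2 = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    eps = 0.0
    out = []
    for _ in range(n):
        sigma2 = omega + alpha * eps ** 2 + beta * sigma2
        eps = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
        out.append(eps)
    return out
```

    Over a long sample the realized variance hovers near the unconditional value omega / (1 - alpha - beta), while the recursion produces the volatility clustering that motivates GARCH as an uncertainty measure.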

  6. Short-time maximum entropy method analysis of molecular dynamics simulation: Unimolecular decomposition of formic acid

    Science.gov (United States)

    Takahashi, Osamu; Nomura, Tetsuo; Tabayashi, Kiyohiko; Yamasaki, Katsuyoshi

    2008-07-01

    We performed spectral analysis by using the maximum entropy method instead of the traditional Fourier transform technique to investigate the short-time behavior in molecular systems, such as the energy transfer between vibrational modes and chemical reactions. This procedure was applied to direct ab initio molecular dynamics calculations for the decomposition of formic acid. More reactive trajectories of dehydration than of decarboxylation were obtained for Z-formic acid, which is consistent with the predictions of previous theoretical and experimental studies. Short-time maximum entropy method analyses were performed for typical reactive and non-reactive trajectories. Spectrograms of a reactive trajectory were obtained; these clearly showed the reactant, transient, and product regions, especially for the dehydration path.

  7. Clinical observation of one time short-pulse pattern scan laser pan-retinal photocoagulation for proliferative diabetic retinopathy

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2016-04-01

    Full Text Available AIM: To investigate the clinical efficacy and benefit of short-pulse pattern scan laser (PASCAL) photocoagulation for proliferative diabetic retinopathy (PDR). METHODS: Twenty-eight PDR patients (42 eyes) who underwent short-pulse PASCAL pan-retinal photocoagulation (PRP) were analyzed. The best corrected visual acuity was ≥0.1 in 36 eyes. RESULTS: All the cases had no pain during the short-pulse PASCAL treatment. One year after treatment, the final visual acuity was improved in 6 eyes, kept stable in 28 eyes and decreased in 8 eyes; neovascularization regressed in 18 eyes (43%), was stable in 12 eyes (29%), and remained uncontrolled in 12 eyes (29%). Five eyes (12%) received vitrectomy due to vitreous hemorrhage. Compared with before the operation, retinal thickness in the central fovea of the macula and the visual field had no obvious change after one-time PASCAL PRP (P>0.05). CONCLUSION: The one-time short-pulse PASCAL PRP can stabilize the progression of PDR safely, effectively and simply.

  8. The effect of short-time active listening training.

    Science.gov (United States)

    Tatsumi, Asami; Sumiyoshi, Kenichi; Kawaguchi, Hitomi; Sano, Yukiko

    2010-01-01

    We conducted mental health training incorporating active listening for managers at a site of a general chemical company with 1,400 employees. Our purpose was to clarify the effect of active listening training of 2.5h. All subjects were managers. The mental health training was given to 229 managers, 21 times from May 2007 until March 2008. Surveys were conducted from May 2007 to September 2008. The training sessions were conducted in a company meeting room, starting at 2:00 p.m. The importance and significance of listening as a mental health measure and methods of active listening were explained in the training. Afterward, role-playing and follow-up discussions were done twice each. In summaries, participants wrote down what they noticed about listening and gave group presentations. The instructor commented on the presentations, and ended the session by passing out and explaining a paper summarizing what is important in listening. The training was evaluated with a questionnaire distributed at the completion of training, and questionnaires on implementation of what was learned were distributed 1, 3, and 6 mo later. The Active Listening Attitude Scale (ALAS; composed of two scales for method of listening and listening attitude) developed by Mishima et al. was also used before and 1, 3, and 6 mo after the training. In questionnaires distributed on the same day after training, 60% of the 212 respondents said the training time was just right, and 30.1% felt it was too short. The difficulty level of the training was considered appropriate by 77.8%, and 79.7% intended to implement what they had learned. Overall satisfaction was high at 85.9%. In the questionnaire 6 mo after training, 81.4% of the 145 respondents remembered the content of the training and 49.7% said they were practicing what they had learned. They responded that their conversations with subordinates about non-work topics had increased, and communication and support at work had become smoother. ALAS was

  9. New solutions for the short-time analysis of geothermal vertical boreholes

    Energy Technology Data Exchange (ETDEWEB)

    Lamarche, Louis; Beauchamp, Benoit [École de Technologie Supérieure, 1100 Notre-Dame Ouest, Montreal (Canada)

    2007-04-15

    Many models, either numerical or analytical, have been proposed to analyse the thermal response of the vertical heat exchangers used in ground coupled heat pump (GCHP) systems. In both approaches, most models are valid only after a few hours of operation, since they neglect the heat capacity of the borehole. This is acceptable for design purposes, where the time of interest is on the order of months and years. Recently, the short-time response of vertical boreholes has become a subject of interest. In this paper, we present a new analytical approach to this problem. It provides the exact solution for concentric cylinders and a good approximation for the familiar U-tube configuration. (author)

  10. Burn-up measurements of LEU fuel for short cooling times

    International Nuclear Information System (INIS)

    Pereda B, C.; Henriquez A, C.; Klein D, J.; Medel R, J.

    2005-01-01

    The measurements presented in this work were made essentially at the in-pool gamma-spectrometric facility installed in the secondary pool of the RECH-1 research reactor, where the measured fuel elements sit under 2 meters of water. The main reason for using the in-pool facility was its capability to measure the burnup of fuel elements without a long wait, that is, with only 5 cooling days, which is the usual interval between reactor operations. For these short cooling times, this work again confirms the possibility of using 95Zr as a promising burnup monitor, in spite of the rough approximations involved. The results are statistically reasonable within the range calculated using codes. The work corroborates previous results presented in Santiago de Chile and suggests future improvements along these lines. (author)

  11. A characterization of persistence at short times in the WFC3/IR detector

    Science.gov (United States)

    Gennaro, M.; Bajaj, V.; Long, K.

    2018-05-01

    Persistence in the WFC3/IR detector appears to decay as a power law as a function of time elapsed since the end of a stimulus. In this report we study departures from the power law at times shorter than a few hundred seconds after the stimulus. In order to have better short-time cadence, we use the Multiaccum (.ima) files, which trace the accumulated charge in the pixels as a function of time, rather than the final pipeline products (.flt files), which instead report the electron rate estimated via a linear fit to the accumulated charge vs. time relation. We note that at short times after the stimulus, the absolute change in persistence is strongest, so a linear fit to the accumulated signal (the .flt values) can be a poor representation of the strongly varying persistence signal. The previously observed power-law decay of the persistence signal still holds at shorter times, with typical values of the power-law index gamma in [-1, -0.8] for stimuli that saturate the WFC3 pixels. To a good degree of approximation, a single power law is a good fit to the persistence signal decay from 100 to 5000 seconds. We also detect a tapering-off in the power-law decay at increasingly shorter times. This change in behavior is of the order of Delta gamma = 0.02 - 0.05 when comparing power-law fits performed to the persistence signal from 0 up to 250 seconds and from 0 up to 4000 seconds after the stimulus, indicating that persistence decays slightly more rapidly as time progresses. Our results may suggest that at even shorter times, not probed by our study, the WFC3 persistence signal might deviate from a single power-law model.
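
    A power-law decay of this kind is conventionally fit by linear regression in log-log space. The sketch below recovers an assumed index from synthetic data; the value -0.9 is only an illustration within the reported range, not a fit to WFC3 data.

```python
import math

def fit_power_law(t, p):
    """Fit p = A * t**gamma by least squares on log(p) vs log(t).
    Returns (A, gamma)."""
    lx = [math.log(ti) for ti in t]
    ly = [math.log(pi) for pi in p]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    gamma = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
             / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - gamma * mx)
    return a, gamma

# synthetic persistence-like decay sampled from ~100 s to ~8600 s
t = [100.0 * 1.5 ** i for i in range(12)]
p = [50.0 * ti ** -0.9 for ti in t]
```

    Comparing fits over different time windows, as done in the report, amounts to running this regression on restricted subsets of t.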

  12. Computationally effective solution of the inverse problem in time-of-flight spectroscopy

    DEFF Research Database (Denmark)

    Kamran, Faisal; Abildgaard, Otto Højager Attermann; Subash, Arman Ahamed

    2015-01-01

    Photon time-of-flight (PTOF) spectroscopy enables the estimation of absorption and reduced scattering coefficients of turbid media by measuring the propagation time of short light pulses through turbid medium. The present investigation provides a comparison of the assessed absorption and reduced...

  13. Sorting on STAR. [CDC computer algorithm timing comparison

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
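
    Batcher's odd-even merge sort can be sketched as follows (serialized here in Python; on a machine like STAR each inner loop of compare-exchanges maps to vector operations). This standard formulation requires a power-of-two input length and performs O(N (log N)^2) comparisons.

```python
def compare_swap(a, i, j):
    if a[i] > a[j]:
        a[i], a[j] = a[j], a[i]

def oddeven_merge(a, lo, hi, r):
    """Merge step of Batcher's network on a[lo..hi] (inclusive bounds)."""
    step = r * 2
    if step < hi - lo:
        oddeven_merge(a, lo, hi, step)       # even subsequence
        oddeven_merge(a, lo + r, hi, step)   # odd subsequence
        for i in range(lo + r, hi - r, step):
            compare_swap(a, i, i + r)        # these run in parallel on vector HW
    else:
        compare_swap(a, lo, lo + r)

def batcher_sort(a, lo=0, hi=None):
    """Batcher odd-even merge sort; len(a) must be a power of two."""
    if hi is None:
        hi = len(a) - 1
    if hi - lo >= 1:
        mid = lo + (hi - lo) // 2
        batcher_sort(a, lo, mid)
        batcher_sort(a, mid + 1, hi)
        oddeven_merge(a, lo, hi, 1)

import random
rng = random.Random(1)
data = [rng.randrange(1000) for _ in range(64)]
expected = sorted(data)
batcher_sort(data)
```

    The fixed, data-independent comparison pattern is what makes the network vectorize well despite its worse asymptotic complexity than Quicksort.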

  14. Variation in computer time with geometry prescription in monte carlo code KENO-IV

    International Nuclear Information System (INIS)

    Gopalakrishnan, C.R.

    1988-01-01

    In most studies, the Monte Carlo criticality code KENO-IV has been compared with other Monte Carlo codes, but evaluation of its performance with different box descriptions has not been done so far. In Monte Carlo computations, any fractional saving of computing time is highly desirable. Variation in computation time with box description in KENO for two different fast reactor fuel subassemblies, of FBTR and PFBR, is studied. The K_eff of an infinite array of fuel subassemblies is calculated by modelling the subassemblies in two different ways: (i) multi-region, (ii) multi-box. In addition to these two cases, excess reactivity calculations of FBTR are also performed in two ways to study this effect in a complex geometry. It is observed that the K_eff values calculated by the multi-region and multi-box models agree very well. However, the increase in computation time from the multi-box to the multi-region model is considerable, while the difference in computer storage requirements for the two models is negligible. This variation in computing time arises from the way the neutron is tracked in the two cases. (author)

  15. High-Temperature-Short-Time Annealing Process for High-Performance Large-Area Perovskite Solar Cells.

    Science.gov (United States)

    Kim, Minjin; Kim, Gi-Hwan; Oh, Kyoung Suk; Jo, Yimhyun; Yoon, Hyun; Kim, Ka-Hyun; Lee, Heon; Kim, Jin Young; Kim, Dong Suk

    2017-06-27

    Organic-inorganic hybrid metal halide perovskite solar cells (PSCs) are attracting tremendous research interest due to their high solar-to-electric power conversion efficiency with a high possibility of cost-effective fabrication and certified power conversion efficiency now exceeding 22%. Although many effective methods for their application have been developed over the past decade, their practical transition to large-size devices has been restricted by difficulties in achieving high performance. Here we report on the development of a simple and cost-effective production method with high-temperature and short-time annealing processing to obtain uniform, smooth, and large-size grain domains of perovskite films over large areas. With high-temperature short-time annealing at 400 °C for 4 s, a perovskite film with an average domain size of 1 μm was obtained, the result of fast solvent evaporation. Solar cells fabricated using this processing technique had a maximum power conversion efficiency exceeding 20% over a 0.1 cm² active area and 18% over a 1 cm² active area. We believe our approach will enable the realization of highly efficient large-area PSCs for practical development with a very simple and short-time procedure. This simple method should lead the field toward the fabrication of uniform large-scale perovskite films, which are necessary for the production of high-efficiency solar cells and may also be applicable to several other material systems for more widespread practical deployment.

  16. SLMRACE: a noise-free RACE implementation with reduced computational time

    Science.gov (United States)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  17. Generic short-time propagation of sharp-boundaries wave packets

    Science.gov (United States)

    Granot, E.; Marchewka, A.

    2005-11-01

    A general solution to the "shutter" problem is presented. The propagation of an arbitrary initially bounded wave function is investigated, and the general solution for any such function is formulated. It is shown that the exact solution can be written as an expression that depends only on the values of the function (and its derivatives) at the boundaries. In particular, it is shown that at short times (t ≪ 2mx²/ħ, where x is the distance to the boundaries) the wave function propagation depends only on the wave function's values (or its derivatives) at the boundaries of the region. Finally, we generalize these findings to a non-singular wave function (i.e., for wave packets with finite-width boundaries) and suggest an experimental verification.
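    As a sketch of where the quoted short-time criterion comes from (a standard free-particle argument, not reproduced from the paper itself), recall the free propagator:

```latex
K(x,t) \;=\; \sqrt{\frac{m}{2\pi i \hbar t}}\;
\exp\!\left( \frac{i m x^{2}}{2 \hbar t} \right)
```

    The phase mx²/(2ħt) is of order unity or larger precisely when t ≲ 2mx²/ħ, so well inside this regime the propagator oscillates rapidly away from the boundaries and the propagation is dominated by the wave function's boundary values, consistent with the abstract's condition.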

  18. Efficient Geo-Computational Algorithms for Constructing Space-Time Prisms in Road Networks

    Directory of Open Access Journals (Sweden)

    Hui-Ping Chen

    2016-11-01

    Full Text Available The space-time prism (STP) is a key concept in time geography for analyzing human activity-travel behavior under various space-time constraints. Most existing time-geographic studies use a straightforward algorithm to construct STPs in road networks by using two one-to-all shortest path searches. However, this straightforward algorithm can introduce considerable computational overhead, given the fact that accessible links in an STP are generally a small portion of the whole network. To address this issue, an efficient geo-computational algorithm, called NTP-A*, is proposed. The proposed NTP-A* algorithm employs the A* and branch-and-bound techniques to discard inaccessible links during two shortest path searches, and thereby improves the STP construction performance. Comprehensive computational experiments are carried out to demonstrate the computational advantage of the proposed algorithm. Several implementation techniques, including the label-correcting technique and the hybrid link-node labeling technique, are discussed and analyzed. Experimental results show that the proposed NTP-A* algorithm can significantly improve STP construction performance in large-scale road networks by a factor of 100, compared with existing algorithms.
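    The straightforward baseline that NTP-A* improves on can be sketched as follows: run two one-to-all shortest path searches and keep only nodes that fit inside the time budget. This is an illustration of the baseline, not the optimized NTP-A* algorithm; the toy network and budget are hypothetical.

```python
import heapq

def dijkstra(adj, src):
    """One-to-all shortest path search over an adjacency dict."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

def stp_accessible(adj, origin, dest, budget):
    """Space-time prism membership: keep nodes v with
    dist(origin, v) + dist(v, dest) <= time budget."""
    d_o = dijkstra(adj, origin)
    d_d = dijkstra(adj, dest)  # undirected network: reverse search == forward
    inf = float("inf")
    return {v for v in adj
            if d_o.get(v, inf) + d_d.get(v, inf) <= budget}

# Toy undirected network: A-B-C-D chain plus a long detour via E.
edges = [("A", "B", 1), ("B", "C", 1), ("C", "D", 1), ("A", "E", 5), ("E", "D", 5)]
adj = {n: [] for n in "ABCDE"}
for u, v, w in edges:
    adj[u].append((v, w)); adj[v].append((u, w))

print(stp_accessible(adj, "A", "D", budget=3))  # E falls outside the prism
```

    NTP-A*'s contribution is to prune such inaccessible links during the searches themselves rather than after two full one-to-all searches.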

  19. Solving the 0/1 Knapsack Problem by a Biomolecular DNA Computer

    Directory of Open Access Journals (Sweden)

    Hassan Taghipour

    2013-01-01

    Full Text Available Solving some mathematical problems such as NP-complete problems by conventional silicon-based computers is problematic and takes a very long time. DNA computing is an alternative method of computing which uses DNA molecules for computing purposes. DNA computers have massive degrees of parallel processing capability. The massive parallel processing characteristic of DNA computers is of particular interest in solving NP-complete and hard combinatorial problems. NP-complete problems such as the knapsack problem and other hard combinatorial problems can be easily solved by DNA computers in a very short period of time compared to conventional silicon-based computers. Sticker-based DNA computing is one of the methods of DNA computing. In this paper, sticker-based DNA computing was used for solving the 0/1 knapsack problem. At first, a biomolecular solution space was constructed by using appropriate DNA memory complexes. Then, by the application of a sticker-based parallel algorithm using biological operations, the knapsack problem was resolved in polynomial time.
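    The solution-space idea can be mimicked sequentially: a sticker-based DNA computer effectively holds all 2^n candidate memory complexes at once and filters them with biological operations, whereas a conventional machine must enumerate them one by one. A hypothetical small instance:

```python
from itertools import product

def knapsack_bruteforce(weights, values, capacity):
    """Enumerate every subset (each bit string plays the role of one DNA
    memory complex), discard those over capacity, keep the best value."""
    best_value, best_subset = 0, ()
    for bits in product((0, 1), repeat=len(weights)):
        w = sum(wi for wi, b in zip(weights, bits) if b)
        if w <= capacity:
            v = sum(vi for vi, b in zip(values, bits) if b)
            if v > best_value:
                best_value, best_subset = v, bits
    return best_value, best_subset

# Hypothetical instance: 4 items, capacity 5.
print(knapsack_bruteforce([2, 3, 4, 5], [3, 4, 5, 6], 5))  # → (7, (1, 1, 0, 0))
```

    The exponential loop here is exactly the work the DNA computer performs in parallel across molecules.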

  20. Computational Procedures for a Class of GI/D/k Systems in Discrete Time

    Directory of Open Access Journals (Sweden)

    Md. Mostafizur Rahman

    2009-01-01

    Full Text Available A class of discrete time GI/D/k systems is considered for which the interarrival times have finite support and customers are served in first-in first-out (FIFO) order. The system is formulated as a single server queue with new general independent interarrival times and constant service duration by assuming cyclic assignment of customers to the identical servers. Then the queue length is set up as a quasi-birth-death (QBD) type Markov chain. It is shown that this transformed GI/D/1 system has special structures which make the computation of the matrix R simple and efficient, thereby reducing the number of multiplications in each iteration significantly. As a result we were able to keep the computation time very low. Moreover, use of the resulting structural properties makes the computation of the distribution of queue length of the transformed system efficient. The computation of the distribution of waiting time is also shown to be simple by exploiting the special structures.
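    For context, the generic computation the paper accelerates is the fixed-point iteration for the QBD rate matrix R. The sketch below uses small hypothetical blocks A0 (level up), A1 (same level), A2 (level down) and the plain iteration, without the structural shortcuts the paper exploits:

```python
import numpy as np

# Hypothetical discrete-time QBD blocks; rows of A0 + A1 + A2 sum to 1,
# and downward drift exceeds upward drift, so the chain is stable.
A0 = np.array([[0.2, 0.0], [0.0, 0.2]])
A1 = np.array([[0.3, 0.2], [0.1, 0.2]])
A2 = np.array([[0.3, 0.0], [0.2, 0.3]])

def qbd_rate_matrix(A0, A1, A2, tol=1e-12, max_iter=10_000):
    """Fixed-point iteration R <- A0 + R A1 + R^2 A2, converging to the
    minimal nonnegative solution of R = A0 + R A1 + R^2 A2."""
    R = np.zeros_like(A0)
    for _ in range(max_iter):
        R_next = A0 + R @ A1 + R @ R @ A2
        if np.max(np.abs(R_next - R)) < tol:
            return R_next
        R = R_next
    raise RuntimeError("R iteration did not converge")

R = qbd_rate_matrix(A0, A1, A2)
print(R)
```

    Each iteration costs two matrix multiplications in general; the paper's point is that the transformed GI/D/1 structure makes these multiplications much cheaper.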

  1. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo; Abdelaziz, Ibrahim; Aldilaijan, Abdulla; Canini, Marco; Kalnis, Panos

    2017-01-01

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.

  2. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo

    2017-11-27

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.

  3. Decoherence in adiabatic quantum computation

    Science.gov (United States)

    Albash, Tameem; Lidar, Daniel A.

    2015-06-01

    Recent experiments with increasingly larger numbers of qubits have sparked renewed interest in adiabatic quantum computation, and in particular quantum annealing. A central question that is repeatedly asked is whether quantum features of the evolution can survive over the long time scales used for quantum annealing relative to standard measures of the decoherence time. We reconsider the role of decoherence in adiabatic quantum computation and quantum annealing using the adiabatic quantum master-equation formalism. We restrict ourselves to the weak-coupling and singular-coupling limits, which correspond to decoherence in the energy eigenbasis and in the computational basis, respectively. We demonstrate that decoherence in the instantaneous energy eigenbasis does not necessarily detrimentally affect adiabatic quantum computation, and in particular that a short single-qubit T2 time need not imply adverse consequences for the success of the quantum adiabatic algorithm. We further demonstrate that boundary cancellation methods, designed to improve the fidelity of adiabatic quantum computing in the closed-system setting, remain beneficial in the open-system setting. To address the high computational cost of master-equation simulations, we also demonstrate that a quantum Monte Carlo algorithm that explicitly accounts for a thermal bosonic bath can be used to interpolate between classical and quantum annealing. Our study highlights and clarifies the significantly different role played by decoherence in the adiabatic and circuit models of quantum computing.

  4. Identification of the structure parameters using short-time non-stationary stochastic excitation

    Science.gov (United States)

    Jarczewska, Kamila; Koszela, Piotr; Śniady, Paweł; Korzec, Aleksandra

    2011-07-01

    In this paper, we propose an approach to the flexural stiffness or eigenvalue frequency identification of a linear structure using a non-stationary stochastic excitation process. The idea of the proposed approach lies within time domain input-output methods. The proposed method is based on transforming the dynamical problem into a static one by integrating the input and the output signals. The output signal is the structure reaction, i.e. structure displacements due to the short-time, irregular load of random type. The systems with single and multiple degrees of freedom, as well as continuous systems are considered.
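    The core trick, transforming the dynamic problem into a static one by integrating the signals, can be illustrated on an undamped single-degree-of-freedom system (an illustrative sketch under simplified assumptions, not the paper's exact formulation): with zero initial conditions, integrating m·x'' + k·x = f(t) twice gives m·x(t) + k·I²x(t) = I²f(t), so k follows from a static least-squares fit.

```python
import numpy as np

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral, same length as y, starting at 0."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)
    return out

m_true, k_true, dt, n = 2.0, 50.0, 1e-3, 2000
t = np.arange(n) * dt
f = np.where(t < 0.1, 10.0, 0.0)       # short pulse load, zero afterwards

# Simulate the response x(t) with velocity Verlet (zero initial conditions).
x = np.zeros(n)
v = 0.0
a = (f[0] - k_true * x[0]) / m_true
for i in range(n - 1):
    x[i + 1] = x[i] + v * dt + 0.5 * a * dt * dt
    a_new = (f[i + 1] - k_true * x[i + 1]) / m_true
    v += 0.5 * (a + a_new) * dt
    a = a_new

# Double-integrate input and output, then solve the *static* relation for k.
I2x = cumtrapz(cumtrapz(x, dt), dt)
I2f = cumtrapz(cumtrapz(f, dt), dt)
k_est = np.linalg.lstsq(I2x[:, None], I2f - m_true * x, rcond=None)[0][0]
print(k_est)  # close to the true stiffness of 50
```

    No differentiation of the measured response is needed, which is what makes the approach robust to the short, irregular random-type loads the paper considers.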

  5. Automated Detection of Short Optical Transients of Astrophysical Origin in Real Time

    Directory of Open Access Journals (Sweden)

    Marcin Sokołowski

    2010-01-01

    Full Text Available The detection of short optical transients of astrophysical origin in real time is an important task for existing robotic telescopes. The faster a new optical transient is detected, the earlier follow-up observations can be started. The sooner the object is identified, the more data can be collected before the source fades away, particularly in the most interesting early period of the transient. In this paper, the real-time pipeline designed for the identification of optical flashes within the "Pi of the Sky" project is presented in detail, together with solutions used by other experiments.

  6. Short-term forecasts of district heating load and outdoor temperature by use of on-line connected computers; Korttidsprognoser foer fjaerrvaermelast och utetemperatur med on-linekopplade datorer

    Energy Technology Data Exchange (ETDEWEB)

    Malmstroem, B; Ernfors, P; Nilsson, Daniel; Vallgren, H [Chalmers Tekniska Hoegskola, Goeteborg (Sweden). Institutionen foer Energiteknik

    1996-10-01

    In this report the available methods for forecasting weather and district heating load have been studied. A forecast method based on neural networks has been tested against the more common statistical methods. The accuracy of the weather forecasts from the SMHI (Swedish Meteorological and Hydrological Institute) has been estimated. In connection with these tests, the possibilities of improving the forecasts by using on-line connected computers have been analysed. The most important results from the study are: Energy company staff generally look upon the forecasting of district heating load as a problem of such a magnitude that computer support is needed. At the companies where computer calculated forecasts are in use, their accuracy is regarded as quite satisfactory; The interest in computer produced load forecasts among energy company staff is increasing; At present, a sufficient number of commercial suppliers of weather forecasts as well as load forecasts is available to fulfill the needs of energy companies; Forecasts based on neural networks did not attain any precision improvement in comparison to more traditional statistical methods. There may, though, be other types of neural networks, not tested in this study, that are capable of improving the forecast precision; Forecasts of outdoor temperature and district heating load can be significantly improved through the use of on-line connected computers supplied with instantaneous measurements of temperature and load. This study shows that a general reduction of the load prediction errors by approximately 15% is attainable. For short time horizons (less than 5 hours), more extensive load prediction error reductions can be reached. For the 1-hour time horizon, the possible reduction amounts to up to 50%. 21 refs, 4 figs, 7 appendices
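    The mechanism behind the short-horizon gains can be sketched with a simple on-line correction: blend the most recent measured forecast error into the baseline forecast and fade it out with horizon. This is a generic illustration, not the report's method; the decay constant is a hypothetical tuning parameter.

```python
import math

def corrected_forecast(baseline, last_error, horizon_h, decay_h=5.0):
    """On-line correction: add the most recent forecast error (measured
    now, actual minus forecast) to the baseline, decayed with horizon.
    decay_h is a hypothetical time constant in hours."""
    return baseline + last_error * math.exp(-horizon_h / decay_h)

# Suppose the baseline load forecast currently runs 2.0 units low.
for h in (1, 3, 6, 12):
    print(h, corrected_forecast(baseline=5.0, last_error=2.0, horizon_h=h))
```

    The correction is largest at the 1-hour horizon and negligible beyond the decay scale, mirroring the report's finding that on-line measurements help most for horizons under about 5 hours.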

  7. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though the microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advance features to administrate the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...

  8. Investigation of the motion of diesel injection jets using high-speed cinematography and short time holography

    International Nuclear Information System (INIS)

    Eisfeld, F.

    1987-01-01

    The knowledge about the penetration of diesel injection jets, particularly about the flow within the short distance behind the nozzle and the formation of droplets from an injection jet, is very limited. Experimental investigations are required to describe the process of penetration and spreading of the jet. The research method combines high-speed cinematography and short-time holography. Problems in the investigation method are described

  9. 10th and 11th conference on Ultra-Wideband Short-Pulse Electromagnetics

    CERN Document Server

    Mokole, Eric; UWB SP 10; UWB SP 11

    2014-01-01

    This book presents contributions of deep technical content and high scientific quality in the areas of electromagnetic theory, scattering, UWB antennas, UWB systems, ground penetrating radar (GPR), UWB communications, pulsed-power generation, time-domain computational electromagnetics, UWB compatibility, target detection and discrimination, propagation through dispersive media, and wavelet and multi-resolution techniques. Ultra-wideband (UWB), short-pulse (SP) electromagnetics are now being used for an increasingly wide variety of applications, including collision avoidance radar, concealed object detection, and communications. Notable progress in UWB and SP technologies has been achieved by investigations of their theoretical bases and improvements in solid-state manufacturing, computers, and digitizers. UWB radar systems are also being used for mine clearing, oil pipeline inspections, archeology, geology, and electronic effects testing. Like previous books in this series, Ultra-Wideband Short-Pulse Electrom...

  10. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    Science.gov (United States)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real-time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from

  11. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.

  12. Computing the Maximum Detour of a Plane Graph in Subquadratic Time

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    Let G be a plane graph where each edge is a line segment. We consider the problem of computing the maximum detour of G, defined as the maximum over all pairs of distinct points p and q of G of the ratio between the distance between p and q in G and the distance |pq|. The fastest previously known algorithm for this problem has O(n^2) running time. We show how to obtain O(n^{3/2}*(log n)^3) expected running time. We also show that if G has bounded treewidth, its maximum detour can be computed in O(n*(log n)^3) expected time.
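    The quadratic baseline the paper improves on looks roughly like this sketch: all-pairs shortest paths over the vertices, then the maximum ratio of graph distance to Euclidean distance. Note the true maximum detour ranges over all points of G, not just vertices, so this vertex-pair version is only a lower bound used here to illustrate the quantity; the square graph below is a hypothetical example.

```python
import heapq, math

def dijkstra(adj, src):
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

def max_detour_vertices(points, edges):
    """O(n^2)-style baseline: max over vertex pairs of
    (distance in G) / (Euclidean distance |pq|)."""
    adj = {i: [] for i in points}
    for u, v in edges:
        w = math.dist(points[u], points[v])
        adj[u].append((v, w)); adj[v].append((u, w))
    best = 1.0
    for u in points:
        d = dijkstra(adj, u)
        for v in points:
            if v != u:
                best = max(best, d[v] / math.dist(points[u], points[v]))
    return best

# Unit square drawn with its four sides only: the worst vertex pair is a diagonal.
pts = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}
print(max_detour_vertices(pts, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # sqrt(2)
```

    On this square the worst pair over all points (midpoints of opposite sides, detour 2) exceeds the vertex-pair value of √2, which is exactly why the continuous version of the problem is harder.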

  13. Computer-determined assay time based on preset precision

    International Nuclear Information System (INIS)

    Foster, L.A.; Hagan, R.; Martin, E.R.; Wachter, J.R.; Bonner, C.A.; Malcom, J.E.

    1994-01-01

    Most current assay systems for special nuclear materials (SNM) operate on the principle of a fixed assay time which provides acceptable measurement precision without sacrificing the required throughput of the instrument. Waste items to be assayed for SNM content can contain a wide range of nuclear material. Counting all items for the same preset assay time results in a wide range of measurement precision and wastes time at the upper end of the calibration range. A short time sample taken at the beginning of the assay could optimize the analysis time on the basis of the required measurement precision. To illustrate the technique of automatically determining the assay time, measurements were made with a segmented gamma scanner at the Plutonium Facility of Los Alamos National Laboratory with the assay time for each segment determined by counting statistics in that segment. Segments with very little SNM were quickly determined to be below the lower limit of the measurement range and the measurement was stopped. Segments with significant SNM were optimally assayed to the preset precision. With this method the total assay time for each item is determined by the desired preset precision. This report describes the precision-based algorithm and presents the results of measurements made to test its validity.
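    The counting-statistics relationship behind a precision-based stop rule is simple: with N counts the relative standard deviation is 1/√N, so a preset precision p fixes the required counts and hence the assay time. The sketch below is a generic illustration (the rate would in practice come from the short pre-count the report describes), not the report's algorithm:

```python
import math

def assay_time(count_rate_cps, target_rel_precision):
    """Time needed so that counting statistics meet a preset precision:
    relative sigma = 1/sqrt(N)  =>  N >= 1/p^2  =>  time = N / rate."""
    n_required = math.ceil(1.0 / target_rel_precision ** 2)
    return n_required / count_rate_cps

print(assay_time(1000.0, 0.01))  # 10000 counts needed -> 10.0 s
```

    A hot segment (high rate) reaches the preset precision quickly, while a segment near background would hit a time cap or be declared below the measurement range, matching the behaviour described in the abstract.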

  14. Short-Term Bus Passenger Demand Prediction Based on Time Series Model and Interactive Multiple Model Approach

    Directory of Open Access Journals (Sweden)

    Rui Xue

    2015-01-01

    Full Text Available Although bus passenger demand prediction has attracted increased attention during recent years, limited research has been conducted in the context of short-term passenger demand forecasting. This paper proposes an interactive multiple model (IMM) filter algorithm-based model to predict short-term passenger demand. After being aggregated into 15 min intervals, passenger demand data collected from a busy bus route over four months were used to generate time series. Considering that passenger demand exhibits various characteristics in different time scales, three time series were developed, named weekly, daily, and 15 min time series. After the correlation, periodicity, and stationarity analyses, time series models were constructed. Particularly, the heteroscedasticity of time series was explored to achieve better prediction performance. Finally, the IMM filter algorithm was applied to combine individual forecasting models and dynamically predict passenger demand for the next interval. Different error indices were adopted for the analyses of individual and hybrid models. The performance comparison indicates that hybrid model forecasts are superior to individual ones in accuracy. Findings of this study are of theoretical and practical significance in bus scheduling.
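    The combination step can be sketched in a simplified IMM flavour (not the paper's full filter): each candidate model's weight is updated by the Gaussian likelihood of its most recent one-step error, then the forecasts are averaged with the normalized weights. All numbers and the noise scale sigma below are hypothetical.

```python
import math

def imm_combine(forecasts, last_errors, weights, sigma=1.0):
    """Simplified IMM-style mixing: reweight models by the Gaussian
    likelihood of their latest one-step error, then blend forecasts."""
    likes = [w * math.exp(-0.5 * (e / sigma) ** 2)
             for w, e in zip(weights, last_errors)]
    total = sum(likes)
    new_w = [l / total for l in likes]
    combined = sum(w * f for w, f in zip(new_w, forecasts))
    return combined, new_w

# Two hypothetical models: a time-series model and a naive last-value model.
combined, w = imm_combine(forecasts=[120.0, 100.0],
                          last_errors=[2.0, 10.0],
                          weights=[0.5, 0.5], sigma=5.0)
print(combined, w)
```

    The recently accurate model dominates the blend, which is the mechanism by which the hybrid tracks regime changes better than any single model.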

  15. Design, functioning and possible applications of process computers

    International Nuclear Information System (INIS)

    Kussl, V.

    1975-01-01

    Process computers are useful as automation instruments a) when large numbers of data are processed in analog or digital form, b) for low data flow (data rate), and c) when data must be stored over short or long periods of time. (orig./AK) [de]

  16. Non-linear dynamical classification of short time series of the Rössler system in high noise regimes.

    Science.gov (United States)

    Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E; Poizner, Howard; Sejnowski, Terrence J

    2013-01-01

    Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of encephalographic (EEG) data recorded from patients with Parkinson's disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to -30 dB signal-to-noise ratios (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of non-linearity of the DDE model that best linearly separated the two classes of data. The distances d from the linear dividing hyperplane were then used to assess the classification performance by computing the area A' under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions, and moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data.
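    A discrete-time stand-in for fitting a global delay model can make the idea concrete (illustrative only; the paper's DDE models are continuous-time and selected over delays and non-linear orders): regress x[n] on delayed copies of itself, and use the coefficients and fitting error as classification features.

```python
import numpy as np

def fit_delay_model(x, delays=(1, 2)):
    """Least-squares fit of x[n] ~ sum_i a_i * x[n - d_i]; returns the
    delay coefficients and the RMS one-step prediction error."""
    d_max = max(delays)
    X = np.column_stack([x[d_max - d:len(x) - d] for d in delays])
    y = x[d_max:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rmse = float(np.sqrt(np.mean((X @ coef - y) ** 2)))
    return coef, rmse

# Synthetic series generated by a known linear delay rule.
x = np.empty(60)
x[0], x[1] = 1.0, 0.5
for n in range(2, 60):
    x[n] = 0.8 * x[n - 1] - 0.5 * x[n - 2]

coef, rmse = fit_delay_model(x)
print(coef, rmse)  # coefficients close to [0.8, -0.5]
```

    For classification, two dynamical regimes yield systematically different coefficient vectors, and their signed distance from a separating hyperplane plays the role of the parameter d in the abstract.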

  17. Time-Based Loss in Visual Short-Term Memory Is from Trace Decay, Not Temporal Distinctiveness

    Science.gov (United States)

    Ricker, Timothy J.; Spiegel, Lauren R.; Cowan, Nelson

    2014-01-01

    There is no consensus as to why forgetting occurs in short-term memory tasks. In past work, we have shown that forgetting occurs with the passage of time, but there are 2 classes of theories that can explain this effect. In the present work, we investigate the reason for time-based forgetting by contrasting the predictions of temporal…

  18. Evaluation of skeletal muscle during exercise on short repetition time MR imaging

    International Nuclear Information System (INIS)

    Yoshioka, Hiroshi; Niitsu, Mamoru; Anno, Izumi; Takahashi, Hideyuki; Kuno, Shinya; Matsumoto, Kunihiko; Itai, Yuji

    1992-01-01

    There have been many reports on the effects of exercise on skeletal muscle signal intensities based on magnetic resonance (MR) imaging. These images were obtained using T2-weighted MR images. The purpose of this study was to observe muscles during exercise while shortening the repetition time (TR) on spin echo images. In addition, inactive and active muscles were differentiated in the same manner. T2 values of the tibialis anterior m. were calculated from TR=400 ms to TR=3000 ms. These values were mostly constant and did not depend upon TR. Increases in signal intensities of the exercised muscles could be observed on the short TR (600 ms) MR images, since the changes in signal intensities mainly depend upon T2 values. Thus, the T2 value is useful as a quantitative index to assess exercised muscle even on short TR MR images. (author)
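    The quantitative index referred to here rests on the standard spin-echo decay law S(TE) = S0·exp(−TE/T2), from which two echo times suffice to estimate T2. A minimal sketch with hypothetical numbers (not data from the study):

```python
import math

def t2_from_two_echoes(s1, te1, s2, te2):
    """Spin-echo decay S(TE) = S0 * exp(-TE/T2); two echoes give
    T2 = (TE2 - TE1) / ln(S1 / S2)."""
    return (te2 - te1) / math.log(s1 / s2)

# Synthetic check with S0 = 100 a.u. and a true T2 of 50 ms.
s1 = 100 * math.exp(-10 / 50)   # signal at TE = 10 ms
s2 = 100 * math.exp(-20 / 50)   # signal at TE = 20 ms
print(t2_from_two_echoes(s1, 10, s2, 20))  # 50.0 ms, up to rounding
```

    Because T2 is TR-independent (as the abstract reports), a T2 estimate remains a valid exercise index even when short-TR images are acquired for speed.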

  19. Short echo time proton spectroscopy of the brain in healthy volunteers using an insert gradient head coil

    DEFF Research Database (Denmark)

    Gideon, P; Danielsen, E R; Schneider, M

    1995-01-01

    An insert gradient head coil with built-in X, Y, and Z gradients was used for localized proton spectroscopy in the brain of healthy volunteers, using short echo time stimulated echo acquisition mode (STEAM) sequences. Volume of interest size was 3.4 ml, repetition time was 6.0 s, and echo times were 10 and 20 ms, respectively. Good quality proton spectra with practically no eddy current artefacts were acquired, allowing observation of strongly coupled compounds and compounds with short T2 relaxation times. The gradient head coil thus permits further studies of compounds such as glutamine/glutamate and myo-inositols. These compounds were more prominent within grey matter than within white matter. Rough estimations of metabolite concentrations using water as an internal standard were in good agreement with previous reports.

  20. Applications of parallel computer architectures to the real-time simulation of nuclear power systems

    International Nuclear Information System (INIS)

    Doster, J.M.; Sills, E.D.

    1988-01-01

    In this paper the authors report on efforts to utilize parallel computer architectures for the thermal-hydraulic simulation of nuclear power systems and on research efforts toward the development of advanced reactor operator aids and control systems based on this new technology. Many aspects of reactor thermal-hydraulic calculations are inherently parallel, and the computationally intensive portions of these calculations can be effectively implemented on modern computers. Timing studies indicate faster-than-real-time, high-fidelity physics models can be developed when the computational algorithms are designed to take advantage of the computer's architecture. These capabilities allow for the development of novel control systems and advanced reactor operator aids. Coupled with an integral real-time data acquisition system, evolving parallel computer architectures can provide operators and control room designers improved control and protection capabilities. Research efforts are currently under way in this area.

  1. Nonlinear detection of disordered voice productions from short time series based on a Volterra-Wiener-Korenberg model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Yu, E-mail: yuzhang@xmu.edu.cn [Key Laboratory of Underwater Acoustic Communication and Marine Information Technology of the Ministry of Education, Xiamen University, Xiamen Fujian 361005 (China); Sprecher, Alicia J. [Department of Surgery, Division of Otolaryngology - Head and Neck Surgery, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792-7375 (United States); Zhao Zongxi [Key Laboratory of Underwater Acoustic Communication and Marine Information Technology of the Ministry of Education, Xiamen University, Xiamen Fujian 361005 (China); Jiang, Jack J. [Department of Surgery, Division of Otolaryngology - Head and Neck Surgery, University of Wisconsin School of Medicine and Public Health, Madison, WI 53792-7375 (United States)

    2011-09-15

    Highlights: > The VWK method effectively detects the nonlinearity of a discrete map. > The method describes the chaotic time series of a biomechanical vocal fold model. > Nonlinearity in laryngeal pathology is detected from short and noisy time series. - Abstract: In this paper, we apply the Volterra-Wiener-Korenberg (VWK) model method to detect nonlinearity in disordered voice productions. The VWK method effectively describes the nonlinearity of a third-order nonlinear map. It allows for the analysis of short and noisy data sets. The extracted VWK model parameters show agreement with the original nonlinear map parameters. Furthermore, the VWK model method is applied to successfully assess the nonlinearity of a biomechanical voice production model simulating irregular vibratory dynamics of vocal folds with a unilateral vocal polyp. Finally, we show the clinical applicability of this nonlinear detection method to analyze the electroglottographic data generated by 14 patients with vocal nodules or polyps. The VWK model method shows potential in describing the nonlinearity inherent in disordered voice productions from short and noisy time series that are common in the clinical setting.
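    The spirit of such a nonlinearity test can be sketched in a heavily simplified form (not the full Volterra-Wiener-Korenberg machinery): fit a linear and a polynomial one-step predictor to a short series and compare their errors; a markedly smaller nonlinear error flags nonlinear dynamics. The logistic map below is a hypothetical stand-in for the paper's third-order map.

```python
import numpy as np

def nonlinearity_scores(x):
    """Fit linear and quadratic one-step predictors by least squares and
    return their RMS errors; nonlinear dynamics show a clearly smaller
    error for the quadratic model."""
    X_lin = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    X_non = np.column_stack([np.ones(len(x) - 1), x[:-1], x[:-1] ** 2])
    y = x[1:]
    def rmse(X):
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return float(np.sqrt(np.mean((X @ coef - y) ** 2)))
    return rmse(X_lin), rmse(X_non)

# Short chaotic series from the logistic map x -> 3.8 x (1 - x).
x = np.empty(100)
x[0] = 0.3
for n in range(99):
    x[n + 1] = 3.8 * x[n] * (1 - x[n])

lin_err, non_err = nonlinearity_scores(x)
print(lin_err, non_err)  # the nonlinear model fits far better
```

    In clinical use the same comparison would be run on short, noisy electroglottographic segments, with surrogate data providing the significance threshold.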

  2. Nonlinear detection of disordered voice productions from short time series based on a Volterra-Wiener-Korenberg model

    International Nuclear Information System (INIS)

    Zhang Yu; Sprecher, Alicia J.; Zhao Zongxi; Jiang, Jack J.

    2011-01-01

    Highlights: → The VWK method effectively detects the nonlinearity of a discrete map. → The method describes the chaotic time series of a biomechanical vocal fold model. → Nonlinearity in laryngeal pathology is detected from short and noisy time series. - Abstract: In this paper, we apply the Volterra-Wiener-Korenberg (VWK) model method to detect nonlinearity in disordered voice productions. The VWK method effectively describes the nonlinearity of a third-order nonlinear map. It allows for the analysis of short and noisy data sets. The extracted VWK model parameters show agreement with the original nonlinear map parameters. Furthermore, the VWK model method is applied to successfully assess the nonlinearity of a biomechanical voice production model simulating irregular vibratory dynamics of vocal folds with a unilateral vocal polyp. Finally, we show the clinical applicability of this nonlinear detection method to analyze the electroglottographic data generated by 14 patients with vocal nodules or polyps. The VWK model method shows potential in describing the nonlinearity inherent in disordered voice productions from short and noisy time series that are common in the clinical setting.

  3. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a year-long simulation at 1-second resolution is often required, which could take conventional computers 10 to 120 hours of computational time when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  4. Computationally determining the salience of decision points for real-time wayfinding support

    Directory of Open Access Journals (Sweden)

    Makoto Takemiya

    2012-06-01

    Full Text Available This study introduces the concept of computational salience to explain the discriminatory efficacy of decision points, which in turn may have applications to providing real-time assistance to users of navigational aids. This research compared algorithms for calculating the computational salience of decision points and validated the results via three methods: high-salience decision points were used to classify wayfinders; salience scores were used to weight a conditional probabilistic scoring function for real-time wayfinder performance classification; and salience scores were correlated with wayfinding-performance metrics. As an exploratory step to linking computational and cognitive salience, a photograph-recognition experiment was conducted. Results reveal a distinction between algorithms useful for determining computational and cognitive saliences. For computational salience, information about the structural integration of decision points is effective, while information about the probability of decision-point traversal shows promise for determining cognitive salience. Limitations from only using structural information and motivations for future work that include non-structural information are elicited.

  5. Computation of transit times using the milestoning method with applications to polymer translocation

    Science.gov (United States)

    Hawk, Alexander T.; Konda, Sai Sriharsha M.; Makarov, Dmitrii E.

    2013-08-01

    Milestoning is an efficient approximation for computing long-time kinetics and thermodynamics of large molecular systems, which are inaccessible to brute-force molecular dynamics simulations. A common use of milestoning is to compute the mean first passage time (MFPT) for a conformational transition of interest. However, the MFPT is not always the experimentally observed timescale. In particular, the duration of the transition path, or the mean transit time, can be measured in single-molecule experiments, such as studies of polymers translocating through pores and fluorescence resonance energy transfer studies of protein folding. Here we show how to use milestoning to compute transit times and illustrate our approach by applying it to the translocation of a polymer through a narrow pore.
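Once milestoning has estimated the milestone-to-milestone transition probabilities and mean milestone lifetimes from short trajectory fragments, the MFPT mentioned above follows from a single linear solve. A minimal sketch of that standard step (illustrative only; the paper's transit-time extension is not reproduced, and the toy network below is my own):

```python
import numpy as np

def mfpt_from_milestones(K, lifetimes, target):
    """MFPT to `target` from every milestone: make the target absorbing
    and solve (I - K_sub) tau = t over the remaining milestones, where
    K is the milestone transition matrix and t the mean lifetimes."""
    n = K.shape[0]
    keep = [i for i in range(n) if i != target]
    K_sub = K[np.ix_(keep, keep)]
    t = np.asarray(lifetimes, float)[keep]
    tau = np.linalg.solve(np.eye(len(keep)) - K_sub, t)
    out = np.zeros(n)
    out[keep] = tau
    return out

# Demo: 4 milestones on a line, reflecting at 0, absorbing at 3, unit
# lifetimes -- the classic random-walk answer is tau[0] = 3**2 = 9.
K = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])
tau = mfpt_from_milestones(K, [1.0, 1.0, 1.0, 1.0], target=3)
```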

  6. Age and admission times as predictive factors for failure of admissions to discharge-stream short-stay units.

    Science.gov (United States)

    Shetty, Amith L; Shankar Raju, Savitha Banagar; Hermiz, Arsalan; Vaghasiya, Milan; Vukasovic, Matthew

    2015-02-01

    Discharge-stream emergency short-stay units (ESSU) improve ED and hospital efficiency. Age of patients and time of hospital presentations have been shown to correlate with increasing complexity of care. We aim to determine whether an age and time cut-off could be derived to subsequently improve short-stay unit success rates. We conducted a retrospective audit on 6703 (5522 inclusions) patients admitted to our discharge-stream short-stay unit. Patients were classified as appropriate or inappropriate admissions, and deemed successful if discharged out of the unit within 24 h; and failures if they needed inpatient admission into the hospital. We calculated short-stay unit length of stay for patients in each of these groups. A 15% failure rate was deemed an acceptable key performance indicator (KPI) for our unit. There were 197 out of 4621 (4.3%, 95% CI 3.7-4.9%) patients up to the age of 70 who failed admission to ESSU compared with 67 out of 901 (7.4%, 95% CI 5.9-9.3%) patients older than 70 years. Patients over 70 years of age have higher rates of failure after admission to discharge-stream ESSU, although in appropriately selected discharge-stream patients no age group or time-band of presentation was associated with a failure rate beyond the stipulated KPI. © 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  7. A heterogeneous hierarchical architecture for real-time computing

    Energy Technology Data Exchange (ETDEWEB)

    Skroch, D.A.; Fornaro, R.J.

    1988-12-01

    The need for high-speed data acquisition and control algorithms has prompted continued research in the area of multiprocessor systems and related programming techniques. The result presented here is a unique hardware and software architecture for high-speed real-time computer systems. The implementation of a prototype of this architecture has required the integration of architecture, operating systems and programming languages into a cohesive unit. This report describes a Heterogeneous Hierarchical Architecture for Real-Time (H{sup 2}ART) and system software for program loading and interprocessor communication.

  8. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  9. Efficient quantum algorithm for computing n-time correlation functions.

    Science.gov (United States)

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  10. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Science.gov (United States)

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jiří

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
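The monomer-reconstruction step described above — recovering a consensus from the most frequent k-mers of reads in a cluster — can be illustrated with a heavily simplified greedy walk. The real pipeline works on graph clusters of reads; the function name, parameters, and demo monomer below are illustrative, not TAREAN's actual code.

```python
from collections import Counter

def reconstruct_monomer(reads, k, monomer_len):
    """Greedy consensus of a tandem-repeat monomer: seed with the most
    frequent k-mer, then repeatedly extend with the most frequent k-mer
    that overlaps the current suffix by k-1 bases."""
    counts = Counter(r[i:i + k] for r in reads for i in range(len(r) - k + 1))
    seq = max(counts, key=counts.get)          # most frequent k-mer as seed
    while len(seq) < monomer_len:
        suffix = seq[-(k - 1):]
        step = max((km for km in counts if km.startswith(suffix)),
                   key=counts.get)
        seq += step[-1]                        # append one base per step
    return seq

# Demo: short reads sampled from a perfect tandem array; the result is
# some rotation of the true monomer.
monomer = "ACGTTGCA"
genome = monomer * 50
reads = [genome[i:i + 20] for i in range(0, len(genome) - 20, 7)]
rec = reconstruct_monomer(reads, k=5, monomer_len=len(monomer))
```

Because the repeat is circular, any rotation of the monomer is an equally valid consensus, which is why the check below tests membership in a doubled monomer.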

  11. Computers live on in Colombian classrooms | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-02-08

    Feb 8, 2011 ... ... its Windows operating system and Office applications suite, and there is a wealth ... The key to the success of the program in so short a time is political will, ... is a fraction of the real cost to provide computers to 2 000 schools.

  12. What do short-term and long-term relationships look like? Building the relationship coordination and strategic timing (ReCAST) model.

    Science.gov (United States)

    Eastwick, Paul W; Keneski, Elizabeth; Morgan, Taylor A; McDonald, Meagan A; Huang, Sabrina A

    2018-05-01

    Close relationships research has examined committed couples (e.g., dating relationships, marriages) using intensive methods that plot relationship development over time. But a substantial proportion of people's real-life sexual experiences take place (a) before committed relationships become "official" and (b) in short-term relationships; methods that document the time course of relationships have rarely been applied to these contexts. We adapted a classic relationship trajectory-plotting technique to generate the first empirical comparisons between the features of people's real-life short-term and long-term relationships across their entire timespan. Five studies compared long-term and short-term relationships in terms of the timing of relationship milestones (e.g., flirting, first sexual intercourse) and the occurrence/intensity of important relationship experiences (e.g., romantic interest, strong sexual desire, attachment). As romantic interest was rising and partners were becoming acquainted, long-term and short-term relationships were indistinguishable. Eventually, romantic interest in short-term relationships plateaued and declined while romantic interest in long-term relationships continued to rise, ultimately reaching a higher peak. As relationships progressed, participants evidenced more features characteristic of the attachment-behavioral system (e.g., attachment, caregiving) in long-term than short-term relationships but similar levels of other features (e.g., sexual desire, self-promotion, intrasexual competition). These data inform a new synthesis of close relationships and evolutionary psychological perspectives called the Relationship Coordination and Strategic Timing (ReCAST) model. ReCAST depicts short-term and long-term relationships as partially overlapping trajectories (rather than relationships initiated with distinct strategies) that differ in their progression along a normative relationship development sequence. 

  13. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special- purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allows, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. 
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that

  14. Climate Data Provenance Tracking for Just-In-Time Computation

    Science.gov (United States)

    Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.

    2016-12-01

    The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.

  15. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirement of applications in computational science continues to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slowdown the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real time applications require steady stream of processed data flow for visualization purposes. Hence, SIMPAR may prove to be a valuable tool to investigate types of applications and computing resource requirements to provide uninterrupted flow of processed data for real time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.
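The time-stamp bookkeeping such a simulator performs can be illustrated with a toy discrete-event loop (my own sketch, not SIMPAR's actual event model; names and parameters are illustrative): work units go to the earliest-free worker, and each result pays a fixed WAN latency before it is available.

```python
import heapq

def simulate_makespan(tasks, workers, compute_rate, wan_latency):
    """Event-driven sketch: pop the earliest-free worker, charge it
    compute time for the task, and record when the result arrives back
    over the WAN. Returns the time the last result arrives."""
    free = [(0.0, w) for w in range(workers)]   # (time worker frees up, id)
    heapq.heapify(free)
    last_arrival = 0.0
    for work in tasks:
        t, w = heapq.heappop(free)
        done_computing = t + work / compute_rate
        last_arrival = max(last_arrival, done_computing + wan_latency)
        heapq.heappush(free, (done_computing, w))
    return last_arrival
```

With four unit tasks on two unit-rate workers, a 0.5 s WAN latency pushes the makespan from 2.0 to 2.5, showing how communication overhead can dominate short computations — the slowdown the abstract warns about.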

  16. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Energy Technology Data Exchange (ETDEWEB)

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  17. The time of onset of abnormal calcification in spondylometaepiphyseal dysplasia, short limb-abnormal calcification type

    Energy Technology Data Exchange (ETDEWEB)

    Tueysuez, Beyhan [Istanbul University, Department of Pediatric Genetics, Cerrahpasa Medical School, Istanbul (Turkey); Gazioglu, Nurperi [Istanbul University, Department of Neurosurgery, Cerrahpasa Medical School, Istanbul (Turkey); Uenguer, Savas [Istanbul University, Department of Pediatric Radiology, Cerrahpasa Medical School, Istanbul (Turkey); Aji, Dolly Yafet [Istanbul University, Department of Pediatrics, Cerrahpasa Medical School, Istanbul (Turkey); Tuerkmen, Seval [Istanbul University, Department of Pediatric Genetics, Cerrahpasa Medical School, Istanbul (Turkey); Universitatsklinikum Berlin, Charite Virchow-Klinik, Berlin (Germany)

    2009-01-15

    A 1-month-old boy with shortness of extremities on prenatal US was referred to our department with a provisional diagnosis of achondroplasia. His height was normal but he had short extremities and platyspondyly, premature carpal epiphyses on both hands, and short tubular bones with irregular metaphyses on radiographs. Re-evaluation of the patient at the age of 1 year revealed very short height and premature calcification of the costal cartilages and epiphyses. Spondylometaepiphyseal dysplasia (SMED), short limb-abnormal calcification type was diagnosed. This condition is a very rare autosomal recessively inherited disorder, and most of the patients die in early childhood due to neurological involvement. At the age of 2 years and 5 months, a CT scan showed narrowing of the cervical spinal canal. One month later he died suddenly because of spinal cord injury. In conclusion early diagnosis is very important because the recurrence risk is high and patients may die due to early neurological complications. The time of onset of abnormal calcifications, a diagnostic finding of the disease, is at the age of around 1 year in most patients. When abnormal calcifications are not yet present, but radiological changes associated with SMED are present, this rare disease must be considered. (orig.)

  18. The time of onset of abnormal calcification in spondylometaepiphyseal dysplasia, short limb-abnormal calcification type

    International Nuclear Information System (INIS)

    Tueysuez, Beyhan; Gazioglu, Nurperi; Uenguer, Savas; Aji, Dolly Yafet; Tuerkmen, Seval

    2009-01-01

    A 1-month-old boy with shortness of extremities on prenatal US was referred to our department with a provisional diagnosis of achondroplasia. His height was normal but he had short extremities and platyspondyly, premature carpal epiphyses on both hands, and short tubular bones with irregular metaphyses on radiographs. Re-evaluation of the patient at the age of 1 year revealed very short height and premature calcification of the costal cartilages and epiphyses. Spondylometaepiphyseal dysplasia (SMED), short limb-abnormal calcification type was diagnosed. This condition is a very rare autosomal recessively inherited disorder, and most of the patients die in early childhood due to neurological involvement. At the age of 2 years and 5 months, a CT scan showed narrowing of the cervical spinal canal. One month later he died suddenly because of spinal cord injury. In conclusion early diagnosis is very important because the recurrence risk is high and patients may die due to early neurological complications. The time of onset of abnormal calcifications, a diagnostic finding of the disease, is at the age of around 1 year in most patients. When abnormal calcifications are not yet present, but radiological changes associated with SMED are present, this rare disease must be considered. (orig.)

  19. Projected Applications of a ``Climate in a Box'' Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed

  20. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. 
Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo

  1. Perceived problems with computer gaming and internet use among adolescents

    DEFF Research Database (Denmark)

    Holstein, Bjørn E; Pedersen, Trine Pagh; Bendtsen, Pernille

    2014-01-01

    BACKGROUND: Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer...... on weekdays on computer- and console-gaming and internet use for communication and surfing. The outcome measures were three indexes on perceived problems related to computer and console gaming and internet use. RESULTS: The three new indexes showed high face validity and acceptable internal consistency. Most...... schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (only boys) and internet use, odds ratios ranging from 6.90 to 10.23. CONCLUSION: The three...

  2. Real-time emergency forecasting technique for situation management systems

    Science.gov (United States)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

    The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of any emergency computational model applied for decision making in situation management systems. Computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast the short time-series data received from sensors and control systems. Reliability of the emergency forecasting results is ensured by filtering invalid sensor data using correlation analysis.
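As a reference point for the method named above, classic Brown's double exponential smoothing (the base method; the paper's fractal-dimension refinement and sensor-data filtering are not reproduced here, and the function name and parameters are illustrative) forecasts a short series as:

```python
def brown_forecast(series, alpha=0.5, horizon=1):
    """Brown's double exponential smoothing: two cascaded single
    smoothers yield level and trend estimates, which are then
    extrapolated `horizon` steps ahead."""
    s1 = s2 = float(series[0])
    for x in series:
        s1 = alpha * x + (1 - alpha) * s1          # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2         # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend
```

On a series with a steady linear trend, e.g. 1, 2, ..., 20, the one-step forecast converges to the next value, 21; the improvement studied in the paper adapts this scheme to the irregular short series produced by sensors.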

  3. New serial time codes for seismic short period and long period data acquisition systems

    International Nuclear Information System (INIS)

    Kolvankar, V.G.; Rao, D.S.

    1988-01-01

    This paper discusses a new time code for time indexing multichannel short period (1 to 25 Hz) seismic event data recorded on a single track of magnetic tape in digital format and discusses its usefulness in contrast to the Vela time code used in continuous analog multichannel data recording systems on multitrack instrumentation tape decks. This paper also discusses another time code, used for time indexing of seismic long period (DC to 2.5 seconds) multichannel data recorded on a single track of magnetic tape in digital format. The time code decoding and display system developed to provide quick access to any desired portion of the tape in both the data recording and reproduce systems is also discussed. (author). 7 figs

  4. Complications with computer-aided designed/computer-assisted manufactured titanium and soldered gold bars for mandibular implant-overdentures: short-term observations.

    Science.gov (United States)

    Katsoulis, Joannis; Wälchli, Julia; Kobel, Simone; Gholami, Hadi; Mericske-Stern, Regina

    2015-01-01

    Implant-overdentures supported by rigid bars provide stability in the edentulous atrophic mandible. However, fractures of solder joints and matrices, and loosening of screws and matrices were observed with soldered gold bars (G-bars). Computer-aided designed/computer-assisted manufactured (CAD/CAM) titanium bars (Ti-bars) may reduce technical complications due to enhanced material quality. The aim was to compare prosthetic-technical maintenance service of mandibular implant-overdentures supported by CAD/CAM Ti-bars and soldered G-bars. Edentulous patients were consecutively admitted for implant-prosthodontic treatment with a maxillary complete denture and a mandibular implant-overdenture connected to a rigid G-bar or Ti-bar. Maintenance service and problems with the implant-retention device complex and the prosthesis were recorded over a minimum of 3-4 years. Annual peri-implant crestal bone level changes (ΔBIC) were radiographically assessed. Data of 213 edentulous patients (mean age 68 ± 10 years), who had received a total of 477 tapered implants, were available. Ti-bar and G-bar comprised 101 and 112 patients with 231 and 246 implants, respectively. Ti-bars mostly exhibited distal bar extensions (96%) compared to 34% of G-bars. Implant-overdentures supported by soldered gold bars or milled CAD/CAM Ti-bars are a successful treatment modality but require regular maintenance service. These short-term observations support the hypothesis that CAD/CAM Ti-bars reduce technical complications. Fracture location indicated that the titanium thickness around the screw-access hole should be increased. © 2013 Wiley Periodicals, Inc.

  5. Optimal filtering of dynamics in short-time features for music organization

    DEFF Research Database (Denmark)

    Arenas-García, Jerónimo; Larsen, Jan; Hansen, Lars Kai

    2006-01-01

    There is an increasing interest in customizable methods for organizing music collections. Relevant music characterization can be obtained from short-time features, but it is not obvious how to combine them to get useful information. In this work, a novel method, denoted Positive Constrained Orthonormalized Partial Least Squares (POPLS), is proposed. Working on the periodograms of MFCC time series, this supervised method finds optimal filters which pick up the most discriminative temporal information for any music organization task. Two examples are presented in the paper, the first being a simple proof-of-concept, where an alto sax with and without vibrato is modelled. A more complex 11-genre music classification setup is also investigated to illustrate the robustness and validity of the proposed method on larger datasets. Both experiments showed the good properties of our method, as well…

  6. Application of computer graphics in the design of custom orthopedic implants.

    Science.gov (United States)

    Bechtold, J E

    1986-10-01

    Implementation of newly developed computer modelling techniques, computer graphics displays, and software has greatly aided the orthopedic design engineer and physician in creating a custom implant with good anatomic conformity in a short turnaround time. Further advances in computerized design and manufacturing will continue to simplify the development of custom prostheses and enlarge their niche in the joint replacement market.

  7. Short-time action electric generators to power physical devices

    International Nuclear Information System (INIS)

    Glebov, I.A.; Kasharskij, Eh.G.; Rutberg, F.G.; Khutoretskij, G.M.

    1982-01-01

    Requirements to be met by power-supply sources of domestic electrophysical facilities are analyzed, and trends in the design of foreign short-time-action electric machine units are considered. Specifications are presented for a generator manufactured as a synchronous bipolar turbogenerator with an all-forged rotor and indirect air cooling of the rotor and stator windings. The front parts of the stator winding are additionally fixed using glass-textolite rings, brackets and gaskets. A flywheel, manufactured as an all-forged steel cylinder, is joined directly to the generator rotor by means of a half-coupling. An asynchronous accelerating motor with a phase rotor of 4 MW nominal capacity is located on the opposite side of the flywheel. The generator peak power is 242 MV·A; the power factor is 0.9; the energy transferred to the load per pulse is 500 MJ; the flywheel weighs 81 t

  8. Nonequilibrium Physics at Short Time Scales: Formation of Correlations

    International Nuclear Information System (INIS)

    Peliti, L

    2005-01-01

    It is a happy situation when similar concepts and theoretical techniques can be applied to widely different physical systems because of a deep similarity in the situations being studied. The book illustrates this well; it focuses on the description of correlations in quantum systems out of equilibrium at very short time scales, prompted by experiments with short laser pulses in semiconductors and in complex reactions in heavy nuclei. In both cases the experiments are characterized by nonlinear dynamics and by strong correlations out of equilibrium. In some systems there are also important finite-size effects. The book comprises several independent contributions of moderate length, and I sometimes felt that a more intensive effort in cross-coordination of the different contributions would have helped. It is divided almost equally between theory and experiment. In the theoretical part, there is a thorough discussion of both the kinematic aspects (description of correlations) and the dynamical ones (evaluation of correlations). The experimental part is naturally divided according to the nature of the system: the interaction of pulsed lasers with matter on the one hand, and the correlations in finite-size systems (nanoparticles and nuclei) on the other. There is also a discussion of the dynamics of superconductors, a subject currently of great interest. Although an effort has been made to keep each contribution self-contained, I must admit that the reading level is uneven. However, there are a number of thorough and stimulating contributions that make this book a useful introduction to the topic at the level of graduate students or researchers acquainted with quantum statistical mechanics. (book review)

  9. The quantum computer game: citizen science

    Science.gov (United States)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context, with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.

  10. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
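
The adaptive idea in AGA can be illustrated with a deliberately small sketch (not the authors' algorithm): a genetic algorithm that assigns independent tasks to heterogeneous VMs and raises the mutation rate when population diversity collapses. Task lengths, VM speeds, and all parameters below are invented for illustration, and the paper's precedence constraints are omitted.

```python
import random

def makespan(assign, task_len, vm_speed):
    """Finish time of the busiest VM under a task -> VM assignment."""
    load = [0.0] * len(vm_speed)
    for t, v in enumerate(assign):
        load[v] += task_len[t] / vm_speed[v]
    return max(load)

def adaptive_ga(task_len, vm_speed, pop=30, gens=200, seed=1):
    """Tiny GA with an adaptive mutation rate: mutate more aggressively
    when the population has collapsed onto near-identical solutions."""
    rng = random.Random(seed)
    n, m = len(task_len), len(vm_speed)
    P = [[rng.randrange(m) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda a: makespan(a, task_len, vm_speed))
        diversity = len({tuple(a) for a in P}) / pop
        mut = 0.05 if diversity > 0.5 else 0.30   # adaptive step
        elite = P[: pop // 4]
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]             # one-point crossover
            for t in range(n):
                if rng.random() < mut:            # adaptive mutation
                    child[t] = rng.randrange(m)
            children.append(child)
        P = elite + children
    return min(P, key=lambda a: makespan(a, task_len, vm_speed))
```

With task lengths [4, 3, 2, 2, 1] and VM speeds [1, 2], the weighted-work lower bound on the makespan is 12/3 = 4.0, which the search approaches.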

  11. A computer-based time study system for timber harvesting operations

    Science.gov (United States)

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on the MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  12. Continuous-variable quantum computing in optical time-frequency modes using quantum memories.

    Science.gov (United States)

    Humphreys, Peter C; Kolthammer, W Steven; Nunn, Joshua; Barbieri, Marco; Datta, Animesh; Walmsley, Ian A

    2014-09-26

    We develop a scheme for time-frequency encoded continuous-variable cluster-state quantum computing using quantum memories. In particular, we propose a method to produce, manipulate, and measure two-dimensional cluster states in a single spatial mode by exploiting the intrinsic time-frequency selectivity of Raman quantum memories. Time-frequency encoding enables the scheme to be extremely compact, requiring a number of memories that are a linear function of only the number of different frequencies in which the computational state is encoded, independent of its temporal duration. We therefore show that quantum memories can be a powerful component for scalable photonic quantum information processing architectures.

  13. Key technologies of the server monitor and control system based on GSM short messages

    International Nuclear Information System (INIS)

    Chen Taiwei; Zhou Zhenliu; Liu Baoxu

    2007-01-01

    Network management based on the SNMP protocol cannot effectively monitor and control application-system states and key-process states on a computer server. Furthermore, it requires the administrator's constant surveillance: when the administrator leaves the computer, he cannot receive malfunction messages in time. In this paper we present a server monitor and control system based on monitor agents and GSM short messages, introduce the key technologies used to realize it, and implement a model system in a real network environment. (authors)

  14. VNAP2: a computer program for computation of two-dimensional, time-dependent, compressible, turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Cline, M.C.

    1981-08-01

    VNAP2 is a computer program for calculating turbulent (as well as laminar and inviscid), steady, and unsteady flow. VNAP2 solves the two-dimensional, time-dependent, compressible Navier-Stokes equations. The turbulence is modeled with either an algebraic mixing-length model, a one-equation model, or the Jones-Launder two-equation model. The geometry may be a single- or a dual-flowing stream. The interior grid points are computed using the unsplit MacCormack scheme. Two options to speed up the calculations for high Reynolds number flows are included. The boundary grid points are computed using a reference-plane-characteristic scheme with the viscous terms treated as source functions. An explicit artificial viscosity is included for shock computations. The fluid is assumed to be a perfect gas. The flow boundaries may be arbitrary curved solid walls, inflow/outflow boundaries, or free-jet envelopes. Typical problems that can be solved concern nozzles, inlets, jet-powered afterbodies, airfoils, and free-jet expansions. The accuracy and efficiency of the program are shown by calculations of several inviscid and turbulent flows. The program and its use are described completely, and six sample cases and a code listing are included.

  15. A Matter of Computer Time

    Science.gov (United States)

    Celano, Donna; Neuman, Susan B.

    2010-01-01

    Many low-income children do not have the opportunity to develop the computer skills necessary to succeed in our technological economy. Their only access to computers and the Internet--school, afterschool programs, and community organizations--is woefully inadequate. Educators must work to close this knowledge gap and to ensure that low-income…

  16. A note on computing average state occupation times

    Directory of Open Access Journals (Sweden)

    Jan Beyersmann

    2014-05-01

    Full Text Available Objective: This review discusses how biometricians would probably compute or estimate expected waiting times, if they had the data. Methods: Our framework is a time-inhomogeneous Markov multistate model, where all transition hazards are allowed to be time-varying. We assume that the cumulative transition hazards are given. That is, they are either known, as in a simulation, determined by expert guesses, or obtained via some method of statistical estimation. Our basic tool is product integration, which transforms the transition hazards into the matrix of transition probabilities. Product integration enjoys a rich mathematical theory, which has successfully been used to study probabilistic and statistical aspects of multistate models. Our emphasis will be on practical implementation of product integration, which allows us to numerically approximate the transition probabilities. Average state occupation times and other quantities of interest may then be derived from the transition probabilities.
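
The product integration the record describes can be sketched numerically: approximate the matrix of transition probabilities as a finite product of (I + dA(u)) factors over a grid of time points, where dA collects the transition hazard increments. A minimal sketch under assumed hazard functions; state occupation times would then follow by integrating the resulting occupation probabilities over time.

```python
import math

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def product_integral(hazards, t_max, steps):
    """Approximate P(0, t_max) = prod_u (I + dA(u)) for a
    time-inhomogeneous Markov multistate model.  hazards[i][j] is a
    callable giving the i -> j transition hazard at time u (off-diagonal
    entries only); diagonals of dA are set so each row sums to zero."""
    n = len(hazards)
    P = [[float(i == j) for j in range(n)] for i in range(n)]
    dt = t_max / steps
    for s in range(steps):
        u = (s + 0.5) * dt                     # midpoint of the interval
        dA = [[hazards[i][j](u) * dt if i != j else 0.0 for j in range(n)]
              for i in range(n)]
        for i in range(n):
            dA[i][i] = -sum(dA[i])
        step = [[float(i == j) + dA[i][j] for j in range(n)] for i in range(n)]
        P = matmul(P, step)                    # running product integral
    return P

# Two-state check with a constant 0 -> 1 hazard of 0.7: the occupation
# probability of state 0 should approach exp(-0.7 * t).
zero = lambda u: 0.0
haz = [[zero, lambda u: 0.7], [zero, zero]]
P = product_integral(haz, 2.0, 2000)
```

For this constant-hazard example, P[0][0] agrees with exp(-1.4) to about three decimal places, and each row of P still sums to one.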

  17. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
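
The method's core, the leapfrog update on Yee's staggered grid, fits in a few lines. A minimal 1D sketch in normalized units (c = 1, dx = 1, Courant number 0.5) with a soft Gaussian source and reflecting boundaries; the grid size and source parameters are arbitrary.

```python
import math

def fdtd_1d(nx=200, nt=300, src=100):
    """One-dimensional FDTD on a Yee grid: E and H live on staggered
    points and are updated leapfrog-fashion each time step."""
    S = 0.5                         # Courant number (stable for S <= 1)
    ez = [0.0] * nx                 # ez[0] and ez[nx-1] stay 0: PEC walls
    hy = [0.0] * (nx - 1)
    for n in range(nt):
        for i in range(nx - 1):     # H update from the curl of E
            hy[i] += S * (ez[i + 1] - ez[i])
        for i in range(1, nx - 1):  # E update from the curl of H
            ez[i] += S * (hy[i] - hy[i - 1])
        ez[src] += math.exp(-((n - 40) ** 2) / 100.0)  # soft Gaussian source
    return ez
```

The injected pulse splits, propagates in both directions, and reflects off the perfectly conducting ends; with S below the stability limit the field stays bounded.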

  18. Real-time data-intensive computing

    Energy Technology Data Exchange (ETDEWEB)

    Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander; MacDowell, Alastair A.; Padmore, Howard A.; Shapiro, David; Tamura, Nobumichi [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Beattie, Keith; Krishnan, Harinarayan; Patton, Simon J.; Perciano, Talita; Stromsness, Rune; Tull, Craig E.; Ushizima, Daniela [Computational Research Division, Lawrence Berkeley National Laboratory Berkeley CA 94720 (United States); Correa, Joaquin; Deslippe, Jack R. [National Energy Research Scientific Computing Center, Berkeley, CA 94720 (United States); Dart, Eli; Tierney, Brian L. [Energy Sciences Network, Berkeley, CA 94720 (United States); Daurer, Benedikt J.; Maia, Filipe R. N. C. [Uppsala University, Uppsala (Sweden); and others

    2016-07-27

    Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.

  19. Spectral phase encoding of ultra-short optical pulse in time domain for OCDMA application.

    Science.gov (United States)

    Wang, Xu; Wada, Naoya

    2007-06-11

    We propose a novel reconfigurable time-domain spectral phase encoding (SPE) scheme for coherent optical code-division multiple-access applications. In the proposed scheme, the ultra-short optical pulse is stretched by a dispersive device and the SPE is done in the time domain using a high-speed phase modulator. The time-domain SPE scheme is robust to wavelength drift of the light source and is very flexible and compatible with fiber optical systems. Proof-of-principle experiments of encoding with 16-chip, 20 GHz/chip binary-phase-shift-keying codes and 1.25 Gbps data transmission have been successfully demonstrated together with an arrayed-waveguide decoder.

  20. GRAPHIC, time-sharing magnet design computer programs at Argonne

    International Nuclear Information System (INIS)

    Lari, R.J.

    1974-01-01

    This paper describes three magnet design computer programs in use at the Zero Gradient Synchrotron of Argonne National Laboratory. These programs are used in the time-sharing mode in conjunction with a Tektronix model 4012 graphic display terminal. The first program is called TRIM, the second MAGNET, and the third GFUN. (U.S.)

  1. Short-time home coming project in evacuation zone

    International Nuclear Information System (INIS)

    Tatsuzaki, Hideo

    2011-01-01

    The accident at the Fukushima Daiichi Nuclear Power Plants (NPPs) forced neighboring residents to evacuate, and the evacuation zone (a 20 km radius from the NPPs) was defined as highly contaminated and designated a no-entry zone. Residents were obliged to live as evacuees for a longer period than expected, and a short-time home coming project was initiated in response to their requests. Residents came to a meeting point called the transfer place (20-30 km radius from the NPPs), put on protective clothing and personal dosimeters, took drinking water, and were brought by bus, together with staff, to their homes inside the evacuation zone. Healthcare management professionals were fully prepared for emergencies. After collecting necessary articles at home within two hours, residents returned to the meeting point by bus for screening and a change of clothing, and then went back to the refuge houses. If screening readings exceeded 13 kcpm on GM counters, partial body decontamination was conducted by wiping; if readings exceeded 100 kcpm, whole body decontamination was required, but this was never actually conducted. The dose to residents and staff was controlled to less than 1 mSv, the alarm level of the personal dosimeters. Stable iodine was prepared but not actually used. (T. Tanaka)

  2. Time computations in anuran auditory systems

    Directory of Open Access Journals (Sweden)

    Gary J Rose

    2014-05-01

    Full Text Available Temporal computations are important in the acoustic communication of anurans. In many cases, calls of closely related species are nearly identical spectrally but differ markedly in temporal structure. Depending on the species, calls can differ in pulse duration, shape and/or rate (i.e., amplitude modulation), direction and rate of frequency modulation, and overall call duration. Also, behavioral studies have shown that anurans are able to discriminate between calls that differ in temporal structure. In the peripheral auditory system, temporal information is coded primarily in the spatiotemporal patterns of activity of auditory-nerve fibers. However, major transformations in the representation of temporal information occur in the central auditory system. In this review I summarize recent advances in understanding how temporal information is represented in the anuran midbrain, with particular emphasis on mechanisms that underlie selectivity for pulse duration and pulse rate (i.e., intervals between onsets of successive pulses). Two types of neurons have been identified that show selectivity for pulse rate: long-interval cells respond well to slow pulse rates but fail to spike or respond phasically to fast pulse rates; conversely, interval-counting neurons respond to intermediate or fast pulse rates, but only after a threshold number of pulses, presented at optimal intervals, have occurred. Duration selectivity is manifest as short-pass, band-pass or long-pass tuning. Whole-cell patch recordings, in vivo, suggest that excitation and inhibition are integrated in diverse ways to generate temporal selectivity. In many cases, activity-related enhancement or depression of excitatory or inhibitory processes appears to contribute to selective responses.

  3. FRANTIC: a computer code for time dependent unavailability analysis

    International Nuclear Information System (INIS)

    Vesely, W.E.; Goldberg, F.F.

    1977-03-01

    The FRANTIC computer code evaluates the time-dependent and average unavailability for any general system model. The code is written in FORTRAN IV for the IBM 370 computer. Non-repairable components, monitored components, and periodically tested components are handled. One unique feature of FRANTIC is the detailed, time-dependent modeling of periodic testing, which includes the effects of test downtimes, test overrides, detection inefficiencies, and test-caused failures. The exponential distribution is used for the component failure times, and periodic equations are developed for the testing and repair contributions. Human errors and common mode failures can be included by assigning an appropriate constant probability for these contributors. The output from FRANTIC consists of tables and plots of the system unavailability along with a breakdown of the unavailability contributions. Sensitivity studies can be simply performed and a wide range of tables and plots can be obtained for reporting purposes. The FRANTIC code represents a first step in the development of an approach that can be of direct value in future system evaluations. Modifications resulting from use of the code, along with the development of reliability data based on operating reactor experience, can be expected to provide increased confidence in its use and potential application to the licensing process
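
The periodic-testing contribution FRANTIC models can be illustrated in stripped-down form: with exponential failure times and repair assumed instantaneous at each test, the unavailability rises between tests and resets afterwards. This sketch omits the test downtimes, overrides, detection inefficiencies, and test-caused failures that FRANTIC handles, and the rate and interval below are arbitrary.

```python
import math

def unavailability(t, lam, T):
    """Instantaneous unavailability of a periodically tested component:
    failures occur at constant rate lam and are only revealed (and the
    component instantly repaired) at tests spaced T hours apart."""
    tau = t % T                     # time elapsed since the last test
    return 1.0 - math.exp(-lam * tau)

def average_unavailability(lam, T, steps=20000):
    """Average of q(t) over one test interval (midpoint rule); for small
    lam*T this tends to the familiar lam*T/2 approximation."""
    dt = T / steps
    total = sum(unavailability((k + 0.5) * dt, lam, T) for k in range(steps))
    return total * dt / T
```

With lam = 1e-4 per hour and monthly tests (T = 720 h), the average unavailability comes out near 0.035, close to the lam*T/2 = 0.036 rule of thumb.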

  4. Drought analysis and short-term forecast in the Aison River Basin (Greece)

    Directory of Open Access Journals (Sweden)

    S. Kavalieratou

    2012-05-01

    Full Text Available A combined regional drought analysis and forecast is elaborated and applied to the Aison River Basin (Greece). The historical frequency, duration and severity were estimated using the standardized precipitation index (SPI) computed on variable time scales, while short-term drought forecast was investigated by means of 3-D loglinear models. A quasi-association model with homogeneous diagonal effect was proposed to fit the observed frequencies of class transitions of the SPI values computed on the 12-month time scale. Then, an adapted submodel was selected for each data set through the backward elimination method. The analysis and forecast of the drought class transition probabilities were based on the odds of the expected frequencies, estimated by these submodels, and the respective confidence intervals of these odds. The parsimonious forecast models fitted the observed data adequately. Results gave a comprehensive insight into drought behavior, highlighting a dominant drought period (1988-1991) with extreme drought events and revealing, in most cases, smooth drought class transitions. The proposed approach can be an efficient tool in regional water resources management and short-term drought warning, especially in irrigated districts.

  5. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural networks (ANN) and fuzzy logic approaches have proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of the neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, and peak flow estimation. It was observed that the ANFIS model fully preserves the potential of the ANN approach and eases the model-building process.

  6. Computational time analysis of the numerical solution of 3D electrostatic Poisson's equation

    Science.gov (United States)

    Kamboh, Shakeel Ahmed; Labadin, Jane; Rigit, Andrew Ragai Henri; Ling, Tech Chaw; Amur, Khuda Bux; Chaudhary, Muhammad Tayyab

    2015-05-01

    The 3D Poisson equation is solved numerically to simulate the electric potential in a prototype design of an electrohydrodynamic (EHD) ion-drag micropump. The finite difference method (FDM) is employed to discretize the governing equation. The system of linear equations resulting from the FDM is solved iteratively by using the sequential Jacobi (SJ) and sequential Gauss-Seidel (SGS) methods, and the simulation results are compared to examine the differences between them. The main objective was to analyze the computational time required by both methods with respect to different grid sizes, and to parallelize the Jacobi method to reduce the computational time. In general, the SGS method is faster than the SJ method, but the data parallelism of the Jacobi method may produce a good speedup over the SGS method. In this study, the feasibility of using a parallel Jacobi (PJ) method is assessed in relation to the SGS method. The MATLAB Parallel/Distributed computing environment is used, and a parallel code for the SJ method is implemented. It was found that for small grid sizes the SGS method remains dominant over the SJ and PJ methods, while for large grid sizes both sequential methods may take prohibitively long to converge. Yet, the PJ method reduces the computational time to some extent for large grid sizes.
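
The behaviour described, Gauss-Seidel converging in fewer sweeps than Jacobi on the same discretized Poisson problem, can be reproduced with a 1D stand-in for the paper's 3D equation; the iteration structure is identical, and the grid size and tolerance below are arbitrary.

```python
def solve_poisson_1d(n=50, method="jacobi", tol=1e-8, max_iter=100000):
    """Solve -u'' = 1 on (0, 1) with u(0) = u(1) = 0 on n interior points,
    iterating until successive updates differ by less than tol.
    Returns (solution, iterations used)."""
    h = 1.0 / (n + 1)
    f = h * h                       # right-hand side times h^2
    u = [0.0] * (n + 2)
    for it in range(1, max_iter + 1):
        if method == "jacobi":      # sweep using only old values
            new = u[:]
            for i in range(1, n + 1):
                new[i] = 0.5 * (u[i - 1] + u[i + 1] + f)
            diff = max(abs(a - b) for a, b in zip(new, u))
            u = new
        else:                       # gauss-seidel: update in place
            diff = 0.0
            for i in range(1, n + 1):
                v = 0.5 * (u[i - 1] + u[i + 1] + f)
                diff = max(diff, abs(v - u[i]))
                u[i] = v
        if diff < tol:
            return u, it
    return u, max_iter
```

On this problem Gauss-Seidel needs roughly half as many sweeps as Jacobi, and both converge to the exact solution u(x) = x(1 - x)/2 at the grid nodes; the Jacobi sweep, unlike Gauss-Seidel, touches only old values, which is what makes it embarrassingly parallel.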

  7. Rapid transfer of short-lived radioisotopes via a 2.4 km rabbit system

    Energy Technology Data Exchange (ETDEWEB)

    Burgerjon, J J; Gelbart, Z; Lau, V; Lehnart, D; Lenz, J; Pate, B D; Ruth, T J; Sprenger, H P; van Oers, N S.C.

    1984-09-01

    A 2.4 km long pipeline between a cyclotron and a hospital is used for the rapid transfer of short-lived radiopharmaceuticals. The vials containing the pharmaceuticals are placed inside capsules (rabbits) that are blown through a tube by means of compressed air. Travel times as short as 2 min are achieved, which makes the system suitable for the transfer of ¹⁵O, which has a 2 min half-life. The construction and test results of the system are described along with a computer model developed to explain some properties of the system. 7 references, 15 figures, 2 tables.

  8. Critical dynamics of the Potts model: short-time Monte Carlo simulations

    International Nuclear Information System (INIS)

    Silva, Roberto da; Drugowich de Felicio, J.R.

    2004-01-01

    We calculate the new dynamic exponent θ of the 4-state Potts model using short-time simulations. Our estimates θ₁ = -0.0471(33) and θ₂ = -0.0429(11), obtained by following the behavior of the magnetization or by measuring the evolution of the time correlation function of the magnetization, corroborate the conjecture by Okano et al. [Nucl. Phys. B 485 (1997) 727]. In addition, these values agree with a previous estimate of the same dynamic exponent for the two-dimensional Ising model with three-spin interactions in one direction, which is known to belong to the same universality class as the 4-state Potts model. The anomalous dimension of the initial magnetization x₀ = zθ + β/ν is calculated in an alternative way that mixes two different initial conditions. We have also estimated the values of the static exponents β and ν. They are in complete agreement with the pertinent results in the literature

  9. Time-resolved plasma spectroscopy of thin foils heated by a relativistic-intensity short-pulse laser

    International Nuclear Information System (INIS)

    Audebert, P.; Gauthier, J.-C.; Shepherd, R.; Fournier, K.B.; Price, D.; Lee, R.W.; Springer, P.; Peyrusse, O.; Klein, L.

    2002-01-01

    Time-resolved K-shell x-ray spectra are recorded from sub-100 nm aluminum foils irradiated by 150-fs laser pulses at relativistic intensities of Iλ² = 2×10¹⁸ W μm²/cm². The thermal penetration depth is greater than the foil thickness in these targets, so that uniform heating takes place at constant density before hydrodynamic motion occurs. The high-contrast, high-intensity laser pulse, broad spectral band, and short time resolution utilized in this experiment permit a simplified interpretation of the dynamical evolution of the radiating matter. The observed spectrum displays two distinct phases. At early time, ≤500 fs after detecting target emission, a broad quasicontinuous spectral feature with strong satellite emission from multiply excited levels is seen. At a later time, the He-like resonance line emission is dominant. The time-integrated data are in accord with previous studies with time resolution greater than 1 ps. The early-time satellite emission is shown to be a signature of an initial large-area, high-density, low-temperature plasma created in the foil by fast electrons accelerated by the intense radiation field in the laser spot. We conclude that, because of this early-time phenomenon and contrary to previous predictions, a short, high-intensity laser pulse incident on a thin foil does not create a uniform hot and dense plasma. The heating mechanism has been studied as a function of foil thickness, laser pulse length, and intensity. In addition, the spectra are found to be in broad agreement with a hydrodynamic expansion code postprocessed by a collisional-radiative model based on superconfiguration-average rates and on the unresolved transition array formalism

  10. Numerical solution of stiff burnup equation with short half lived nuclides by the Krylov subspace method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro; Sugimura, Naoki

    2007-01-01

    The Krylov subspace method is applied to solve the nuclide burnup equations used for lattice physics calculations. The Krylov method is an efficient approach for solving ordinary differential equations with stiff nature, such as nuclide burnup with short-lived nuclides. Some mathematical fundamentals of the Krylov subspace method and its application to burnup equations are discussed. Verification calculations are carried out in a PWR pin-cell geometry with UO₂ fuel. A detailed burnup chain that includes 193 fission products and 28 heavy nuclides is used in the verification calculations. The shortest half-life found in the present burnup chain is approximately 30 s (¹⁰⁶Rh). Therefore, conventional methods (e.g., the Taylor series expansion with scaling and squaring) tend to require longer computation time due to numerical stiffness. Comparison with other numerical methods (e.g., the fourth-order Runge-Kutta-Gill) reveals that the Krylov subspace method can provide an accurate solution for the detailed burnup chain used in the present study with short computation time. (author)
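
The core of the approach, approximating exp(tA)·N₀ from a small Arnoldi-generated Krylov basis rather than exponentiating the full burnup matrix, can be sketched as follows. The three-nuclide decay chain and its rate constants are invented for illustration, not taken from the paper's 221-nuclide chain.

```python
import numpy as np

def expm_small(H):
    """exp(H) for a small dense matrix via scaling-and-squaring with a
    truncated Taylor series (adequate for tiny Hessenberg matrices)."""
    s = max(0, int(np.ceil(np.log2(max(np.linalg.norm(H, 1), 1e-30)))) + 1)
    A = H / 2.0 ** s
    E, term = np.eye(len(H)), np.eye(len(H))
    for k in range(1, 20):
        term = term @ A / k
        E = E + term
    for _ in range(s):
        E = E @ E
    return E

def krylov_expm_times_v(A, v, t, m=10):
    """Approximate exp(t*A) @ v from an m-step Arnoldi factorization:
    exp(t*A) v ~= |v| * V_m exp(t*H_m) e_1."""
    n = len(v)
    m = min(m, n)
    beta = np.linalg.norm(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    E = expm_small(t * H[:m, :m])
    return beta * (V[:, :m] @ E[:, 0])
```

For a hypothetical chain N1 → N2 → N3 (dN/dt = A·N), the Krylov result matches the analytic exp(-λ₁t) for the parent and conserves the total nuclide number; for a real burnup matrix the payoff is that only the tiny m×m Hessenberg matrix is ever exponentiated.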

  11. Short- and long-run time-of-use price elasticities in Swiss residential electricity demand

    International Nuclear Information System (INIS)

    Filippini, Massimo

    2011-01-01

    This paper presents an empirical analysis on the residential demand for electricity by time-of-day. This analysis has been performed using aggregate data at the city level for 22 Swiss cities for the period 2000-2006. For this purpose, we estimated two log-log demand equations for peak and off-peak electricity consumption using static and dynamic partial adjustment approaches. These demand functions were estimated using several econometric approaches for panel data, for example LSDV and RE for static models, and LSDV and corrected LSDV estimators for dynamic models. The aim of this empirical analysis is to highlight some of the characteristics of the Swiss residential electricity demand. The estimated short-run own price elasticities are lower than 1, whereas in the long-run these values are higher than 1. The estimated short-run and long-run cross-price elasticities are positive. This result shows that peak and off-peak electricity are substitutes. In this context, time differentiated prices should provide an economic incentive to customers so that they can modify consumption patterns by reducing peak demand and shifting electricity consumption from peak to off-peak periods. - Highlights: → Empirical analysis on the residential demand for electricity by time-of-day. → Estimators for dynamic panel data. → Peak and off-peak residential electricity are substitutes.
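    In a log-log demand specification, the estimated coefficient on log price is itself the price elasticity. A minimal Python sketch with simulated data (all numbers, names, and the simple OLS setup are illustrative assumptions, not the Swiss panel or the paper's LSDV estimators):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
log_price = rng.normal(0.0, 0.3, n)            # log of peak-period price
log_income = rng.normal(10.0, 0.2, n)          # log of household income
true_price_elast, true_income_elast = -0.6, 0.8   # assumed "true" values
log_demand = (2.0 + true_price_elast * log_price
              + true_income_elast * log_income
              + rng.normal(0.0, 0.05, n))

# In a log-log model, the OLS slope on log price IS the price elasticity.
X = np.column_stack([np.ones(n), log_price, log_income])
beta, *_ = np.linalg.lstsq(X, log_demand, rcond=None)
price_elasticity = beta[1]
```

    A panel-data estimator such as LSDV would add city fixed effects to this design matrix; the elasticity interpretation of the slope is unchanged.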

  12. Spectrogram analysis of selected tremor signals using short-time Fourier transform and continuous wavelet transform

    Energy Technology Data Exchange (ETDEWEB)

    Bartosch, T. [Erlangen-Nuernberg Univ., Erlangen (Germany). Lehrstuhl fuer Nachrichtentechnik I; Seidl, D. [Seismologisches Zentralobservatorium Graefenberg, Erlangen (Germany). Bundesanstalt fuer Geowissenschaften und Rohstoffe

    1999-06-01

    Among a variety of spectrogram methods, the short-time Fourier transform (STFT) and the continuous wavelet transform (CWT) were selected to analyse transients in non-stationary signals. Depending on their properties, tremor signals from the volcanoes Mt. Stromboli, Mt. Semeru and Mt. Pinatubo were analyzed using both methods. The CWT can also be used to extend the definition of coherency into a time-varying coherency spectrogram. An example is given using array data from the volcano Mt. Stromboli (Italy).
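    As a hedged illustration of the STFT half of such an analysis, the sketch below builds a spectrogram of a synthetic "tremor" containing a transient burst; all signal parameters are invented for the example, and real volcano records would replace x:

```python
import numpy as np
from scipy.signal import stft

fs = 100.0                      # sampling rate, Hz (illustrative)
t = np.arange(0, 60, 1 / fs)
# Synthetic "tremor": a 2 Hz background plus a transient 8 Hz burst at 20-25 s.
x = np.sin(2 * np.pi * 2 * t)
burst = (t > 20) & (t < 25)
x[burst] += 2.0 * np.sin(2 * np.pi * 8 * t[burst])

f, tt, Zxx = stft(x, fs=fs, nperseg=256)
power = np.abs(Zxx) ** 2        # spectrogram: time-frequency power

# Dominant frequency during the burst stands out against the background.
mid = power[:, (tt > 20) & (tt < 25)].mean(axis=1)
peak_hz = f[np.argmax(mid)]
```

    The CWT analysis would replace the fixed-length STFT window with scaled wavelets, trading frequency resolution at high frequencies for better time localization of transients.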

  13. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time and non-polynomial-time reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.
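    For intuition, the baseline that any clever recursion must beat is brute-force enumeration over all 2^m edge-failure patterns, which is feasible only for tiny graphs. A minimal Python sketch of all-terminal edge reliability (i.i.d. edge survival with probability p is an assumption made for illustration, not the paper's model of Gilbert's formula):

```python
from itertools import combinations

def connected(n, edges):
    """Union-find check that all n vertices are joined by `edges`."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    for u, v in edges:
        parent[find(u)] = find(v)
    return len({find(i) for i in range(n)}) == 1

def edge_reliability(n, edges, p):
    """Probability the graph stays connected when each edge survives
    independently with probability p; cost is Theta(2^m) in the edge count m."""
    total = 0.0
    m = len(edges)
    for k in range(m + 1):
        for surviving in combinations(edges, k):
            if connected(n, surviving):
                total += p ** k * (1 - p) ** (m - k)
    return total

# Triangle: connected iff at least 2 of its 3 edges survive,
# so reliability = p^3 + 3*p^2*(1-p).
r = edge_reliability(3, [(0, 1), (1, 2), (0, 2)], p=0.9)
```

    The invariant-based recursion described in the record aims to collapse the exponentially many isomorphic subgraph states that this enumeration treats separately.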

  14. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand' as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  15. Real-time systems scheduling fundamentals

    CERN Document Server

    Chetto, Maryline

    2014-01-01

    Real-time systems are used in a wide range of applications, including control, sensing, multimedia, etc. Scheduling is a central problem for these computing/communication systems, since it is responsible for executing software in a timely manner. This book provides the state of knowledge in this domain, with special emphasis on the key results obtained within the last decade. It addresses foundations as well as the latest advances and findings in real-time scheduling, giving references to the important papers. Nevertheless, the chapters are kept short and are not overloaded with confusing details.

  16. 11th International Conference on Computer and Information Science

    CERN Document Server

    Computer and Information 2012

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the 11th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2012...

  17. Measurement of radon-222 concentration in environment sampled within short time using charcoal detector

    International Nuclear Information System (INIS)

    Yamasaki, Tadashi; Sekiyama, Shigenobu; Tokin, Mina; Nakayasu, Yumiko; Watanabe, Tamaki.

    1994-01-01

    The concentration of 222Rn in air sampled within a very short period of time was measured using activated charcoal as the adsorber. The detector is a plastic canister containing a mixture of activated charcoal and silica gel. The radon gas was adsorbed in the charcoal in a radon chamber at a temperature of 25°C. A small amount of liquid scintillation cocktail was added to the liquid scintillation counter vial together with the canister, and the radon in the charcoal was extracted into the cocktail. Alpha particles emitted from radon and its daughter nuclei in the cocktail were detected using the liquid scintillation counter. The present method has the advantages of not only a short air sampling time but also adsorption of radon in charcoal at a constant temperature. Radon concentrations in air down to 2 Bq/m³ could be detected. A kinetic model for the adsorption of radon in the charcoal is also presented. The ratio of the radon concentration in the charcoal to that in air under the equilibrium state of adsorption was estimated to be from 6.1 to 6.8 m³/kg at a temperature of 25°C. (author)

  18. Reactor safety: the Nova computer system

    International Nuclear Information System (INIS)

    Eisgruber, H.; Stadelmann, W.

    1991-01-01

    After instances of maloperation, the causes of defects, the effectiveness of the measures taken to control the situation, and possibilities to avoid future recurrences need to be investigated above all before the plant is restarted. The most important aspect in all these efforts is to check the sequence in time, and the completeness, of the control measures initiated automatically. For this verification, a computer system that produces the necessary information almost in real time is used instead of time-consuming manual analytical techniques. The results are available within minutes after completion of the automatically initiated measures. As all short-term safety functions are initiated by automatic systems, their consistent and comprehensive verification results in a clearly higher level of safety. The report covers the development of the computer system and its implementation in the Gundremmingen nuclear power station. Similar plans are being pursued in Biblis and Muelheim-Kaerlich. (orig.) [de

  19. Computing Refined Buneman Trees in Cubic Time

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Östlin, A.

    2003-01-01

    Reconstructing the evolutionary tree for a set of n species based on pairwise distances between the species is a fundamental problem in bioinformatics. Neighbor joining is a popular distance based tree reconstruction method. It always proposes fully resolved binary trees despite missing evidence...... in the underlying distance data. Distance based methods based on the theory of Buneman trees and refined Buneman trees avoid this problem by only proposing evolutionary trees whose edges satisfy a number of constraints. These trees might not be fully resolved but there is strong combinatorial evidence for each...... proposed edge. The currently best algorithm for computing the refined Buneman tree from a given distance measure has a running time of O(n⁵) and a space consumption of O(n⁴). In this paper, we present an algorithm with running time O(n³) and space consumption O(n²). The improved complexity of our...

  20. Short-term Forecasting Tools for Agricultural Nutrient Management.

    Science.gov (United States)

    Easton, Zachary M; Kleinman, Peter J A; Buda, Anthony R; Goering, Dustin; Emberston, Nichole; Reed, Seann; Drohan, Patrick J; Walter, M Todd; Guinan, Pat; Lory, John A; Sommerlot, Andrew R; Sharpley, Andrew

    2017-11-01

    The advent of real-time, short-term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high-performance computing and hydrologic or climate modeling have enabled rapid dissemination of real-time information that can assist landowners and conservation personnel with short-term management planning. This paper reviews short-term decision support tools for agriculture that are under various stages of development and implementation in the United States: (i) Wisconsin's Runoff Risk Advisory Forecast (RRAF) System, (ii) New York's Hydrologically Sensitive Area Prediction Tool, (iii) Virginia's Saturated Area Forecast Model, (iv) Pennsylvania's Fertilizer Forecaster, (v) Washington's Application Risk Management (ARM) System, and (vi) Missouri's Design Storm Notification System. Although these decision support tools differ in their underlying model structure, the resolution at which they are applied, and the hydroclimates to which they are relevant, all provide forecasts (range 24-120 h) of runoff risk or soil moisture saturation derived from National Weather Service Forecast models. Although this review highlights the need for further development of robust and well-supported short-term nutrient management tools, their potential for adoption and ultimate utility requires an understanding of the appropriate context of application, the strategic and operational needs of managers, access to weather forecasts, scales of application (e.g., regional vs. field level), data requirements, and outreach communication structure. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  1. Short-term outcomes and safety of computed tomography-guided percutaneous microwave ablation of solitary adrenal metastasis from lung cancer: A multi-center retrospective study

    Energy Technology Data Exchange (ETDEWEB)

    Men, Min; Ye, Xin; Yang, Xia; Zheng, Aimin; Huang, Guang Hui; Wei, Zhigang [Dept. of Oncology, Shandong Provincial Hospital Affiliated with Shandong University, Jinan (China); Fan, Wei Jun [Imaging and Interventional Center, Sun Yat-sen University Cancer Center, Guangzhou (China); Zhang, Kaixian [Dept. of Oncology, Teng Zhou Central People's Hospital Affiliated with Jining Medical College, Tengzhou (China); Bi, Jing Wang [Dept. of Oncology, Jinan Military General Hospital of Chinese People's Liberation Army, Jinan (China)

    2016-11-15

    To retrospectively evaluate the short-term outcomes and safety of computed tomography (CT)-guided percutaneous microwave ablation (MWA) of solitary adrenal metastasis from lung cancer. From May 2010 to April 2014, 31 patients with unilateral adrenal metastasis from lung cancer who were treated with CT-guided percutaneous MWA were enrolled. This study was conducted with approval from the local Institutional Review Board. Clinical outcomes and complications of MWA were assessed. The tumors ranged from 1.5 to 5.4 cm in diameter. After a median follow-up period of 11.1 months, the primary efficacy rate was 90.3% (28/31). Local tumor progression was detected in 7 (22.6%) of 31 cases. The median overall survival time was 12 months. The 1-year overall survival rate was 44.3%. The median local tumor progression-free survival time was 9 months. The local tumor progression-free survival rate was 77.4%. Of 36 MWA sessions, two (5.6%) had major complications (hypertensive crisis). CT-guided percutaneous MWA may be fairly safe and effective for treating solitary adrenal metastasis from lung cancer.

  2. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  3. Theory and computation of disturbance invariant sets for discrete-time linear systems

    Directory of Open Access Journals (Sweden)

    Kolmanovsky Ilya

    1998-01-01

    Full Text Available This paper considers the characterization and computation of invariant sets for discrete-time, time-invariant, linear systems with disturbance inputs whose values are confined to a specified compact set but are otherwise unknown. The emphasis is on determining maximal disturbance-invariant sets X that belong to a specified subset Γ of the state space. Such d-invariant sets have important applications in control problems where there are pointwise-in-time state constraints of the form x(t) ∈ Γ. One purpose of the paper is to unite and extend in a rigorous way disparate results from the prior literature. In addition there are entirely new results. Specific contributions include: exploitation of the Pontryagin set difference to clarify conceptual matters and simplify mathematical developments, special properties of maximal invariant sets and conditions for their finite determination, algorithms for generating concrete representations of maximal invariant sets, practical computational questions, extension of the main results to general Lyapunov stable systems, applications of the computational techniques to the bounding of state and output response. Results on Lyapunov stable systems are applied to the implementation of a logic-based, nonlinear multimode regulator. For plants with disturbance inputs and state-control constraints it enlarges the constraint-admissible domain of attraction. Numerical examples illustrate the various theoretical and computational results.
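    The standard recursion behind maximal d-invariant sets, X_{k+1} = {x ∈ X_k : Ax + w ∈ X_k for all admissible w}, can be illustrated in the scalar case, where every set is a symmetric interval and the Pontryagin difference is just a shrink of the bound. A hedged Python sketch (scalar only; the paper's algorithms operate on general polytopes, and the numbers below are illustrative):

```python
def maximal_invariant_bound(a, w_max, c, tol=1e-9, max_iter=1000):
    """For the scalar system x+ = a*x + w with |w| <= w_max and state
    constraint |x| <= c, iterate X_{k+1} = {x in X_k : a*x + w in X_k for
    all |w| <= w_max}. Sets are intervals [-b, b]; returns the converged
    bound b, or None if the maximal d-invariant set is empty."""
    b = c
    for _ in range(max_iter):
        if b - w_max < 0:          # no state can remain inside: empty set
            return None
        # Pontryagin difference [-b, b] minus disturbances, pulled back through a
        b_next = min(b, (b - w_max) / abs(a))
        if b - b_next < tol:       # fixed point: interval is d-invariant
            return b_next
        b = b_next
    return None

b1 = maximal_invariant_bound(a=0.5, w_max=0.2, c=1.0)  # the constraint set itself is invariant
b2 = maximal_invariant_bound(a=0.9, w_max=0.2, c=1.0)  # iteration empties the set
```

    The two calls illustrate the finite-determination question discussed in the record: the first converges immediately, while the second shrinks past zero because the disturbance is too large relative to the contraction of the dynamics.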

  4. Verification of short lead time forecast models: applied to Kp and Dst forecasting

    Science.gov (United States)

    Wintoft, Peter; Wik, Magnus

    2016-04-01

    In the ongoing EU/H2020 project PROGRESS, models that predict Kp, Dst, and AE from L1 solar wind data will be used as inputs to radiation belt models. The possible lead times from L1 measurements are shorter (tens of minutes to hours) than the typical duration of the physical phenomena that should be forecast. Under these circumstances several metrics fail to single out trivial cases, such as persistence. In this work we explore metrics and approaches for short lead time forecasts. We apply these to current Kp and Dst forecast models. This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637302.

  5. Distributed Issues for Ada Real-Time Systems

    Science.gov (United States)

    1990-07-23

    Distributed Issues for Ada Real-Time Systems (contract MDA 903-87-C-0056; author Thomas E. Griest) ... considerations. Adding to the problem of distributed real-time systems is the issue of maintaining a common sense of time among all of the processors ... because someone is waiting for the final output of a very large set of computations. However, in real-time systems, consistent meeting of short-term

  6. Ultrasonic divergent-beam scanner for time-of-flight tomography with computer evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Glover, G H

    1978-03-02

    The rotatable ultrasonic divergent-beam scanner is designed for time-of-flight tomography with computer evaluation. It can measure parameters that are of importance for the structure of soft tissues, e.g. the time of flight as a function of the velocity distribution along a certain propagation path (the method is analogous to transaxial X-ray tomography). Moreover, it permits quantitative measurement of two-dimensional velocity distributions and may therefore be applied to serial examinations for detecting cancer of the breast. Suitable computers include digital storage systems as well as analog-digital hybrid systems.

  7. Computing moment to moment BOLD activation for real-time neurofeedback

    Science.gov (United States)

    Hinds, Oliver; Ghosh, Satrajit; Thompson, Todd W.; Yoo, Julie J.; Whitfield-Gabrieli, Susan; Triantafyllou, Christina; Gabrieli, John D.E.

    2013-01-01

    Estimating moment to moment changes in blood oxygenation level dependent (BOLD) activation levels from functional magnetic resonance imaging (fMRI) data has applications for learned regulation of regional activation, brain state monitoring, and brain-machine interfaces. In each of these contexts, accurate estimation of the BOLD signal in as little time as possible is desired. This is a challenging problem due to the low signal-to-noise ratio of fMRI data. Previous methods for real-time fMRI analysis have either sacrificed the ability to compute moment to moment activation changes by averaging several acquisitions into a single activation estimate or have sacrificed accuracy by failing to account for prominent sources of noise in the fMRI signal. Here we present a new method for computing the amount of activation present in a single fMRI acquisition that separates moment to moment changes in the fMRI signal intensity attributable to neural sources from those due to noise, resulting in a feedback signal more reflective of neural activation. This method computes an incremental general linear model fit to the fMRI timeseries, which is used to calculate the expected signal intensity at each new acquisition. The difference between the measured intensity and the expected intensity is scaled by the variance of the estimator in order to transform this residual difference into a statistic. Both synthetic and real data were used to validate this method and compare it to the only other published real-time fMRI method. PMID:20682350
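    The incremental GLM idea, updating the fit one acquisition at a time and converting each new residual into a variance-scaled statistic, can be sketched with a recursive least-squares update. This is a hedged illustration of the general technique, not the authors' exact estimator; the class name, regressors, and noise model below are assumptions:

```python
import numpy as np

class IncrementalGLM:
    """Recursive least-squares fit of y_t ~ x_t @ beta, updated one
    acquisition at a time; each new sample yields a variance-scaled
    residual that can serve as a moment-to-moment feedback statistic."""
    def __init__(self, p, delta=1e3):
        self.beta = np.zeros(p)
        self.P = delta * np.eye(p)     # ~ inverse of X^T X (diffuse prior)
        self.sse = 0.0
        self.n = 0

    def update(self, x, y):
        pred = x @ self.beta           # expected intensity from the model so far
        resid = y - pred
        denom = 1.0 + x @ self.P @ x   # prediction variance factor
        k = self.P @ x / denom         # Kalman-style gain
        self.beta = self.beta + k * resid
        self.P = self.P - np.outer(k, x @ self.P)
        self.n += 1
        self.sse += resid ** 2 / denom
        sigma = np.sqrt(self.sse / max(self.n - len(x), 1))
        return resid / (sigma * np.sqrt(denom))   # scaled residual statistic

# Toy fMRI-like series: baseline plus slow drift regressor, plus noise.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(300), np.linspace(0, 1, 300)])
y = X @ np.array([10.0, 2.0]) + 0.1 * rng.normal(size=300)
model = IncrementalGLM(p=2)
stats = [model.update(x, yt) for x, yt in zip(X, y)]
```

    A real design matrix would also carry task regressors and nuisance terms; the point of the sketch is that the per-acquisition cost is O(p²), so the statistic is available as soon as each volume arrives.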

  8. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  9. Increasing Short-Stay Unplanned Hospital Admissions among Children in England; Time Trends Analysis ’97–‘06

    Science.gov (United States)

    Saxena, Sonia; Bottle, Alex; Gilbert, Ruth; Sharland, Mike

    2009-01-01

    Background Timely care by general practitioners in the community keeps children out of hospital and provides better continuity of care. Yet in the UK, access to primary care has diminished since 2004 when changes in general practitioners' contracts enabled them to 'opt out' of providing out-of-hours care and since then unplanned pediatric hospital admission rates have escalated, particularly through emergency departments. We hypothesised that any increase in isolated short stay admissions for childhood illness might reflect failure to manage these cases in the community over a 10 year period spanning these changes. Methods and Findings We conducted a population based time trends study of major causes of unplanned hospital admission in children, distinguishing isolated short stay admissions (<2 days) from admissions lasting ≥2 days. By 2006, 67.3% of all unplanned admissions were isolated short stays <2 days. The increases in admission rates were greater for common non-infectious than infectious causes of admissions. Conclusions Short stay unplanned hospital admission rates in young children in England have increased substantially in recent years and are not accounted for by reductions in length of in-hospital stay. The majority are isolated short stay admissions for minor illness episodes that could be better managed by primary care in the community and may be evidence of a failure of primary care services. PMID:19829695

  10. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  11. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed Tc) obtained by applying double-precision computation to a variable-parameters logistic map (VPLM). Firstly, using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a time-dependent, non-stationary-parameters VPLM and then calculate the mean Tc. The results indicate that, for each different initial value, the Tc values of the VPLM are generally different. However, the mean Tc tends to a constant value when the sample number is large enough. The maximum, minimum, and probable distribution functions of Tc are also obtained, which can help us to identify the robustness of applying nonlinear time series theory to forecasting using the VPLM output. In addition, the Tc of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this Tc matches the value predicted by the theoretical formula.
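    The notion of a reliable computation time can be made concrete by iterating the same map in double precision and in much higher precision, and recording the first step at which the two trajectories disagree by more than a tolerance. A hedged Python sketch using a fixed-parameter logistic map x → r·x·(1−x) (the parameters, tolerance, and precision are illustrative; the paper's VPLM varies r in time):

```python
from decimal import Decimal, getcontext

def reliable_time(x0=0.1, r=4.0, tol=1e-3, prec=80, max_steps=1000):
    """First iteration at which the float64 logistic trajectory diverges
    by more than `tol` from an 80-digit high-precision reference."""
    getcontext().prec = prec
    xf = x0                      # float64 trajectory
    xd = Decimal(repr(x0))       # high-precision reference trajectory
    rd = Decimal(repr(r))
    for n in range(1, max_steps + 1):
        xf = r * xf * (1.0 - xf)
        xd = rd * xd * (1 - xd)
        if abs(Decimal(repr(xf)) - xd) > Decimal(repr(tol)):
            return n
    return max_steps

tc = reliable_time()
```

    Since the fully chaotic logistic map roughly doubles small errors each step, a ~1e-16 rounding error should reach the 1e-3 tolerance after a few dozen iterations, which is the kind of Tc behavior the record describes.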

  12. The DIII-D Computing Environment: Characteristics and Recent Changes

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1999-01-01

    The DIII-D tokamak national fusion research facility along with its predecessor Doublet III has been operating for over 21 years. The DIII-D computing environment consists of real-time systems controlling the tokamak, heating systems, and diagnostics, and systems acquiring experimental data from instrumentation; major data analysis server nodes performing short term and long term data access and data analysis; and systems providing mechanisms for remote collaboration and the dissemination of information over the world wide web. Computer systems for the facility have undergone incredible changes over the course of time as the computer industry has changed dramatically. Yet there are certain valuable characteristics of the DIII-D computing environment that have been developed over time and have been maintained to this day. Some of these characteristics include: continuous computer infrastructure improvements, distributed data and data access, computing platform integration, and remote collaborations. These characteristics are being carried forward as well as new characteristics resulting from recent changes which have included: a dedicated storage system and a hierarchical storage management system for raw shot data, various further infrastructure improvements including deployment of Fast Ethernet, the introduction of MDSplus, LSF and common IDL based tools, and improvements to remote collaboration capabilities. This paper will describe this computing environment, important characteristics that over the years have contributed to the success of DIII-D computing systems, and recent changes to computer systems

  13. Short-time scale coupling between thermohaline and meteorological forcing in the Ría de Pontevedra

    Directory of Open Access Journals (Sweden)

    Paula C. Pardo

    2001-07-01

    Full Text Available Two cruises were performed in May-June and October-November 1997 in the Ría de Pontevedra under strong downwelling conditions. Temperature and salinity data were recorded over short sampling periods to describe the changes in the distribution of thermohaline properties on a short time scale. In order to obtain the residual fluxes in the Ría, a bi-dimensional, non-stationary, salt and thermal-energy weighted-average box model was applied. Outputs from this kinematic model were compared with the upwelling index, river flow and density gradient, resulting in a good multiple correlation, which demonstrates the strong coupling between thermohaline properties and meteorological variability. Ekman forcing affects the whole area but mainly controls the dynamics of the outer zones. The intensity of its effect on the circulation pattern within the Ría depends on the degree of stratification of the water bodies. River flow is more relevant in the inner parts. According to the estimated spatially averaged velocities, the water residence time is less than two weeks in the outer parts of the Ría, and decreases toward the inner zones.

  14. Quantum Vertex Model for Reversible Classical Computing

    Science.gov (United States)

    Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng

    We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic, with one direction corresponding to computational time and with transverse boundaries storing the computation's input and output. The model displays no finite-temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, 'annealing with learning', to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.

  15. Time perspective in hereditary cancer: psychometric properties of a short form of the Zimbardo Time Perspective Inventory in a community and clinical sample.

    Science.gov (United States)

    Wakefield, Claire E; Homewood, Judi; Taylor, Alan; Mahmut, Mehmet; Meiser, Bettina

    2010-10-01

    We aimed to assess the psychometric properties of a 25-item short form of the Zimbardo Time Perspective Inventory in a community sample (N = 276) and in individuals with a strong family history of cancer, considering genetic testing for cancer risk (N = 338). In the community sample, individuals with high past-negative or present-fatalistic scores had higher levels of distress, as measured by depression, anxiety, and aggression. Similarly, in the patient sample, past-negative time perspective was positively correlated with distress, uncertainty, and postdecision regret when making a decision about genetic testing. Past-negative-oriented individuals were also more likely to be undecided about, or against, genetic testing. Hedonism was associated with being less likely to read the educational materials they received at their clinic, and fatalism was associated with having lower knowledge levels about genetic testing. The assessment of time perspective in individuals at increased risk of cancer can provide valuable clinical insights. However, further investigation of the psychometric properties of the short form of this scale is warranted, as it did not meet the currently accepted criteria for psychometric validation studies.

  16. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal within nearly 22 ns time intervals, records the result in a memory unit, and then slowly reads the information out to a computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  17. 78 FR 38949 - Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response

    Science.gov (United States)

    2013-06-28

    ... exposed to various forms of cyber attack. In some cases, attacks can be thwarted through the use of...-3383-01] Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response... systems will be successfully attacked. When a successful attack occurs, the job of a Computer Security...

  18. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as time series grow longer in the real world, the data often contain many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.

  19. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    Science.gov (United States)

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, as time series grow longer in the real world, the data often contain many outliers. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
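
A minimal sketch of such an update rule, written as a tweak on top of a plain Adam step; the clipping bounds, beta3 decay, and toy quadratic objective below are illustrative assumptions, not the authors' exact implementation:

```python
import math

def roadam_step(theta, grad, state, base_lr=0.05,
                beta1=0.9, beta2=0.999, beta3=0.999, eps=1e-8):
    """One RoAdam-like update: a plain Adam step whose learning rate is
    divided by r, a weighted average of the relative prediction error of
    the loss, so that a loss spike (a likely outlier) shrinks the step."""
    state["t"] += 1
    t = state["t"]
    # relative prediction error |loss_t / loss_{t-1}|, clipped to tame spikes
    rel = abs(state["loss"]) / max(abs(state["prev_loss"]), eps)
    rel = min(max(rel, 0.1), 10.0)          # clipping bounds are illustrative
    state["r"] = beta3 * state["r"] + (1 - beta3) * rel
    # standard Adam first/second moment estimates with bias correction
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)
    lr = base_lr / state["r"]               # large tracked error -> small rate
    return theta - lr * m_hat / (math.sqrt(v_hat) + eps)

# toy usage: recover theta* = 3 from a noiseless squared loss
theta, target = 0.0, 3.0
state = {"t": 0, "m": 0.0, "v": 0.0, "r": 1.0, "loss": 1.0, "prev_loss": 1.0}
for _ in range(2000):
    loss = (theta - target) ** 2
    state["prev_loss"], state["loss"] = state["loss"], loss
    theta = roadam_step(theta, 2 * (theta - target), state)
```

On real streams, `loss` would be the one-step prediction error of the LSTM, and an outlier makes `rel` jump, temporarily damping the update.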

  20. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

The program CROSAT computes directly from two discrete time series auto- and cross-spectra, transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
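
The quantities CROSAT produces can be sketched in a few lines with segment-averaged FFTs (a Welch-style estimate standing in for the program's statistics; the segmenting choice is an assumption of this sketch):

```python
import numpy as np

def cross_spectra(x, y, nseg=8):
    """Segment-averaged auto-/cross-spectra, transfer function and
    coherence of two equally long discrete time series."""
    n = len(x) // nseg
    X = np.array([np.fft.rfft(x[i*n:(i+1)*n]) for i in range(nseg)])
    Y = np.array([np.fft.rfft(y[i*n:(i+1)*n]) for i in range(nseg)])
    sxx = np.mean(np.abs(X) ** 2, axis=0)      # auto-spectrum of x
    syy = np.mean(np.abs(Y) ** 2, axis=0)      # auto-spectrum of y
    sxy = np.mean(np.conj(X) * Y, axis=0)      # cross-spectrum
    h = sxy / sxx                              # transfer function x -> y
    coh = np.abs(sxy) ** 2 / (sxx * syy)       # coherence, in [0, 1]
    return sxx, syy, sxy, h, coh

# toy check: y is x through a gain-2 "plant", so |h| = 2 and coherence = 1
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
sxx, syy, sxy, h, coh = cross_spectra(x, 2.0 * x)
```

Without the segment averaging, the coherence estimate is trivially 1 at every frequency, which is why averaging (or an equivalent smoothing) is essential in programs of this kind.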

  1. Flow around an oscillating cylinder: computational issues

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Fengjian; Gallardo, José P; Pettersen, Bjørnar [Department of Marine Technology, Norwegian University of Science and Technology, NO-7491 Trondheim (Norway); Andersson, Helge I, E-mail: fengjian.jiang@ntnu.no [Department of Energy and Process Engineering, Norwegian University of Science and Technology, NO-7491 Trondheim (Norway)

    2017-10-15

    We consider different computational issues related to the three-dimensionalities of the flow around an oscillating circular cylinder. The full time-dependent Navier–Stokes equations are directly solved in a moving reference frame by introducing a forcing term. The choice of quantitative validation criteria is discussed and discrepancies of previously published results are addressed. The development of Honji vortices shows that short simulation times may lead to incorrect quasi-stable vortex patterns. The viscous decay of already established Honji vortices is also examined. (paper)

  2. A Unified Framework for Estimating Minimum Detectable Effects for Comparative Short Interrupted Time Series Designs

    Science.gov (United States)

    Price, Cristofer; Unlu, Fatih

    2014-01-01

The Comparative Short Interrupted Time Series (C-SITS) design is a frequently employed quasi-experimental method, in which the pre- and post-intervention changes observed in the outcome levels of a treatment group are compared with those of a comparison group, and the difference between the former and the latter is attributed to the treatment. The…

  3. Computational Bench Testing to Evaluate the Short-Term Mechanical Performance of a Polymeric Stent.

    Science.gov (United States)

    Bobel, A C; Petisco, S; Sarasua, J R; Wang, W; McHugh, P E

    2015-12-01

Over the last decade, there has been a significant volume of research focussed on the utilization of biodegradable polymers such as poly-L-lactide-acid (PLLA) for applications associated with cardiovascular disease. More specifically, there has been an emphasis on addressing the clinical shortfalls experienced with conventional bare metal stents and drug eluting stents. One such approach, the adoption of fully formed polymeric stents, has led to a small number of products being commercialized. Unfortunately, these products are still in their market infancy, meaning there is a clear absence of long-term data which can support their mechanical performance in vivo. Moreover, the load carrying capacity and other mechanical properties essential to a fully optimized polymeric stent are difficult, time-consuming, and costly to establish. With the aim of compiling rapid and representative performance data for specific stent geometries, materials, and designs, in addition to reducing experimental timeframes, computational bench testing via finite element analysis (FEA) offers itself as a very powerful tool. On this basis, the research presented in this paper is concentrated on the finite element simulation of the mechanical performance of PLLA, which is a fully biodegradable polymer, in the stent application, using a non-linear viscous material model. Three physical stent geometries, typically used for fully polymeric stents, are selected, and a comparative study is performed in relation to their short-term mechanical performance, with the aid of experimental data. From the simulated output results, an informed understanding can be established in relation to radial strength, flexibility, and longitudinal resistance that can be compared with conventional permanent metal stent functionality, and the results show that it is indeed possible to generate a PLLA stent with comparable and sufficient mechanical performance.
The paper also demonstrates the attractiveness of FEA as a tool

  4. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    Science.gov (United States)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning the models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, which is guided by a combination of short (tests have been found to be effective in numerous previous studies in identifying model biases due to parameterized fast physics, and we demonstrate that it is also useful for tuning. After the most egregious errors are addressed through an initial "rough" tuning phase, longer simulations are performed to "hone in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. The use of CAPT hindcasts to find parameter choice for the reduction of large model biases dramatically improves the turnaround time for the tuning at high resolution. Improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former to survey the likely responses and narrow the parameter space, and the latter to verify the results in climate context along with assessment in

  5. Effects of elevated CO₂ on time of flowering in four short-day and four long-day species

    Energy Technology Data Exchange (ETDEWEB)

    Reekie, J.Y.C.; Hicklenton, P.R. (Agriculture Canada Research Station, Kentiville, NS (Canada)); Reekie, E.G. (Acadia Univ., Wolfville, NS (Canada))

    1994-01-01

A study was undertaken to determine if the effect of elevated CO₂ on flowering phenology is a function of the photoperiodic response of the species involved. Four long-day plants, Achillea millefolium, Callistephus chinensis, Campanula isophylla, and Trachelium caeruleum, and four short-day plants, Dendranthema grandiflora, Kalanchoe blossfeldiana, Pharbitis nil, and Xanthium pensylvanicum, were grown under inductive photoperiods (9 h for short day and 17 h for long day) at either 350 or 1000 µl/l CO₂. Time of visible flower bud formation, flower opening, and final plant biomass were assessed. Elevated CO₂ advanced flower opening in all four long-day species and delayed flowering in all four short-day species. In the long-day species, the effect of CO₂ was primarily on bud initiation; all four species formed buds earlier at high CO₂. Bud development, the difference in time between flower opening and bud initiation, was advanced in only one long-day species, Callistephus chinensis. Mixed results were obtained for the short-day species. Elevated CO₂ exerted no effects on bud initiation but delayed bud development in Dendranthema and Kalanchoe. In Xanthium, bud initiation rather than bud development was delayed. Data on bud initiation and development were not obtained for Pharbitis. The negative effect of CO₂ upon phenology in the short-day species was not associated with negative effects on growth. Elevated CO₂ increased plant size in both long-day and short-day species. 26 refs., 4 tabs.

  6. Short-range correlations in an extended time-dependent mean-field theory

    International Nuclear Information System (INIS)

    Madler, P.

    1982-01-01

A generalization is performed of the time-dependent mean-field theory by an explicit inclusion of strong short-range correlations on a level of microscopic reversibility relating them to realistic nucleon-nucleon forces. Invoking a least action principle for correlated trial wave functions, equations of motion for the correlation functions and the single-particle model wave function are derived in lowest order of the FAHT cluster expansion. Higher order effects as well as long-range correlations are considered only to the extent to which they contribute to the mean field via a readjusted phenomenological effective two-body interaction. The corresponding correlated stationary problem is investigated and appropriate initial conditions to describe a heavy ion reaction are proposed. The single-particle density matrix is evaluated

  7. SOAP2: an improved ultrafast tool for short read alignment

    DEFF Research Database (Denmark)

    Li, Ruiqiang; Yu, Chang; Li, Yingrui

    2009-01-01

SUMMARY: SOAP2 is a significantly improved version of the short oligonucleotide alignment program that both reduces computer memory usage and increases alignment speed at an unprecedented rate. We used a Burrows-Wheeler Transformation (BWT) compression index to substitute the seed strategy for indexing the reference sequence in the main memory. We tested it on the whole human genome and found that this new algorithm reduced memory usage from 14.7 to 5.4 GB and improved alignment speed by 20-30 times. SOAP2 is compatible with both single- and paired-end reads. Additionally, this tool now supports multiple text and compressed file formats. A consensus builder has also been developed for consensus assembly and SNP detection from alignment of short reads on a reference genome. AVAILABILITY: http://soap.genomics.org.cn.
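
The BWT-based indexing idea at the core of SOAP2 can be illustrated with a toy FM-index; this sketch counts exact matches and ignores SOAP2's actual compressed data layout, mismatch handling, and paired-end logic:

```python
from collections import Counter

def bwt_index(text):
    """Toy BWT/FM index: suffix array, BWT column, C table and rank table."""
    text += "$"                                  # sentinel, lexicographically smallest
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = [text[i - 1] for i in sa]              # character preceding each suffix
    # c_table[ch] = number of characters in text strictly smaller than ch
    counts = Counter(text)
    c_table, total = {}, 0
    for ch in sorted(counts):
        c_table[ch] = total
        total += counts[ch]
    # occ[k][ch] = occurrences of ch in bwt[:k] (a rank table; real FM-indexes
    # store this compressed, which is where the memory saving comes from)
    occ = [dict.fromkeys(counts, 0)]
    for ch in bwt:
        row = dict(occ[-1])
        row[ch] += 1
        occ.append(row)
    return sa, bwt, c_table, occ

def count_matches(pattern, index):
    """FM-index backward search: number of exact occurrences of pattern."""
    sa, bwt, c, occ = index
    lo, hi = 0, len(bwt)
    for ch in reversed(pattern):
        if ch not in c:
            return 0
        lo = c[ch] + occ[lo][ch]
        hi = c[ch] + occ[hi][ch]
        if lo >= hi:
            return 0
    return hi - lo

idx = bwt_index("banana")
```

Backward search touches one pattern character per step regardless of reference length, which is why BWT indexes replaced seed tables for whole-genome alignment.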

  8. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    Science.gov (United States)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms are proposed for computing this time-optimal consensus point, the control law to be used by each agent, and the time taken for the consensus to occur. Two types of multi-agent systems are considered, namely (1) coupled single-integrator agents on a plane and (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm using convexity of attainable sets and Helly's theorem is proposed to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelised amongst the agents such that each agent has to perform computations of O(N²) run time complexity. Finally, local feedback time-optimal control laws are synthesised to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.
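
For the planar single-integrator case, the flavor of the computation can be illustrated as follows: each agent's attainable set at time T is a disk of radius v_max·T around its initial position, so the disks first share a point when T equals R/v_max, where R is the radius of the minimum enclosing ball of the initial positions. The sketch below approximates R with the Badoiu-Clarkson iteration, an illustrative stand-in rather than the paper's parallelised algorithm:

```python
import math

def min_consensus_time(points, vmax=1.0, iters=2000):
    """Minimum meeting time for single-integrator agents x_i' = u_i with
    |u_i| <= vmax: T = R / vmax, where R is the radius of the minimum
    enclosing ball of the initial positions (the intersection argument is
    where convexity and Helly's theorem enter). R is approximated by the
    Badoiu-Clarkson iteration."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    for k in range(1, iters + 1):
        # step a shrinking amount toward the current farthest point
        fx, fy = max(points, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
        cx += (fx - cx) / (k + 1)
        cy += (fy - cy) / (k + 1)
    radius = max(math.hypot(p[0] - cx, p[1] - cy) for p in points)
    return radius / vmax, (cx, cy)

# four agents on the corners of a 2x2 square meet at the center
t_min, center = min_consensus_time([(0.0, 0.0), (0.0, 2.0), (2.0, 0.0), (2.0, 2.0)])
```

Here the consensus point is the square's center (1, 1) and the minimum time is sqrt(2)/v_max, the distance from the center to a corner.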

  9. Gamma spectrometric characterization of short cooling time nuclear spent fuels using hemispheric CdZnTe detectors

    CERN Document Server

    Lebrun, A; Szabó, J L; Arenas-Carrasco, J; Arlt, R; Dubreuil, A; Esmailpur-Kazerouni, K

    2000-01-01

After years of cooling, nuclear spent fuel gamma emissions are mainly due to caesium isotopes, which are emitters at 605, 662 and 796-801 keV. Extensive work has been done on such fuels using various CdTe or CdZnTe probes. When fuels have to be measured after a short cooling time (during an NPP outage), the spectrum is much more complex due to the important contributions of niobium and zirconium in the 700 keV range. For the first time in a nuclear power plant, four spent fuels of the Kozloduy VVER reactor No. 4 were measured during outage, 37 days after shutdown of the reactor. In such conditions, good resolution is of particular interest, so a 20 mm³ hemispheric crystal was used with a resolution better than 7 keV at 662 keV. This paper presents the experimental device and analyzes the results, which show that commercially available CdZnTe detectors enabled us to perform a semi-quantitative determination of the burn-up after a short cooling time. In addition, it is discussed how a burn-up evolution code (CESAR)...

  10. On some methods for improving time of reachability sets computation for the dynamic system control problem

    Science.gov (United States)

    Zimovets, Artem; Matviychuk, Alexander; Ushakov, Vladimir

    2016-12-01

The paper presents two different approaches to reducing the computation time of reachability sets. The first approach uses different data structures for storing the reachability sets in computer memory for calculation in single-threaded mode. The second approach is based on parallel algorithms operating on the data structures from the first approach. Within the framework of this paper, a parallel algorithm for approximate reachability set calculation on a computer with SMP architecture is proposed. The results of numerical modelling are presented in the form of tables which demonstrate the high efficiency of parallel computing technology and also show how computing time depends on the data structure used.
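
A toy version of the single-threaded computation, using a hash set to store the reachable cells (the kind of data-structure choice whose effect on computing time the paper measures; the grid dynamics below are an illustrative stand-in for the paper's system):

```python
def reach_step(cells, controls, h=1):
    """One time step of a grid-based approximate reachable-set computation:
    every stored cell spawns its successors under each admissible control.
    A hash set gives O(1) membership tests and free de-duplication."""
    nxt = set()
    for (x, y) in cells:
        for (ux, uy) in controls:
            nxt.add((x + h * ux, y + h * uy))
    return nxt

# single integrator on an integer grid: unit-speed controls plus "stay"
controls = [(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)]
reach = {(0, 0)}
for _ in range(3):
    reach = reach_step(reach, controls)
```

After three steps the set is the Manhattan ball of radius 3 (25 cells); swapping the set for a sorted array or bitmap changes only the storage layer, which is exactly the comparison the paper's tables report.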

  11. A computationally simple and robust method to detect determinism in a time series

    DEFF Research Database (Denmark)

    Lu, Sheng; Ju, Ki Hwan; Kanters, Jørgen K.

    2006-01-01

We present a new, simple, and fast computational technique, termed the incremental slope (IS), that can accurately distinguish deterministic from stochastic systems even when the variance of the noise is as large as or greater than the signal, and remains robust for time-varying signals.

  12. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

Image restoration algorithms based on time-frequency domain computation (TFDC) are mature and widely applied in engineering. To enable high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. Firstly, the main modules are designed by analyzing the processing and numerical calculations the algorithms have in common. Then, to improve generality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, the necessary optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and the complex-number calculations. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The results show that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm commonality, hardware realizability, and high efficiency.
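
A representative TFDC restoration kernel is frequency-domain Wiener deconvolution; the 2-D FFT/IFFT and elementwise complex arithmetic below are exactly the operations such a hardware architecture would accelerate. The regularization constant and the origin-centered PSF convention are assumptions of this sketch:

```python
import numpy as np

def wiener_restore(blurred, psf, k=1e-3):
    """Frequency-domain image restoration with a Wiener filter. The psf is
    assumed stored with its peak at index (0, 0) (circular convention);
    k > 0 regularizes frequencies where the psf response vanishes."""
    H = np.fft.fft2(psf, s=blurred.shape)        # blur transfer function
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + k)        # Wiener filter, frequency domain
    return np.real(np.fft.ifft2(W * G))

# toy round trip: blur a random image with a small 3-tap kernel, then restore
rng = np.random.default_rng(1)
img = rng.random((32, 32))
psf = np.zeros((32, 32))
psf[0, 0] = psf[0, 1] = psf[1, 0] = 1 / 3
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
restored = wiener_restore(blurred, psf, k=1e-6)
```

Each restoration is two forward FFTs, one inverse FFT, and a per-pixel complex multiply-divide, which is why fixed-function FFT blocks dominate such a design.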

  13. Real-time systems scheduling 2 focuses

    CERN Document Server

    Chetto, Maryline

    2014-01-01

    Real-time systems are used in a wide range of applications, including control, sensing, multimedia, etc. Scheduling is a central problem for these computing/communication systems since it is responsible for software execution in a timely manner. This book, the second of two volumes on the subject, brings together knowledge on specific topics and discusses the recent advances for some of them.  It addresses foundations as well as the latest advances and findings in real-time scheduling, giving comprehensive references to important papers, but the chapters are short and not overloaded with co

  14. Integrated computation model of lithium-ion battery subject to nail penetration

    International Nuclear Information System (INIS)

    Liu, Binghe; Yin, Sha; Xu, Jun

    2016-01-01

Highlights: • A coupled model to predict the battery penetration process is established. • A penetration test is designed and validates the computational model. • Governing factors of the penetration-induced short circuit are discussed. • Critical battery safety design guidance is suggested. - Abstract: Nail penetration of lithium-ion batteries (LIBs) has become a standard battery safety evaluation method to mimic the potential penetration of a foreign object into an LIB, which can lead to internal short circuit with catastrophic consequences, such as thermal runaway, fire, and explosion. To provide a safe, time-efficient, and cost-effective method for studying the nail penetration problem, an integrated computational method that considers the mechanical, electrochemical, and thermal behaviors of the jellyroll was developed using a coupled 3D mechanical model, a 1D battery model, and a short circuit model. The integrated model, along with the sub-models, was validated to agree reasonably well with experimental test data. In addition, a comprehensive quantitative analysis of governing factors, e.g., shapes, sizes, and displacements of nails, states of charge, and penetration speeds, was conducted. The proposed computational framework for LIB nail penetration provides an accurate prediction of the time history of battery voltage, temperature, and mechanical behavior. The factors that affect the behavior of the jellyroll under nail penetration are discussed systematically. The results provide a solid foundation for future in-depth studies on LIB nail penetration mechanisms and safety design.

  15. Online Operation Guidance of Computer System Used in Real-Time Distance Education Environment

    Science.gov (United States)

    He, Aiguo

    2011-01-01

Computer systems are useful for improving real-time, interactive distance education activities, especially when a large number of students participate in one distance lecture together and every student uses their own computer to share teaching materials or control discussions in the virtual classroom. The problem is that within…

  16. Development of a system for real-time measurements of metabolite transport in plants using short-lived positron-emitting radiotracers

    Science.gov (United States)

    Kiser, Matthew R.

Over the past 200 years, the Earth's atmospheric carbon dioxide (CO₂) concentration has increased by more than 35%, and climate experts predict that CO₂ levels may double by the end of this century. Understanding the mechanisms of resource management in plants is fundamental for predicting how plants will respond to the increase in atmospheric CO₂. Plant productivity sustains life on Earth and is a principal component of the planet's system that regulates atmospheric CO₂ concentration. As such, one of the central goals of plant science is to understand the regulatory mechanisms of plant growth in a changing environment. Short-lived positron-emitting radiotracer techniques provide time-dependent data that are critical for developing models of metabolite transport and resource distribution in plants and their microenvironments. To better understand the effects of environmental changes on resource transport and allocation in plants, we have developed a system for real-time measurements of metabolite transport in plants using short-lived positron-emitting radiotracers. This thesis project includes the design, construction, and demonstration of the capabilities of this system for performing real-time measurements of metabolite transport in plants. The short-lived radiotracer system described in this dissertation takes advantage of the combined capabilities and close proximity of two research facilities at Duke University: the Triangle Universities Nuclear Laboratory (TUNL) and the Duke University Phytotron, which are separated by approximately 100 meters. The short-lived positron-emitting radioisotopes are generated using the 10-MV tandem Van de Graaff accelerator located in the main TUNL building, which provides the capability of producing short-lived positron-emitting isotopes such as carbon-11 (11C; 20-minute half-life), nitrogen-13 (13N; 10-minute half-life), fluorine-18 (18F; 110-minute half-life), and oxygen-15 (15O; 2-minute half-life).
The radioisotopes may

  17. Efficient computation in networks of spiking neurons: simulations and theory

    International Nuclear Information System (INIS)

    Natschlaeger, T.

    1999-01-01

One of the most prominent features of biological neural systems is that individual neurons communicate via short electrical pulses, the so-called action potentials or spikes. In this thesis we investigate possible mechanisms which can in principle explain how complex computations in spiking neural networks (SNN) can be performed very fast, i.e. within a few tens of milliseconds. Some of these models are based on the assumption that relevant information is encoded by the timing of individual spikes (temporal coding). We will also discuss a model which is based on a population code and still is able to perform fast complex computations. In their natural environment biological neural systems have to process signals with a rich temporal structure. Hence it is an interesting question how neural systems process time series. In this context we explore possible links between biophysical characteristics of single neurons (refractory behavior, connectivity, time course of postsynaptic potentials) and synapses (unreliability, dynamics) on the one hand and possible computations on time series on the other hand. Furthermore we describe a general model of computation that exploits dynamic synapses. This model provides a general framework for understanding how neural systems process time-varying signals. (author)
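
The standard minimal model behind such networks is the leaky integrate-and-fire neuron, which makes the temporal-coding idea concrete: the timing of threshold crossings carries the information. The parameters below are illustrative, not values from the thesis:

```python
def lif_spike_times(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron. Returns the times (in ms) at which
    the membrane potential crosses threshold and a spike is emitted."""
    v, spikes = 0.0, []
    for step, i_in in enumerate(input_current):
        # forward Euler step of   tau * dv/dt = -v + I(t)
        v += dt / tau * (-v + i_in)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset            # reset after the spike
    return spikes

# constant suprathreshold drive -> regular spiking; the first-spike latency
# and inter-spike interval both encode the input strength
spikes = lif_spike_times([1.5] * 200)
```

Stronger drive shortens the first-spike latency, which is the basis of the fast time-to-first-spike codes discussed in the thesis.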

  18. A real-time computational model for estimating kinematics of ankle ligaments.

    Science.gov (United States)

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Sheng Quan

    2016-01-01

An accurate assessment of ankle ligament kinematics is crucial in understanding injury mechanisms and can help to improve the treatment of an injured ankle, especially when used in conjunction with robot-assisted therapy. A number of computational models have been developed and validated for assessing the kinematics of ankle ligaments. However, few of them can perform real-time assessment to allow for an input into robotic rehabilitation programs. An ankle computational model was proposed and validated to quantify the kinematics of ankle ligaments as the foot moves in real time. This model consists of three bone segments with three rotational degrees of freedom (DOFs) and 12 ankle ligaments. The model takes as inputs three position variables that can be measured by sensors in many ankle robotic devices that detect postures within the foot-ankle environment, and outputs the kinematics of the ankle ligaments. Validation of this model in terms of ligament length and strain was conducted by comparing it with published data on cadaver anatomy and magnetic resonance imaging. The ligament lengths and strains predicted by the model are in concurrence with those from the published studies but are sensitive to ligament attachment positions. This ankle computational model has the potential to be used in robot-assisted therapy for real-time assessment of ligament kinematics. The results provide information regarding the quantification of kinematics associated with ankle ligaments related to the disability level and can be used for optimizing the robotic training trajectory.
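
The core geometric computation such a model performs can be sketched as follows: rotate a ligament's insertion point by the measured ankle angles, then report length and engineering strain. The attachment coordinates and the rotation order here are illustrative assumptions, not the paper's anatomical data:

```python
import math

def ligament_strain(origin, insertion, angles):
    """Rotate the insertion point (on the moving bone) by the three ankle
    angles and return the ligament's (length, engineering strain).
    origin stays on the fixed bone; angles = (rz, ry, rx) in radians,
    applied about z, then y, then x (an assumed convention)."""
    rz, ry, rx = angles
    x, y, z = insertion
    x, y = x*math.cos(rz) - y*math.sin(rz), x*math.sin(rz) + y*math.cos(rz)
    x, z = x*math.cos(ry) + z*math.sin(ry), -x*math.sin(ry) + z*math.cos(ry)
    y, z = y*math.cos(rx) - z*math.sin(rx), y*math.sin(rx) + z*math.cos(rx)
    l0 = math.dist(origin, insertion)        # resting length
    l = math.dist(origin, (x, y, z))         # length in the new posture
    return l, (l - l0) / l0

# hypothetical attachments (meters) and a 0.3 rad rotation about y
length, strain = ligament_strain((0.0, 0.0, 0.05), (0.02, 0.0, 0.0), (0.0, 0.3, 0.0))
```

Running twelve such length/strain evaluations per sensor sample is cheap, which is what makes real-time use in a rehabilitation robot plausible.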

  19. Wake force computation in the time domain for long structures

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-07-01

One is often interested in calculating the wake potentials for short bunches in long structures using TBCI. For ultra-relativistic particles it is sufficient to solve for the fields only over a window containing the bunch and moving along with it. This technique reduces both the memory and the running time required by a factor that equals the ratio of the structure length to the window length. For example, for a bunch with σ_z of one picosecond traversing a single SLAC cell this improvement factor is 15. It is thus possible to solve for the wakefields in very long structures: for a given problem, increasing the structure length will not change the memory required while only adding linearly to the CPU time needed

  20. Real time simulation of large systems on mini-computer

    International Nuclear Information System (INIS)

    Nakhle, Michel; Roux, Pierre.

    1979-01-01

Most simulation languages will only accept an explicit formulation of differential equations, and logical variables hold no special status therein. The step size of the usual integration methods is limited by the smallest time constant of the model submitted. The NEPTUNIX 2 simulation software has a language that will take implicit equations and an integration method whose variable step size is not limited by the time constants of the model. This, together with extensive optimization of the time and memory resources of the generated code, makes NEPTUNIX 2 a basic tool for simulation on mini-computers. Since the logical variables are specific entities under centralized control, correct processing of discontinuities and synchronization with a real process are feasible. NEPTUNIX 2 is the industrial version of NEPTUNIX 1 [fr
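
The key property claimed above, an implicit integration method whose step size is not limited by the model's smallest time constant, can be demonstrated with a backward Euler sketch on a stiff equation; NEPTUNIX 2's actual variable-step method is of course more elaborate:

```python
def backward_euler(f, dfdy, y0, t0, t1, h):
    """Implicit (backward Euler) integration: each step solves the implicit
    equation y_next = y + h*f(t_next, y_next) by Newton iteration, so the
    step size h is not bounded by the system's fastest time constant."""
    t, y = t0, y0
    while t < t1 - 1e-12:
        t_next = min(t + h, t1)
        y_next = y
        for _ in range(50):
            g = y_next - y - (t_next - t) * f(t_next, y_next)
            dg = 1.0 - (t_next - t) * dfdy(t_next, y_next)
            step = g / dg
            y_next -= step
            if abs(step) < 1e-12:
                break
        t, y = t_next, y_next
    return y

# stiff test problem: y' = -1000*(y - 1). The time constant is 1 ms, so
# explicit Euler would need h < 0.002 to stay stable; here h = 0.1 works.
y_end = backward_euler(lambda t, y: -1000.0 * (y - 1.0),
                       lambda t, y: -1000.0, y0=0.0, t0=0.0, t1=1.0, h=0.1)
```

The same Newton machinery is what lets an implicit-equation language accept models where y' never appears isolated on the left-hand side.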

  1. Real-time computing in environmental monitoring of a nuclear power plant

    International Nuclear Information System (INIS)

    Deme, S.; Lang, E.; Nagy, Gy.

    1987-06-01

A real-time computing method is described for calculating the environmental radiation exposure due to a nuclear power plant during both normal operation and accidents. The effects of the Gaussian plume are recalculated every ten minutes based on meteorological parameters measured at heights of 20 and 120 m as well as on emission data. During normal operation the quantity of radioactive materials released through the stacks is measured and registered, whereas in an accident the source strength is unknown and the calculated relative data are normalized to the values measured at the eight environmental monitoring stations. The doses due to noble gases and to dry and wet deposition, as well as the time integral of the 131I concentration, are calculated and stored by a professional personal computer for 720 points within an 11 km radius of the plant. (author)
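
The ten-minute recalculation described above evaluates the standard ground-reflecting Gaussian plume formula. In this sketch the dispersion parameters σy and σz, which in practice depend on downwind distance and the measured stability class, are passed in directly:

```python
import math

def plume_concentration(q, u, y, z, h_stack, sigma_y, sigma_z):
    """Ground-reflecting Gaussian plume concentration at crosswind offset y
    and height z. q: source strength (Bq/s), u: wind speed (m/s),
    h_stack: effective release height (m). Returns Bq/m^3."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h_stack) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h_stack) ** 2 / (2 * sigma_z ** 2)))  # mirror source
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# centerline, ground-level value for an illustrative release
c0 = plume_concentration(1e9, 5.0, 0.0, 0.0, 120.0, 50.0, 25.0)
```

Evaluating this expression for 720 grid points every ten minutes is a trivial load even for the personal computer mentioned in the abstract, which is what makes the real-time scheme practical.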

  2. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    Science.gov (United States)

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.

  3. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Junghoon Lee

    2011-03-01

    Full Text Available Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure itself. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, and the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.

  4. Microprocessor-controlled facility for I.N.A.A. using short half-life nuclides

    International Nuclear Information System (INIS)

    Bode, P.; Korthoven, P.J.M.; Bruin, M. de

    1986-01-01

    At IRI a new, fully automated facility for short half-life INAA is being developed and installed at the Institute's 2 MW reactor. The fast rabbit transfer system is constructed only of plastic and carbon-fiber parts, so that rabbit contamination is minimized. The system is automated in such a way that it can operate safely without direct supervision; the sequence of irradiations and measurements is optimized by a computer program for a given set of samples and analysis procedures. The rabbit system is controlled by an Apple IIe computer connected to the central PDP 11/44 system of the Radiochemistry department. For a given set of samples and required analysis procedures (irradiation, decay, and measurement times) the central computer calculates an optimal sequence of individual actions (transfer from and to the reactor, to sample storage, or to a detector) to be carried out by the system. This sequence is loaded into the Apple computer as a series of commands together with timing information. Actual control of the procedure occurs through the peripheral computer, which makes the system independent of delays or break-downs of the central multi-user computer system. The hardware, software and operating characteristics of the fast rabbit system are discussed. (author)
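The optimization performed by the central computer is not specified in this record. For the idealized case of two sequential resources (an irradiation channel, then a detector, one sample at a time on each), Johnson's rule gives a makespan-optimal order; a sketch with hypothetical sample names and times:

```python
def johnson_sequence(jobs):
    """Johnson's rule for two-stage flow-shop scheduling.
    jobs: dict name -> (stage1_time, stage2_time), e.g. (irradiation
    time, measurement time).  Returns an order minimizing the makespan
    when each stage processes one sample at a time."""
    front, back = [], []
    # Consider jobs in increasing order of their smaller stage time.
    for name, (a, b) in sorted(jobs.items(), key=lambda kv: min(kv[1])):
        if a <= b:
            front.append(name)   # short first stage: schedule early
        else:
            back.append(name)    # short second stage: schedule late
    return front + back[::-1]

# Hypothetical samples: (irradiation minutes, measurement minutes).
order = johnson_sequence({'s1': (2, 5), 's2': (4, 1), 's3': (3, 3)})
```

The real scheduler must also respect decay times between the two stages, which turns the problem into a flow shop with time lags; Johnson's rule only covers the simplified two-stage case.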

  5. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.

  6. High-Specificity Targeted Functional Profiling in Microbial Communities with ShortBRED.

    Directory of Open Access Journals (Sweden)

    James Kaminski

    2015-12-01

    Full Text Available Profiling microbial community function from metagenomic sequencing data remains a computationally challenging problem. Mapping millions of DNA reads from such samples to reference protein databases requires long run-times, and short read lengths can result in spurious hits to unrelated proteins (loss of specificity). We developed ShortBRED (Short, Better Representative Extract Dataset) to address these challenges, facilitating fast, accurate functional profiling of metagenomic samples. ShortBRED consists of two components: (i) a method that reduces reference proteins of interest to short, highly representative amino acid sequences ("markers") and (ii) a search step that maps reads to these markers to quantify the relative abundance of their associated proteins. After evaluating ShortBRED on synthetic data, we applied it to profile antibiotic resistance protein families in the gut microbiomes of individuals from the United States, China, Malawi, and Venezuela. Our results support antibiotic resistance as a core function in the human gut microbiome, with tetracycline-resistant ribosomal protection proteins and Class A beta-lactamases being the most widely distributed resistance mechanisms worldwide. ShortBRED markers are applicable to other homology-based search tasks, which we demonstrate here by identifying phylogenetic signatures of antibiotic resistance across more than 3,000 microbial isolate genomes. ShortBRED can be applied to profile a wide variety of protein families of interest; the software, source code, and documentation are available for download at http://huttenhower.sph.harvard.edu/shortbred.

  7. Vision-based online vibration estimation of the in-vessel inspection flexible robot with short-time Fourier transformation

    International Nuclear Information System (INIS)

    Wang, Hesheng; Chen, Weidong; Xu, Lifei; He, Tao

    2015-01-01

    Highlights: • A vision-based online vibration estimation method for a flexible arm is proposed. • The vibration signal is obtained by image processing in unknown environments. • Vibration parameters are estimated by short-time Fourier transformation. - Abstract: Vibration arising during the motion of a flexible robot, or under external disturbance caused by its structural features and material properties, should be suppressed because it may affect positioning accuracy and image quality. In the Tokamak environment, real-time vibration information is needed for vibration suppression of the robotic arm, but many sensors are not allowed in this extreme environment. This paper proposes a vision-based method for online vibration estimation of a flexible manipulator, which uses the environment image information from the end-effector camera to estimate the manipulator's vibration. Short-time Fourier transformation with an adaptive window length is used to estimate the vibration parameters of non-stationary vibration signals. Experiments with a one-link flexible manipulator equipped with a camera validate the feasibility of the method.

  8. Vision-based online vibration estimation of the in-vessel inspection flexible robot with short-time Fourier transformation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hesheng [Key Laboratory of System Control and Information Processing, Ministry of Education of China (China); Department of Automation, Shanghai Jiao Tong University, Shanghai 200240 (China); Chen, Weidong, E-mail: wdchen@sjtu.edu.cn [Key Laboratory of System Control and Information Processing, Ministry of Education of China (China); Department of Automation, Shanghai Jiao Tong University, Shanghai 200240 (China); Xu, Lifei; He, Tao [Key Laboratory of System Control and Information Processing, Ministry of Education of China (China); Department of Automation, Shanghai Jiao Tong University, Shanghai 200240 (China)

    2015-10-15

    Highlights: • A vision-based online vibration estimation method for a flexible arm is proposed. • The vibration signal is obtained by image processing in unknown environments. • Vibration parameters are estimated by short-time Fourier transformation. - Abstract: Vibration arising during the motion of a flexible robot, or under external disturbance caused by its structural features and material properties, should be suppressed because it may affect positioning accuracy and image quality. In the Tokamak environment, real-time vibration information is needed for vibration suppression of the robotic arm, but many sensors are not allowed in this extreme environment. This paper proposes a vision-based method for online vibration estimation of a flexible manipulator, which uses the environment image information from the end-effector camera to estimate the manipulator's vibration. Short-time Fourier transformation with an adaptive window length is used to estimate the vibration parameters of non-stationary vibration signals. Experiments with a one-link flexible manipulator equipped with a camera validate the feasibility of the method.
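Both records above describe recovering vibration parameters from an image-derived signal with an adaptive-window STFT; the adaptation rule is not given here, so the sketch below uses a fixed Hann window on a synthetic 50 Hz "vibration" signal to show only the basic estimation step.

```python
import numpy as np

def stft_peak_frequency(x, fs, nperseg=256, hop=128):
    """Estimate the dominant frequency of x via a Hann-windowed STFT.
    (Fixed window length; the paper adapts the window length to the
    non-stationary signal.)"""
    win = np.hanning(nperseg)
    frames = [x[i:i + nperseg] * win
              for i in range(0, len(x) - nperseg + 1, hop)]
    spec = np.abs(np.fft.rfft(frames, axis=1))   # |STFT| per frame
    avg = spec.mean(axis=0)                      # time-averaged spectrum
    k = int(np.argmax(avg))
    return k * fs / nperseg                      # bin index -> Hz

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
vib = np.sin(2 * np.pi * 50 * t)   # synthetic 50 Hz arm vibration
f_est = stft_peak_frequency(vib, fs)
```

The frequency resolution is fs/nperseg; the adaptive window of the paper trades this resolution against time resolution as the vibration evolves.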

  9. Three-factor models versus time series models: quantifying time-dependencies of interactions between stimuli in cell biology and psychobiology for short longitudinal data.

    Science.gov (United States)

    Frank, Till D; Kiyatkin, Anatoly; Cheong, Alex; Kholodenko, Boris N

    2017-06-01

    Signal integration determines cell fate on the cellular level, affects cognitive processes and affective responses on the behavioural level, and is likely to be involved in psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subject to time effects. Time-dependencies of interactions between stimuli typically lead to complex responses on both the cellular and the behavioural level. We show that both three-factor models and time series models can be used to uncover such time-dependencies. However, we argue that for short longitudinal data the three-factor modelling approach is more suitable. In order to illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney 293 (HEK293) cells the interaction effect in the regulation of extracellular signal-regulated kinase (ERK) 1 signalling activation by insulin and epidermal growth factor is subject to a time effect and decays dramatically at peak values of ERK activation. In contrast, we found that the interaction effect induced by hypoxia and tumour necrosis factor-alpha on the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time-invariant, at least in the first 12-h window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies. In these studies, memory storage was found to be subject to an interaction effect of the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor / glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered within a critical time window. © The authors 2016. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.

  10. Quantum computing for physics research

    International Nuclear Information System (INIS)

    Georgeot, B.

    2006-01-01

    Quantum computers hold great promise for the future of computation. In this paper, this new kind of computing device is presented, together with a short survey of the status of research in this field. The principal algorithms are introduced, with an emphasis on the applications of quantum computing to physics. Experimental implementations are also briefly discussed.

  11. Mining biological information from 3D short time-series gene expression data: the OPTricluster algorithm.

    Science.gov (United States)

    Tchagang, Alain B; Phan, Sieu; Famili, Fazel; Shearer, Heather; Fobert, Pierre; Huang, Yi; Zou, Jitao; Huang, Daiqing; Cutler, Adrian; Liu, Ziying; Pan, Youlian

    2012-04-04

    Nowadays, it is possible to collect expression levels of a set of genes from a set of biological samples during a series of time points. Such data have three dimensions: gene-sample-time (GST). Thus they are called 3D microarray gene expression data. To take advantage of the 3D data collected, and to fully understand the biological knowledge hidden in the GST data, novel subspace clustering algorithms have to be developed to effectively address the biological problem in the corresponding space. We developed a subspace clustering algorithm called Order Preserving Triclustering (OPTricluster), for 3D short time-series data mining. OPTricluster is able to identify 3D clusters with coherent evolution from a given 3D dataset using a combinatorial approach on the sample dimension, and the order preserving (OP) concept on the time dimension. The fusion of the two methodologies allows one to study similarities and differences between samples in terms of their temporal expression profile. OPTricluster has been successfully applied to four case studies: immune response in mice infected by malaria (Plasmodium chabaudi), systemic acquired resistance in Arabidopsis thaliana, similarities and differences between inner and outer cotyledon in Brassica napus during seed development, and to Brassica napus whole seed development. These studies showed that OPTricluster is robust to noise and is able to detect the similarities and differences between biological samples. Our analysis showed that OPTricluster generally outperforms other well known clustering algorithms such as the TRICLUSTER, gTRICLUSTER and K-means; it is robust to noise and can effectively mine the biological knowledge hidden in the 3D short time-series gene expression data.
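The order-preserving (OP) concept on the time dimension can be illustrated in isolation: genes whose expression values induce the same permutation of time points share a coherent temporal pattern. A toy sketch (not the full OPTricluster algorithm, which additionally works combinatorially over the sample dimension):

```python
import numpy as np
from collections import defaultdict

def op_groups(expr, genes):
    """Group genes by the rank order of their time profiles
    (the order-preserving concept on the time dimension).
    expr: 2D array, rows = genes, columns = time points."""
    groups = defaultdict(list)
    for name, profile in zip(genes, expr):
        pattern = tuple(np.argsort(profile))  # permutation of time indices
        groups[pattern].append(name)
    return dict(groups)

expr = np.array([
    [1.0, 3.0, 2.0],   # up then down
    [0.2, 0.9, 0.5],   # same ordering -> same OP group
    [5.0, 1.0, 2.0],   # different ordering
])
clusters = op_groups(expr, ['g1', 'g2', 'g3'])
```

Comparing which genes fall into the same OP group under different sample subsets is what lets the algorithm contrast temporal behaviour between biological samples.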

  12. The short-term effects of air pollutants on respiratory disease mortality in Wuhan, China: comparison of time-series and case-crossover analyses

    OpenAIRE

    Meng Ren; Na Li; Zhan Wang; Yisi Liu; Xi Chen; Yuanyuan Chu; Xiangyu Li; Zhongmin Zhu; Liqiao Tian; Hao Xiang

    2017-01-01

    Few studies have compared different methods when exploring the short-term effects of air pollutants on respiratory disease mortality in Wuhan, China. This study assesses the association between air pollutants and respiratory disease mortality with both time-series and time-stratified case-crossover designs. The generalized additive model (GAM) and the conditional logistic regression model were used to assess the short-term effects of air pollutants on respiratory disease mortality. Stratified...

  13. The Educator's Approach to Media Training and Computer Games within the Leisure Time of School-children

    OpenAIRE

    MORAVCOVÁ, Dagmar

    2009-01-01

    The paper describes possible ways of approaching computer game playing as part of the leisure time of school-children and deals with the significance of media training in leisure time. It first specifies the concept of leisure time and its functions, then shows some positive and negative effects of the media. It further describes classical computer games, the problem of excessive computer game playing, and means of prevention. The paper deals with the educator's personality and the importance of ...

  14. Automated Analysis of Short Responses in an Interactive Synthetic Tutoring System for Introductory Physics

    Science.gov (United States)

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-01-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…

  15. Short-time dynamics of random-bond Potts ferromagnet with continuous self-dual quenched disorders

    OpenAIRE

    Pan, Z. Q.; Ying, H. P.; Gu, D. W.

    2001-01-01

    We present Monte Carlo simulation results for the random-bond Potts ferromagnet with the Olson-Young self-dual distribution of quenched disorder in two dimensions. By exploring the short-time scaling dynamics, we find universal power-law critical behavior of the magnetization and Binder cumulant at the critical point, and thus obtain estimates of the dynamic exponent $z$ and magnetic exponent $\eta$, as well as the exponent $\theta$. Our special attention is paid to the dynamic process for the $q...

  16. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    Science.gov (United States)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation essentially is an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies, and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  17. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Science.gov (United States)

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…
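The tool's detection method itself is not described in this record. A common building block for source-code plagiarism detection is comparing token n-gram fingerprints with Jaccard similarity, sketched below; the tokenizer and n = 3 are illustrative choices, not the authors'.

```python
import re

def fingerprints(code, n=3):
    """Token n-grams of a source string, ignoring whitespace layout
    so that trivial reformatting does not hide copying."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", code)
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of the two programs' n-gram sets (0..1)."""
    fa, fb = fingerprints(a, n), fingerprints(b, n)
    return len(fa & fb) / len(fa | fb) if fa | fb else 0.0

original = "def area(r):\n    return 3.14 * r * r\n"
copied   = "def area(x):\n    return 3.14 * x * x\n"   # renamed variable
other    = "def add(a, b):\n    return a + b\n"
```

A real-time system would maintain such fingerprints incrementally as students type and flag pairs whose similarity crosses a chosen threshold; identifier normalization would further defeat simple renaming.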

  18. Instructional Advice, Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2010-01-01

    Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

  19. Measurement of Ultra-Short Solitary Electromagnetic Pulses

    Directory of Open Access Journals (Sweden)

    Eva Gescheidtova

    2004-01-01

    Full Text Available In connection with the events of the last few years and with the increased number of terrorist activities, the problem of identification and measurement of electromagnetic weapons or other systems impact occurred. Among these are also microwave sources, which can reach extensive peak power of up to Pmax = 100 MW. Solitary, in some cases several times repeated, impulses lasting from tp E <1, 60>ns, cause the destruction of semiconductor junctions. These days we can find scarcely no human activity, where semiconductor structures are not used. The problem of security support of the air traffic, transportation, computer nets, banks, national strategic data canter’s, and other applications crops up. Several types of system protection from the ultra-short electromagnetic pulses present itself, passive and active protection. The analysis of the possible measuring methods, convenient for the identification and measurement of the ultra-short solitary electromagnetic pulses in presented in this paper; some of the methods were chosen and used for practical measurement. This work is part of Research object MSM262200022 "Research of microelectronic systems".

  20. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  1. Short- and long-term effects of real-time medication monitoring with short message service (SMS) reminders for missed doses on the refill adherence of people with Type 2 diabetes: evidence from a randomised controlled trial.

    NARCIS (Netherlands)

    Vervloet, M.; Dijk, L. van; Bakker, D.H. de; Souverein, P.C.; Santen-Reestman, J.; Vlijmen, B. van; Aarle, M.C.W. van; Hoek, L.S. van der; Bouvy, M.L.

    2014-01-01

    Aims: To investigate short- and long-term effects of real-time monitoring of medication use combined with short message service (SMS) reminders for missed doses on refill adherence to oral anti-diabetic medication. Methods: A randomized controlled trial with two intervention groups and one control

  2. Spectrogram analysis of selected tremor signals using short-time Fourier transform and continuous wavelet transform

    Directory of Open Access Journals (Sweden)

    D. Seidl

    1999-06-01

    Full Text Available Among a variety of spectrogram methods, the Short-Time Fourier Transform (STFT) and the Continuous Wavelet Transform (CWT) were selected to analyse transients in non-stationary tremor signals. Depending on the properties of the tremor signal, a more suitable representation of the signal is gained by the CWT. Three selected broadband tremor signals from the volcanoes Mt. Stromboli, Mt. Semeru and Mt. Pinatubo were analyzed using both methods. The CWT can also be used to extend the definition of coherency into a time-varying coherency spectrogram. An example is given using array data from the volcano Mt. Stromboli.
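Neither transform's implementation appears in this record; below is a minimal Morlet-wavelet CWT in numpy, used to locate the dominant frequency of a synthetic tremor-like signal. The center frequency w0 = 6 is the conventional choice, and the signal is synthetic, not volcano data.

```python
import numpy as np

def morlet_cwt(x, fs, freqs, w0=6.0):
    """Continuous wavelet transform with a Morlet wavelet, evaluated
    at the given analysis frequencies.  Returns |CWT| with shape
    (len(freqs), len(x))."""
    out = np.empty((len(freqs), len(x)))
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)               # scale for frequency f
        t = np.arange(-4 * s, 4 * s, 1 / fs)   # wavelet support
        psi = np.exp(1j * w0 * t / s) * np.exp(-(t / s) ** 2 / 2)
        psi /= np.sqrt(s)
        # Correlation of x with the conjugate wavelet:
        out[i] = np.abs(np.convolve(x, np.conj(psi[::-1]), mode='same'))
    return out

fs = 200.0
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)                 # 10 Hz "tremor"
freqs = np.arange(5.0, 30.0, 1.0)
power = morlet_cwt(x, fs, freqs).mean(axis=1)  # time-averaged scalogram
f_dominant = freqs[int(np.argmax(power))]
```

Unlike the STFT's fixed window, the wavelet support shrinks with frequency, which is why the CWT often represents transient tremor onsets more faithfully.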

  3. Applying a new computer-aided detection scheme generated imaging marker to predict short-term breast cancer risk

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Hollingsworth, Alan B.; Patel, Bhavika; Heidari, Morteza; Liu, Hong; Zheng, Bin

    2018-05-01

    This study aims to investigate the feasibility of identifying a new quantitative imaging marker, based on false-positives generated by a computer-aided detection (CAD) scheme, to help predict short-term breast cancer risk. An image dataset including four-view mammograms acquired from 1044 women was retrospectively assembled. All mammograms were originally interpreted as negative by radiologists. In the next subsequent mammography screening, 402 women were diagnosed with breast cancer and 642 remained negative. An existing CAD scheme was applied ‘as is’ to process each image. From the CAD-generated results, four detection features, including the total number of (1) initial detection seeds and (2) final detected false-positive regions, and the (3) average and (4) sum of detection scores, were computed from each image. Then, by combining the features computed from the two bilateral images of the left and right breasts in either the craniocaudal or the mediolateral oblique view, two logistic regression models were trained and tested using a leave-one-case-out cross-validation method to predict the likelihood of each testing case being positive in the next subsequent screening. The new prediction model yielded a maximum prediction accuracy with an area under the ROC curve of AUC = 0.65 ± 0.017 and a maximum adjusted odds ratio of 4.49 with a 95% confidence interval of (2.95, 6.83). The results also showed an increasing trend in the adjusted odds ratio and risk prediction scores with increasing short-term breast cancer risk.

  4. Cross-Term Suppression in Time Order Distribution for AWGN Signal

    Directory of Open Access Journals (Sweden)

    WAQAS MAHMOOD

    2017-04-01

    Full Text Available A technique for cross-term suppression in the WD (Wigner Distribution) of a multi-component signal embedded in WGN (White Gaussian Noise) is proposed. In this technique, an optimized algorithm is developed for time-varying noisy signals, and a CAD (Computer Aided Design) simulator is designed for numerical simulation of synthetic signals. In the proposed technique, signal components are localized in the t-f (time-frequency) plane by the STFT (Short-Time Fourier Transform). A rectified STFT is computed, and spectral kurtosis is used to separate signal components from noise in the t-f plane. The t-f plane is segmented, and the signal components are then filtered out by the FrFT (Fractional Fourier Transform). Finally, the WD (now free of cross terms) of each isolated signal component is computed to obtain high resolution in the t-f plane.
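The record names spectral kurtosis as the separator of signal from noise in the t-f plane. A numpy sketch using the standard frame-based estimator SK(f) = <|X|^4> / <|X|^2>^2 - 2, which is approximately 0 for stationary Gaussian noise and tends to -1 for a constant-amplitude sinusoid (the exact estimator and thresholds used by the authors are not given here):

```python
import numpy as np

def spectral_kurtosis(x, nperseg=128):
    """Spectral kurtosis per frequency bin from non-overlapping
    Hann-windowed STFT frames.  SK ~ 0 for Gaussian noise, ~ -1 for a
    deterministic sinusoid, > 0 for transients."""
    win = np.hanning(nperseg)
    nframes = len(x) // nperseg
    frames = x[:nframes * nperseg].reshape(nframes, nperseg) * win
    X = np.fft.rfft(frames, axis=1)
    p2 = np.mean(np.abs(X) ** 2, axis=0)
    p4 = np.mean(np.abs(X) ** 4, axis=0)
    return p4 / p2 ** 2 - 2

rng = np.random.default_rng(0)
fs, nperseg = 1024.0, 128
t = np.arange(0, 64, 1 / fs)                 # 64 s -> 512 frames
x = np.sin(2 * np.pi * 64 * t) + 0.5 * rng.standard_normal(len(t))
sk = spectral_kurtosis(x, nperseg)
tone_bin = int(64 * nperseg / fs)            # bin 8 holds the sinusoid
```

Thresholding SK then yields the t-f mask from which components can be segmented and filtered, as the abstract describes.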

  5. Alternative majority-voting methods for real-time computing systems

    Science.gov (United States)

    Shin, Kang G.; Dolter, James W.

    1989-01-01

    Two techniques that provide a compromise between the high time overhead in maintaining synchronous voting and the difficulty of combining results in asynchronous voting are proposed. These techniques are specifically suited for real-time applications with a single-source/single-sink structure that need instantaneous error masking. They provide a compromise between a tightly synchronized system in which the synchronization overhead can be quite high, and an asynchronous system which lacks suitable algorithms for combining the output data. Both quorum-majority voting (QMV) and compare-majority voting (CMV) are most applicable to distributed real-time systems with single-source/single-sink tasks. All real-time systems eventually have to resolve their outputs into a single action at some stage. The development of the advanced information processing system (AIPS) and other similar systems serve to emphasize the importance of these techniques. Time bounds suggest that it is possible to reduce the overhead for quorum-majority voting to below that for synchronous voting. All the bounds assume that the computation phase is nonpreemptive and that there is no multitasking.
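QMV and CMV themselves are not specified in this record; the baseline both techniques refine, exact-match majority voting over replicated task outputs, can be sketched as follows (the quorum size is an assumption):

```python
from collections import Counter

def majority_vote(outputs, quorum=None):
    """Mask errors by majority: return the value produced by at least
    `quorum` replicas (default: strict majority), else None to signal
    that no reliable output exists."""
    if quorum is None:
        quorum = len(outputs) // 2 + 1
    value, count = Counter(outputs).most_common(1)[0]
    return value if count >= quorum else None

# One faulty replica out of three is masked:
assert majority_vote([42, 42, 41]) == 42
# Two simultaneous faults defeat a triplex system:
assert majority_vote([42, 41, 40]) is None
```

The synchronization question the paper addresses is orthogonal to this voter: QMV proceeds once a quorum of replicas has responded, while CMV compares asynchronously arriving results, avoiding the cost of keeping all replicas in lock-step.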

  6. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real-time computer systems are increasingly used for safety-critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control, and control logic for fuel handling systems. The most frequent cause of faults in safety-critical real-time computer systems is traced to fuzziness in the requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety-critical real-time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can easily be designed from the model of the requirement specification. Z and B are popular languages used for modeling requirement specifications. A typical safety-critical real-time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring safety are summarized
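As an illustration of why an executable model removes fuzziness (this is not the paper's Z/B model, and the threshold and window are hypothetical values): a requirement such as "trip when coolant flow stays below the threshold for k consecutive samples" can be stated as a predicate from which test data follow directly.

```python
def trip_required(flow_samples, threshold, k):
    """Executable form of the requirement: TRIP iff some run of k
    consecutive samples is strictly below the threshold.  Stated this
    way, the requirement is unambiguous and doubles as a test oracle."""
    run = 0
    for f in flow_samples:
        run = run + 1 if f < threshold else 0
        if run >= k:
            return True
    return False

# Test data derived from the model (threshold 100.0, k = 3):
assert trip_required([99, 98, 97], 100.0, 3)           # blockage: trip
assert not trip_required([99, 101, 99, 99], 100.0, 3)  # run interrupted
```

Informal prose ("low flow for some time") admits several readings; the predicate admits exactly one, which is the property the abstract attributes to formal specification.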

  7. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  8. Timing intervals using population synchrony and spike timing dependent plasticity

    Directory of Open Access Journals (Sweden)

    Wei Xu

    2016-12-01

    Full Text Available We present a computational model by which ensembles of regularly spiking neurons can encode different time intervals through synchronous firing. We show that a neuron responding to a large population of convergent inputs has the potential to learn to produce an appropriately-timed output via spike-time dependent plasticity. We explain why temporal variability of this population synchrony increases with increasing time intervals. We also show that the scalar property of timing and its violation at short intervals can be explained by the spike-wise accumulation of jitter in the inter-spike intervals of timing neurons. We explore how the challenge of encoding longer time intervals can be overcome and conclude that this may involve a switch to a different population of neurons with lower firing rate, with the added effect of producing an earlier bias in response. Experimental data on human timing performance show features in agreement with the model’s output.

  9. Computational Challenge of Fractional Differential Equations and the Potential Solutions: A Survey

    Directory of Open Access Journals (Sweden)

    Chunye Gong

    2015-01-01

    Full Text Available We present a survey of fractional differential equations, and in particular of the computational cost of their numerical solution, from the point of view of computer science. The computational complexities of time-fractional, space-fractional, and space-time fractional equations are O(N²M), O(NM²), and O(NM(M + N)), compared with O(MN) for classical partial differential equations with finite difference methods, where M and N are the numbers of space grid points and time steps, respectively. The potential solutions for this challenge include, but are not limited to, parallel computing, memory access optimization (a fractional precomputing operator), the short memory principle, fast Fourier transform (FFT) based solutions, the alternating direction implicit method, the multigrid method, and preconditioner technology. The relationships of these solutions for both the space fractional derivative and the time fractional derivative are discussed. The authors point out that the technologies of parallel computing should be regarded as a basic method to overcome this challenge, and that some attention should be paid to fractional killer applications, high-performance iteration methods, high-order schemes, and Monte Carlo methods. Since the computation of fractional equations with high dimension and variable order is even heavier, researchers from the areas of mathematics and computer science have the opportunity to invent cornerstones in the area of fractional calculus.
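The cost asymmetry the survey describes comes from the non-local history sum of fractional derivatives; the short memory principle it lists truncates that sum to the most recent L terms, reducing O(N²) work to O(NL) at some accuracy cost. A sketch using the Grünwald-Letnikov definition:

```python
def gl_coefficients(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)^k * C(alpha, k), computed
    by the standard recurrence w_k = w_{k-1} * (1 - (alpha + 1)/k)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1 - (alpha + 1) / k))
    return w

def gl_derivative(f, h, alpha, memory=None):
    """Fractional derivative of order alpha of sampled f on a uniform
    grid of step h.  `memory` limits the history sum (short memory
    principle); None keeps the full O(N^2) sum."""
    n = len(f)
    w = gl_coefficients(alpha, n)
    out = []
    for i in range(n):
        lo = 0 if memory is None else max(0, i - memory + 1)
        s = sum(w[k] * f[i - k] for k in range(i - lo + 1))
        out.append(s / h ** alpha)
    return out
```

As sanity checks, alpha = 0 reproduces f itself and alpha = 1 reduces to the backward difference, since the weights collapse to (1) and (1, -1, 0, ...) respectively.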

  10. Computer simulations of disordering kinetics in irradiated intermetallic compounds

    International Nuclear Information System (INIS)

    Spaczer, M.; Caro, A.; Victoria, M.; Diaz de la Rubia, T.

    1994-01-01

    Molecular-dynamics computer simulations of collision cascades in intermetallic Cu3Au, Ni3Al, and NiAl have been performed to study the nature of the disordering processes in the collision cascade. The choice of these systems was suggested by the quite accurate description of their thermodynamic properties obtained using embedded-atom-type potentials. Since melting occurs in the core of the cascades, interesting effects appear as a result of the superposition of the loss (and subsequent recovery) of crystalline order and the evolution of chemical order, the two processes developing on different time scales. In our previous simulations on Ni3Al and Cu3Au [T. Diaz de la Rubia, A. Caro, and M. Spaczer, Phys. Rev. B 47, 11483 (1993)] we found a significant difference between the time evolution of the chemical short-range order (SRO) and the crystalline order in the cascade core for both alloys, namely the complete loss of the crystalline structure but only partial chemical disordering. Recent computer simulations of NiAl show the same phenomena. To understand these features we study the liquid phase of these three alloys and present simulation results concerning the dynamical melting of small samples, examining the atomic mobility, the relaxation time, and the saturation value of the chemical short-range order. An analytic model for the time evolution of the SRO is given.

  11. A full automatic system controlled with IBM-PC/XT micro-computer for neutron activation analysis

    International Nuclear Information System (INIS)

    Song Quanxun

    1992-01-01

    A fully automatic system controlled by micro-computers for NAA is described. All processes are completed automatically by an IBM-PC/XT micro-computer. The device is stable, reliable, flexible and convenient to use, and has many functions and applications in the automatic analysis of long-, medium- and short-lived nuclides. Owing to the high working efficiency of the instrument and micro-computers, both time and power are saved. The method can also be applied to other nuclear analysis techniques.

  12. Improved Multiscale Entropy Technique with Nearest-Neighbor Moving-Average Kernel for Nonlinear and Nonstationary Short-Time Biomedical Signal Analysis

    Directory of Open Access Journals (Sweden)

    S. P. Arunachalam

    2018-01-01

    Full Text Available Analysis of biomedical signals can yield invaluable information for prognosis, diagnosis, therapy evaluation, risk assessment, and disease prevention, but such signals are often recorded as short time series that challenge existing complexity classification algorithms such as Shannon entropy (SE) and other techniques. The purpose of this study was to improve the previously developed multiscale entropy (MSE) technique by incorporating a nearest-neighbor moving-average kernel, which can be used for the analysis of nonlinear and nonstationary short time series of physiological data. The approach was tested for robustness with respect to noise using simulated sinusoidal and ECG waveforms. The feasibility of MSE to discriminate between normal sinus rhythm (NSR) and atrial fibrillation (AF) was tested on a single-lead ECG. In addition, the MSE algorithm was applied to identify pivot points of rotors that were induced in ex vivo isolated rabbit hearts. The improved MSE technique robustly estimated the complexity of the signal compared to that of SE under various noises, discriminated NSR and AF on single-lead ECG, and precisely identified the pivot points of ex vivo rotors by providing better contrast between the rotor core and the peripheral region. The improved MSE technique can provide efficient complexity analysis of a variety of nonlinear and nonstationary short-time biomedical signals.
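
As a rough sketch of the multiscale entropy pipeline the study builds on (coarse-grain the series, then compute sample entropy at each scale), the following uses classical non-overlapping coarse-graining plus a plain moving-average kernel standing in for the paper's nearest-neighbor moving-average kernel, whose exact form is not reproduced here; parameters follow the common m = 2, r = 0.2·SD convention:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that sequences
    matching for m points (within tolerance r) also match for m + 1."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def match_pairs(mm):
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)
        # Chebyshev distance between every pair of template vectors
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        n = len(emb)
        return ((d <= r).sum() - n) / 2          # exclude self-matches

    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales, m=2, moving_average=False):
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std()                             # tolerance fixed at scale 1
    out = []
    for tau in scales:
        if moving_average:
            # plain moving-average kernel (an assumption standing in for
            # the paper's nearest-neighbor moving-average kernel)
            y = np.convolve(x, np.ones(tau) / tau, mode="valid")[::tau]
        else:
            # classical non-overlapping coarse-graining
            y = x[: len(x) // tau * tau].reshape(-1, tau).mean(axis=1)
        out.append(sample_entropy(y, m, r))
    return out

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)
mse = multiscale_entropy(white, scales=[1, 2, 4])
print(mse)   # for white noise, entropy falls with increasing scale
```

Keeping the tolerance r fixed at its scale-1 value reproduces the classic behavior that distinguishes white noise (entropy decreasing with scale) from more structured signals.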

  13. Practical clinical applications of the computer in nuclear medicine

    International Nuclear Information System (INIS)

    Price, R.R.; Erickson, J.J.; Patton, J.A.; Jones, J.P.; Lagan, J.E.; Rollo, F.D.

    1978-01-01

    The impact of the computer on the practice of nuclear medicine has been felt primarily in the area of rapid dynamic studies. At this time it is difficult to find a clinic which routinely performs computer processing of static images. The general-purpose digital computer is a sophisticated and flexible instrument. The number of applications for which one can use the computer to augment data acquisition, analysis, or display is essentially unlimited. In this light, the purpose of this exhibit is not to describe all possible applications of the computer in nuclear medicine but rather to illustrate those applications which have generally been accepted as practical in the routine clinical environment. Specifically, we have chosen examples of computer-augmented cardiac and renal function studies as well as examples of relative organ blood flow studies. In addition, a short description of basic computer components and terminology, along with a few examples of non-imaging applications, is presented.

  14. Geometric patterns of time-delay plots from different cardiac rhythms and arrhythmias using short-term EKG signals.

    Science.gov (United States)

    Borracci, Raúl A; Montoya Pulvet, José D; Ingino, Carlos A; Fitz Maurice, Mario; Hirschon Prado, Alfredo; Dominé, Enrique

    2017-12-27

    To date, no systematic work has attempted to describe the spatio-temporal patterns of cardiac rhythms using only short series of RR intervals, to facilitate visual or computer-aided identification of EKG motifs for use in clinical practice. The aim of this study was to detect and classify eye-catching geometric patterns of Poincaré time-delay plots from different types of cardiac rhythms and arrhythmias using short-term EKG signals. Approximately 150-300 representative, consecutive beats were retrieved from 24-h Holter registers of 100 patients with different heart rhythms. Two-dimensional Poincaré charts were created, and the resulting geometric patterns were transformed into representative, familiar, eye-catching drawings to interpret different arrhythmias. Poincaré plot representation of RR interval data revealed a wide variety of visual patterns: (i) comet-shaped for sinus rhythm; (ii) torpedo-shaped for sinus bradycardia; (iii) cigarette-shaped for sinus tachycardia; (iv) butterfly-shaped for sinus tachycardia and isolated atrial premature complexes; (v) arrow-shaped for isolated premature complexes and inappropriate sinus tachycardia; (vi) inverted fan-shaped for sinus rhythm with frequent atrial premature complexes; (vii) tornado-shaped for atrial flutter and atrial tachycardia; and (viii) fan-shaped for atrial fibrillation. Modified Poincaré plots with smoothed lines connecting successive points could accurately classify different types of arrhythmias based on short RR interval sequence variability. Characteristic emergent patterns can be visually identified and eventually could be distinguished by an automatic classification system able to discern between arrhythmias. This work provides an alternative method to interpret time-delay plots obtained from short-term EKG signal recordings. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
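
The time-delay (Poincaré) plot underlying these patterns is simply each RR interval plotted against the next; its elongation along and across the identity line is commonly summarized by the SD2 and SD1 descriptors. A minimal sketch with synthetic RR series (the regular and AF-like series below are illustrative, not patient data):

```python
import numpy as np

def poincare_descriptors(rr):
    """SD1/SD2 of the Poincaré (lag-1 return) plot of RR intervals.

    Each beat pair (RR_n, RR_n+1) is one point; SD1 measures spread
    perpendicular to the identity line (short-term variability), SD2
    measures spread along it (long-term variability)."""
    rr = np.asarray(rr, dtype=float)
    x, y = rr[:-1], rr[1:]
    perp = (y - x) / np.sqrt(2.0)    # distance perpendicular to y = x
    along = (y + x) / np.sqrt(2.0)   # position along y = x
    return perp.std(), along.std()

# regular sinus-like rhythm: points hug the identity line (comet/torpedo)
regular = 800 + 10 * np.sin(np.linspace(0, 20, 300))
# irregular AF-like rhythm: points scatter widely (fan)
rng = np.random.default_rng(2)
irregular = rng.uniform(500, 1100, 300)

sd1_reg, sd2_reg = poincare_descriptors(regular)
sd1_af, sd2_af = poincare_descriptors(irregular)
```

A comet or torpedo shape corresponds to SD1 much smaller than SD2, while the fan of atrial fibrillation has SD1 and SD2 of comparable, large magnitude.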

  15. The relationship between TV/computer time and adolescents' health-promoting behavior: a secondary data analysis.

    Science.gov (United States)

    Chen, Mei-Yen; Liou, Yiing-Mei; Wu, Jen-Yee

    2008-03-01

    Television and computers provide significant benefits for learning about the world. Some studies have linked excessive television (TV) watching or computer game playing to poorer health status or unhealthy behaviors among adolescents. However, studies relating TV watching and computer game playing to adolescents' adoption of health-promoting behavior are limited. This study aimed to examine the relationship between time spent watching TV and on leisure use of computers and adolescents' health-promoting behavior, and associated factors. This paper used secondary data analysis from part of a health promotion project in Taoyuan County, Taiwan. A cross-sectional design was used and purposive sampling was conducted among adolescents in the original project. A total of 660 participants answered the questions appropriately for this work between January and June 2004. Findings showed the mean age of the respondents was 15.0 ± 1.7 years. The mean number of TV watching hours was 2.28 on weekdays and 4.07 on weekends. The mean number of hours of leisure (non-academic) computer use was 1.64 on weekdays and 3.38 on weekends. Results indicated that adolescents spent significant time watching TV and using the computer, which was negatively associated with adopting health-promoting behaviors such as life appreciation, health responsibility, social support and exercise behavior. Moreover, being a boy, being overweight, living in a rural area, and being a middle-school student were significantly associated with spending long periods watching TV and using the computer. Therefore, primary health care providers should record the TV and non-academic computer time of youths when conducting health promotion programs, and educate parents on how to become good and healthy electronic media users.

  16. A high rate gamma spectroscopy system for activation analysis of short lived isomeric transitions

    Energy Technology Data Exchange (ETDEWEB)

    Westphal, G P [Atominstitut, Vienna (Austria)

    1976-07-01

    A high rate spectroscopy system specially suited for measurement of short-lived isomeric transitions is described, which, as part of a fast activation analysis facility at the TRIGA Mark II reactor, provides for automatic recording and immediate evaluation of gamma spectra taken from nuclides activated at stationary or pulsed reactor power. The system consists of a commercial DC-coupled Ge(Li)-detector of 70 cm³ modified for recycling operation for input rates in excess of 500,000 c/s Co-60, a time variant trapezoidal shaping section and a fast constant dead-time ADC coupled to a programmed multi-channel analyzer. Novel circuits for efficient pile-up rejection and time variant base line restoration extend the concept of gated integration up to count rates of more than 300,000 c/s Co-60. Time-sequenced recording of spectra is performed by a mini computer operated as a front-end processor of a larger laboratory computer, where final data processing takes place. New concepts for very simple and cost-effective implementation of multi-channel analyzers by means of general purpose small computers are described. (author)

  17. Joint Time-Frequency-Space Classification of EEG in a Brain-Computer Interface Application

    Directory of Open Access Journals (Sweden)

    Molina Gary N Garcia

    2003-01-01

    Full Text Available Brain-computer interface is a growing field of interest in human-computer interaction, with diverse applications ranging from medicine to entertainment. In this paper, we present a system which allows for classification of mental tasks based on a joint time-frequency-space decorrelation, in which mental tasks are measured via electroencephalogram (EEG) signals. The efficiency of this approach was evaluated by means of real-time experiments on two subjects performing three different mental tasks. To do so, a number of protocols for visualization, as well as training with and without feedback, were also developed. The results obtained show that it is possible to achieve good classification of simple mental tasks, in view of command and control, after a relatively small amount of training, with accuracies around 80%, and in real time.

  18. Computational model for real-time determination of tritium inventory in a detritiation installation

    International Nuclear Information System (INIS)

    Bornea, Anisia; Stefanescu, Ioan; Zamfirache, Marius; Stefan, Iuliana; Sofalca, Nicolae; Bidica, Nicolae

    2008-01-01

    Full text: At ICIT Rm. Valcea an experimental pilot plant was built, its main objective being the development of a technology for the detritiation of heavy water processed in the CANDU-type reactors of the nuclear power plant at Cernavoda, Romania. Since the safeguards and safety aspects of such a detritiation installation are of great importance, a complex computational model has been developed. The model allows real-time calculation of the tritium inventory in a working installation. The detritiation technology applied is catalyzed isotopic exchange coupled with cryogenic distillation. Computational models for non-steady working conditions have been developed for each isotopic exchange process. By coupling these processes, the tritium inventory can be determined in real time. The computational model was developed based on the experience gained with the pilot installation. The model uses a set of parameters specific to the isotopic exchange processes, which were determined experimentally in the pilot installation. The model is included in the monitoring system and uses as input data the parameters acquired in real time from the automation system of the pilot installation. A friendly interface has been created to visualize the final results as data or graphs. (authors)

  19. Short-distance expansion for the electromagnetic half-space Green's tensor: general results and an application to radiative lifetime computations

    International Nuclear Information System (INIS)

    Panasyuk, George Y; Schotland, John C; Markel, Vadim A

    2009-01-01

    We obtain a short-distance expansion for the half-space, frequency-domain electromagnetic Green's tensor. The small parameter of the theory is ωε1L/c, where ω is the frequency, ε1 is the permittivity of the upper half-space, in which both the source and the point of observation are located, and which is assumed to be transparent, c is the speed of light in vacuum, and L is a characteristic length, defined as the distance from the point of observation to the reflected (with respect to the planar interface) position of the source. In the case when the lower half-space (the substrate) is characterized by a complex permittivity ε2, we compute the expansion to third order. For the case when the substrate is a transparent dielectric, we compute the imaginary part of the Green's tensor to seventh order. The analytical calculations are verified numerically. The practical utility of the obtained expansion is demonstrated by computing the radiative lifetime of two electromagnetically interacting molecules in the vicinity of a transparent dielectric substrate. The computation is performed in the strong interaction regime, where the quasi-particle pole approximation is inapplicable. In this regime, the integral representation for the half-space Green's tensor is difficult to use, while its electrostatic limiting expression is grossly inadequate. However, the analytical expansion derived in this paper can be used directly and efficiently. The results of this study are also relevant to nano-optics and near-field imaging, especially when tomographic image reconstruction is involved.

  20. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator is presented in this paper. The feasibility prototype was established with the existing software of the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach enables computer-based predictive simulation, owing both to the remarkable improvement in real-time performance and to the reduced effort required for real-time implementation under standard PC hardware and Real-Time Linux environments.

  1. An accident diagnosis algorithm using long short-term memory

    Directory of Open Access Journals (Sweden)

    Jaemin Yang

    2018-05-01

    Full Text Available Accident diagnosis is one of the complex tasks for nuclear power plant (NPP) operators. In abnormal or emergency situations, diagnosing the NPP state is burdensome though necessary. Numerous computer-based methods and operator support systems have been suggested to address this problem. Among them, the recurrent neural network (RNN) has performed well at analyzing time series data. This study proposes an algorithm for accident diagnosis using long short-term memory (LSTM), a kind of RNN that mitigates the limitations of standard RNNs in reflecting long-term temporal dependencies. The algorithm consists of preprocessing, the LSTM network, and postprocessing. In the LSTM-based algorithm, preprocessed input variables are processed to output the accident diagnosis results. The outputs are then postprocessed using softmax to rank the accident diagnosis results by probability. The algorithm was trained using a compact nuclear simulator for several accidents: a loss of coolant accident, a steam generator tube rupture, and a main steam line break. The trained algorithm was also tested to demonstrate the feasibility of diagnosing NPP accidents. Keywords: Accident Diagnosis, Long Short-term Memory, Recurrent Neural Network, Softmax
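
The softmax postprocessing step described above, which converts raw network outputs into a probability ranking over accident classes, can be sketched independently of the LSTM itself (the class labels and logit values below are hypothetical, not taken from the paper):

```python
import numpy as np

def softmax(z):
    z = np.asarray(z, dtype=float)
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

def rank_diagnoses(logits, labels):
    """Turn raw network outputs into a probability-ranked diagnosis list."""
    p = softmax(logits)
    order = np.argsort(p)[::-1]
    return [(labels[i], float(p[i])) for i in order]

# hypothetical logits for the three trained accident classes
labels = ["LOCA", "SGTR", "MSLB"]
ranking = rank_diagnoses([2.1, 0.3, -1.2], labels)
print(ranking)
```

Presenting the full ranking with probabilities, rather than only the top class, lets an operator see how confidently the network separates competing diagnoses.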

  2. Genometa--a fast and accurate classifier for short metagenomic shotgun reads.

    Science.gov (United States)

    Davenport, Colin F; Neugebauer, Jens; Beckmann, Nils; Friedrich, Benedikt; Kameri, Burim; Kokott, Svea; Paetow, Malte; Siekmann, Björn; Wieding-Drewes, Matthias; Wienhöfer, Markus; Wolf, Stefan; Tümmler, Burkhard; Ahlers, Volker; Sprengel, Frauke

    2012-01-01

    Metagenomic studies use high-throughput sequence data to investigate microbial communities in situ. However, considerable challenges remain in the analysis of these data, particularly with regard to speed and reliable analysis of microbial species as opposed to higher level taxa such as phyla. We here present Genometa, a computationally undemanding graphical user interface program that enables identification of bacterial species and gene content from datasets generated by inexpensive high-throughput short read sequencing technologies. Our approach was first verified on two simulated metagenomic short read datasets, detecting 100% and 94% of the bacterial species included with few false positives or false negatives. Subsequent comparative benchmarking analysis against three popular metagenomic algorithms on an Illumina human gut dataset revealed Genometa to attribute the most reads to bacteria at species level (i.e. including all strains of that species) and demonstrate similar or better accuracy than the other programs. Lastly, speed was demonstrated to be many times that of BLAST due to the use of modern short read aligners. Our method is highly accurate if bacteria in the sample are represented by genomes in the reference sequence but cannot find species absent from the reference. This method is one of the most user-friendly and resource efficient approaches and is thus feasible for rapidly analysing millions of short reads on a personal computer. The Genometa program, a step by step tutorial and Java source code are freely available from http://genomics1.mh-hannover.de/genometa/ and on http://code.google.com/p/genometa/. This program has been tested on Ubuntu Linux and Windows XP/7.
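
Genometa delegates the actual matching to modern short-read aligners; purely as a toy illustration of reference-based read classification, a minimal exact k-mer voting scheme looks like this (the "genomes", species names, and k are made up for the example):

```python
from collections import Counter

def build_index(references, k=8):
    """Map every k-mer in each reference genome to the species it came from."""
    index = {}
    for species, genome in references.items():
        for i in range(len(genome) - k + 1):
            index.setdefault(genome[i:i + k], set()).add(species)
    return index

def classify_read(read, index, k=8):
    """Vote over the species whose reference k-mers the read hits."""
    votes = Counter()
    for i in range(len(read) - k + 1):
        for species in index.get(read[i:i + k], ()):
            votes[species] += 1
    return votes.most_common(1)[0][0] if votes else None

refs = {  # toy "genomes" (hypothetical sequences)
    "sp_A": "ACGTACGTGGTTAACCGGATCGATCGTACGT",
    "sp_B": "TTGGCCAATTGGCATCATCATGGCCAATTGG",
}
index = build_index(refs)
print(classify_read("GGTTAACCGGATCGAT", index))   # a read drawn from sp_A
```

Like Genometa, such a lookup scheme is fast and accurate only when the sampled organisms are represented in the reference set: a read from an absent species returns no votes.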

  3. Genometa--a fast and accurate classifier for short metagenomic shotgun reads.

    Directory of Open Access Journals (Sweden)

    Colin F Davenport

    Full Text Available Metagenomic studies use high-throughput sequence data to investigate microbial communities in situ. However, considerable challenges remain in the analysis of these data, particularly with regard to speed and reliable analysis of microbial species as opposed to higher level taxa such as phyla. We here present Genometa, a computationally undemanding graphical user interface program that enables identification of bacterial species and gene content from datasets generated by inexpensive high-throughput short read sequencing technologies. Our approach was first verified on two simulated metagenomic short read datasets, detecting 100% and 94% of the bacterial species included with few false positives or false negatives. Subsequent comparative benchmarking analysis against three popular metagenomic algorithms on an Illumina human gut dataset revealed Genometa to attribute the most reads to bacteria at species level (i.e. including all strains of that species) and demonstrate similar or better accuracy than the other programs. Lastly, speed was demonstrated to be many times that of BLAST due to the use of modern short read aligners. Our method is highly accurate if bacteria in the sample are represented by genomes in the reference sequence but cannot find species absent from the reference. This method is one of the most user-friendly and resource efficient approaches and is thus feasible for rapidly analysing millions of short reads on a personal computer. The Genometa program, a step by step tutorial and Java source code are freely available from http://genomics1.mh-hannover.de/genometa/ and on http://code.google.com/p/genometa/. This program has been tested on Ubuntu Linux and Windows XP/7.

  4. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  5. Effect of low-temperature long-time and high-temperature short-time blanching and frying treatments on the French fry quality of six Irish potato cultivars

    OpenAIRE

    Ngobese, Nomali Ziphorah; Workneh, Tilahun Seyoum; Siwela, Muthulisi

    2017-01-01

    Processing conditions are an important determinant of French fry quality. However, the effect of low-temperature long-time (LTLT) and high-temperature short-time (HTST) blanching and frying treatments has not been investigated in many cultivars. The current study investigates the effect of the sequential application of these treatments on French fries processed from six Irish potato cultivars (Fianna, Innovator, Mondial, Navigator, Panamera and Savanna). Blanching was effected at 75 °C for 10...

  6. Computational imaging with multi-camera time-of-flight systems

    KAUST Repository

    Shrestha, Shikhar

    2016-07-11

    Depth cameras are a ubiquitous technology used in a wide range of applications, including robotic and machine vision, human computer interaction, autonomous vehicles as well as augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows for the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.

  7. A neural computational model for animal's time-to-collision estimation.

    Science.gov (United States)

    Wang, Ling; Yao, Dezhong

    2013-04-17

    The time-to-collision (TTC) is the time elapsed before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretical formula for TTC is 1/τ ≈ θ'/sin θ, where θ and θ' are the visual angle and its rate of change, respectively, and the widely used approximate computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new, simpler computational model: 1/τ ≈ Mθ - P/(θ + Q) + N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, the weighted summation of visual angle model (WSVAM), can be implemented exactly by a widely accepted biological neuronal model. WSVAM has additional merits, including naturally minimal consumption and simplicity. Thus, it yields a precise, neuronally implementable estimation of TTC, which provides a simple and convenient implementation for artificial vision and represents a potential visual brain mechanism.
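
The two classical measures quoted above can be checked numerically: for a sphere of radius R approaching at constant speed v, the full visual angle is θ = 2·arctan(R/d), and both θ/θ' and sin θ/θ' recover the true TTC d/v (the latter exactly, for this geometry). A small sketch with assumed values for R, d and v (the WSVAM constants M, P, Q, N are fitted in the paper and not reproduced here):

```python
import math

def visual_angle(R, d):
    # full angular size of a sphere of radius R seen at distance d
    return 2.0 * math.atan(R / d)

R, d0, v = 0.5, 50.0, 10.0         # radius (m), distance (m), approach speed (m/s)
true_ttc = d0 / v                  # ground truth: 5.0 s

dt = 1e-3                          # finite-difference step for theta'
theta = visual_angle(R, d0)
theta_dot = (visual_angle(R, d0 - v * dt) - theta) / dt

ttc_small_angle = theta / theta_dot        # widely used theta/theta' model
ttc_sine = math.sin(theta) / theta_dot     # from 1/tau = theta'/sin(theta)
print(true_ttc, ttc_small_angle, ttc_sine)
```

At this distance the object subtends only about 0.02 rad, so the small-angle model θ/θ' is already accurate to a fraction of a percent; the two measures diverge as the object looms large.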

  8. Organization of the secure distributed computing based on multi-agent system

    Science.gov (United States)

    Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera

    2018-04-01

    Nowadays, the development of methods for distributed computing receives much attention. One such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can face security threats posed by the computational processes themselves. The authors have developed a unified agent algorithm for a control system governing the operation of computing network nodes, with networked PCs used as the computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks by creating a distributed computing system. Agents deployed on a computer network can configure the distributed computing system, distribute the computational load among the computers they operate, and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers can be increased by connecting new computers to the system, which increases the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computing. This organization of the distributed computing system reduces problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (dynamic change of the number of computers on the network). The developed multi-agent system detects cases of falsification of the results in the distributed system, which could otherwise lead to wrong decisions. In addition, the system checks and corrects wrong results.
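
One way to realize the falsification check described above is to run every task on two different nodes and accept a result only when the two copies agree. A minimal thread-based sketch (the node functions, including the deliberately faulty one, are invented for illustration; real nodes would be networked PCs managed by agents):

```python
from concurrent.futures import ThreadPoolExecutor

def honest_node(task):
    return task * task             # the real computation

def faulty_node(task):
    return task * task + 1         # a node returning falsified results

def run_redundant(tasks, nodes):
    """Run every task on two different nodes and cross-check the answers,
    a duplication-based scheme for detecting falsified results."""
    accepted, suspect = {}, []
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        for i, task in enumerate(tasks):
            a, b = nodes[i % len(nodes)], nodes[(i + 1) % len(nodes)]
            ra, rb = pool.submit(a, task), pool.submit(b, task)
            if ra.result() == rb.result():
                accepted[task] = ra.result()
            else:
                suspect.append(task)   # disagreement: flag for recomputation
    return accepted, suspect

tasks = list(range(6))
accepted, suspect = run_redundant(tasks, [honest_node, honest_node, faulty_node])
print(accepted, suspect)
```

Duplication doubles the computational cost, so practical systems typically duplicate only a sample of tasks or escalate to a third node when a disagreement is detected.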

  9. Computational Science Research in Support of Petascale Electromagnetic Modeling

    International Nuclear Information System (INIS)

    Lee, L.-Q.

    2008-01-01

    Computational science research components were vital parts of the SciDAC-1 accelerator project and continue to play a critical role in the newly funded SciDAC-2 accelerator project, the Community Petascale Project for Accelerator Science and Simulation (ComPASS). Recent advances and achievements in computational science research in support of petascale electromagnetic modeling for accelerator design analysis are presented, including shape determination of superconducting RF cavities, a mesh-based multilevel preconditioner for solving highly indefinite linear systems, a moving window using h- or p-refinement for time-domain short-range wakefield calculations, and improved scalable application I/O.

  10. First Isochronous Time-of-Flight Mass Measurements of Short-Lived Projectile Fragments in the ESR

    International Nuclear Information System (INIS)

    Stadlmann, J.; Geissel, H.; Hausmann, M.; Nolden, F.; Radon, T.; Schatz, H.; Scheidenberger, C.; Attallah, F.; Beckert, K.; Bosch, F.; Falch, M.; Franczak, B.; Franzke, B.; Kerscher, Th.; Klepper, O.; Kluge, H.J.; Kozhuharov, C.; Loebner, K.E.G.; Muenzenberg, G.; Novikov, Yu.N.; Steck, M.; Sun, Z.; Suemmerer, K.; Weick, H.; Wollnik, H.

    2000-01-01

    A new method for precise mass measurements of short-lived hot nuclei is presented. These nuclei were produced via projectile fragmentation, separated with the FRS and injected into the storage ring ESR operated in the isochronous mode. The revolution time of the ions is measured with a time-of-flight detector sensitive to single particles. This new method allows access to exotic nuclei with half-lives in the microsecond region. First results from this novel method, obtained with measurements on neutron-deficient fragments of a chromium primary beam with half-lives down to 50 ms, are reported. A precision of Δm/m ≤ 5·10⁻⁶ has been achieved.

  11. Metabolic changes in the normal ageing brain: Consistent findings from short and long echo time proton spectroscopy

    International Nuclear Information System (INIS)

    Gruber, S.; Pinker, K.; Riederer, F.; Chmelik, M.; Stadlbauer, A.; Bittsansky, M.; Mlynarik, V.; Frey, R.; Serles, W.; Bodamer, O.; Moser, E.

    2008-01-01

    Objectives: Sixty-three healthy subjects were measured to assess the dependence of brain metabolites on age using short and long echo time spectroscopy in different brain regions. Material and methods: Younger and elderly humans were measured with long echo time (TE = 135 ms) 3D-MR-spectroscopic imaging (MRSI) (10 subjects) and with ultra-short echo time (TE = 11 ms) 2D-MRSI (7 subjects). In addition, results from single voxel 1H-spectroscopy (TE = 20 ms) of two cohorts of 46 healthy subjects were retrospectively correlated with age. Results: 3D-MRSI revealed reduced NAA/Cr in the older group in the frontal lobe (-22%; p < 0.01), parietal lobe (-28%; p < 0.01) and semiovale (-9%; p < 0.01) compared to the younger group. Cho/Cr was elevated in the semiovale (+35%; p < 0.01) and in the n. lentiformis (+42%; p < 0.01) in the older group. NAA/Cho was reduced in all regions measured, except the thalamus, in the older group compared to the younger group (from -21 to -49%; p < 0.01). 2D-MRSI revealed decreased total NAA (-3.1% per decade; p < 0.01) and NAA/Cr (-3.8% per decade; p < 0.01), increased total Cho (+3.6% per decade; p < 0.01) and Cho/Cr (+4.6% per decade; p < 0.01), increased total myo-inositol (mI, +4.7% per decade; p < 0.01) and mI/Cr (+5.4% per decade; p < 0.01), and decreased NAA/Cho (-8% per decade; p < 0.01) in semiovale WM. Results from single voxel spectroscopy revealed a significantly negative correlation of NAA/Cho with age in the frontal lobe (-13% per decade; p < 0.01) and in the temporal lobe (-7.4% per decade; p < 0.01), as well as increased total Cr (10% per decade; p < 0.01) in the frontal lobe. Other results from single voxel measurements were not significant, but trends were comparable to those from multivoxel spectroscopy. Conclusion: Age-related changes measured with long echo time and short echo time 1H-MRS were comparable and cannot, therefore, be caused by different T2 relaxation times in young and old subjects, as suggested previously.

  12. Computer simulation to predict energy use, greenhouse gas emissions and costs for production of fluid milk using alternative processing methods

    Science.gov (United States)

    Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...

  13. Short educational programs in optical design and engineering

    Science.gov (United States)

    Voznesenskaya, Anna; Romanova, Galina; Bakholdin, Alexey; Tolstoba, Nadezhda; Ezhova, Kseniia

    2016-09-01

    Globalization and diversification of education in optical engineering causes a number of new phenomena in students' learning paths. Many students have an interest in taking courses at other universities, studying in an international environment, and broadening not only professional skills but also social links, seeing the sights as well. Participation in short educational programs (e.g. summer / winter schools, camps etc.) allows students from different universities to learn specific issues in their own or a neighboring field and also earn some ECTS for the transcript of records. ITMO University provides a variety of short educational programs in optical design and engineering oriented toward different background levels, such as: Introduction into optical engineering, Introduction into applied and computer optics, Optical system design, Image modeling and processing, Design of optical devices and components. Depending on students' educational background these programs are revised and adapted each time. Usually the short educational programs last 4 weeks and provide 4 ECTS. The short programs utilize a set of up-to-date educational technologies such as problem-based learning, case studies, and distance learning and evaluation. In practice, these technologies provide flexibility of the educational process and intensive growth of the learning outcomes. Students are very satisfied with these programs. In their feedback they point to a high level of practical significance, experienced teaching staff, the scholarship program, an excellent educational environment, as well as an interesting social program and organizational support.

  14. Probabilistic eruption forecasting at short and long time scales

    Science.gov (United States)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider the onus should be on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: at short-term (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; and at long-term (years to decades), it is the basic component for land use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event and where it sits in a complex multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.
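    As a minimal illustration of the long-term side of this problem, the simplest baseline is a homogeneous Poisson model, in which the probability of at least one eruption in a time window follows directly from an assumed annual rate. The sketch below is a generic textbook model, not a method from the review, and the rate used is illustrative:

```python
import math

def eruption_probability(rate_per_year: float, window_years: float) -> float:
    """P(at least one eruption in the window) under a homogeneous Poisson model."""
    return 1.0 - math.exp(-rate_per_year * window_years)

# Hypothetical volcano erupting on average once every 200 years:
p_week = eruption_probability(1 / 200, 1 / 52)   # short-term: one-week window
p_50yr = eruption_probability(1 / 200, 50)       # long-term: 50-year planning horizon
```

Short-term forecasting in a crisis replaces the constant rate with one conditioned on monitoring data, which is where the real modeling effort lies.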

  15. Short-term Power Load Forecasting Based on Balanced KNN

    Science.gov (United States)

    Lv, Xianlong; Cheng, Xingong; YanShuang; Tang, Yan-mei

    2018-03-01

    To improve the accuracy of load forecasting, a short-term load forecasting model based on a balanced KNN algorithm is proposed. According to the load characteristics, the massive historical power-load data are divided into scenes by the K-means algorithm. In view of unbalanced load scenes, the balanced KNN algorithm is proposed to classify the scenes accurately, and a locally weighted linear regression algorithm is used to fit and predict the load. Adopting the Apache Hadoop programming framework of cloud computing, the proposed model is parallelized and improved to enhance its ability to deal with massive, high-dimensional data. Household electricity consumption data for a residential district were analyzed on a 23-node cloud computing cluster, and experimental results show that the load forecasting accuracy and execution time of the proposed model are better than those of traditional forecasting algorithms.
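    The core idea of a class-balanced KNN vote can be sketched in a few lines: each neighbour's vote is down-weighted by the frequency of its class, so minority scenes are not swamped by majority ones. This is a generic sketch of the balancing idea, not the paper's Hadoop implementation, and the toy data below are invented:

```python
import math
from collections import Counter

def balanced_knn(train, query, k=3):
    """Classify `query` by k nearest neighbours, weighting each vote by the
    inverse frequency of its class so that rare classes still compete."""
    class_counts = Counter(label for _, label in train)
    neighbours = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter()
    for _, label in neighbours:
        votes[label] += 1.0 / class_counts[label]
    return votes.most_common(1)[0][0]

# Three "normal" scenes and one rare "peak" scene (hypothetical features):
train = [((0.0, 0.0), "normal"), ((0.1, 0.0), "normal"),
         ((0.2, 0.1), "normal"), ((5.0, 5.0), "peak")]
```

For a query near the rare scene, e.g. `(4.0, 4.2)`, a plain majority vote with k=3 would return "normal" (two of the three nearest points), while the balanced vote returns "peak".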

  16. Variable dead time counters: 2. A computer simulation

    International Nuclear Information System (INIS)

    Hooton, B.W.; Lees, E.W.

    1980-09-01

    A computer model has been developed to give a pulse train which simulates that generated by a variable dead time counter (VDC) used in safeguards determination of Pu mass. The model is applied to two algorithms generally used for VDC analysis. It is used to determine their limitations at high counting rates and to investigate the effects of random neutrons from (α,n) reactions. Both algorithms are found to be deficient for use with masses of 240Pu greater than 100 g, and one commonly used algorithm is shown, by use of the model and also by theory, to yield a result which is dependent on the random neutron intensity. (author)
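    The basic ingredients of such a simulation — a random pulse train and a fixed (non-paralyzable) dead time — can be sketched as follows. This is a generic illustration, not the paper's model; the rate and dead time used below are illustrative:

```python
import random

def poisson_train(rate, duration, seed=1):
    """Event times of a Poisson process, built from exponential inter-arrival gaps."""
    random.seed(seed)
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t >= duration:
            return times
        times.append(t)

def apply_dead_time(times, tau):
    """Keep only pulses arriving at least tau after the last *accepted* pulse
    (non-paralyzable dead time)."""
    kept, last = [], -float("inf")
    for t in times:
        if t - last >= tau:
            kept.append(t)
            last = t
    return kept
```

For a true rate n and dead time tau, the accepted fraction should approach the textbook value 1/(1 + n·tau), which is the kind of relation the two VDC algorithms invert to recover the true rate.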

  17. Short Term Prediction of PM10 Concentrations Using Seasonal Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Hamid Hazrul Abdul

    2016-01-01

    Full Text Available Air pollution modelling is an important tool commonly used to make short-term and long-term predictions. Since air pollution has a large impact, especially on human health, prediction of air pollutant concentrations is needed to help the local authorities give an early warning to people at risk of acute and chronic health effects from air pollution. Finding the best time series model allows predictions to be made accurately. This research was carried out to find the best time series model to predict the PM10 concentrations in Nilai, Negeri Sembilan, Malaysia. Considering two seasons, the wet season (northeast monsoon) and the dry season (southwest monsoon), seasonal autoregressive integrated moving average models were used to find the most suitable model to predict the PM10 concentrations in Nilai, Negeri Sembilan, using three error measures. Based on AIC statistics, results show that SARIMA(1, 1, 1) × (1, 0, 0)12 is the most suitable model to predict PM10 concentrations in Nilai, Negeri Sembilan.
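    The AIC-based selection step can be illustrated with the standard least-squares form of the criterion, AIC = n·ln(RSS/n) + 2k, which trades fit quality against parameter count. The residual sums of squares and parameter counts below are hypothetical, purely to show how a more complex model can lose despite a smaller RSS:

```python
import math

def aic(rss: float, n: int, k: int) -> float:
    """Akaike information criterion for a least-squares fit:
    n*ln(RSS/n) + 2k, where k counts the estimated parameters."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical fits of two candidate seasonal ARIMA models to the same
# n = 365 daily PM10 observations:
n = 365
candidates = {
    "SARIMA(1,1,1)x(1,0,0)12": aic(rss=1480.0, n=n, k=3),
    "SARIMA(2,1,2)x(1,0,1)12": aic(rss=1465.0, n=n, k=6),
}
best = min(candidates, key=candidates.get)
```

Here the simpler model wins: its slightly larger RSS is more than offset by the 2k penalty on the extra parameters.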

  18. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  19. High rate gamma spectroscopy system for activation analysis of short-lived isomeric transitions

    Energy Technology Data Exchange (ETDEWEB)

    Westphall, G P [Atominstitut der Oesterreichischen Hochschulen, Vienna

    1976-07-15

    A high rate spectroscopy system specially suited for measurement of short-lived isomeric transitions is described, which, as part of a fast activation analysis facility at the TRIGA Mark II reactor, provides for automatic recording and immediate evaluation of gamma spectra taken from nuclides activated at stationary or pulsed reactor power. The system consists of a commercial de-coupled Ge(Li)-detector of 70 cm³ modified for recycling operation for input rates in excess of 500000 c/s ⁶⁰Co, a time variant trapezoidal shaping section and a fast constant dead-time ADC coupled to a programmed multichannel analyzer. Novel circuits for efficient pile-up rejection and time variant base line restoration extend the concept of gated integration up to count rates of more than 200000 c/s ⁶⁰Co. Time-sequenced recording of spectra is performed by a minicomputer operated as a front-end processor of a larger laboratory computer, where final data processing takes place. New concepts for very simple and cost-effective implementation of multichannel analyzers by means of general purpose small computers are described.
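    At these count rates, dead-time losses dominate the throughput budget. The textbook non-paralyzable correction (not a formula from this paper, which uses a more elaborate gated-integration chain) recovers the true input rate n from the observed rate m via n = m / (1 − m·τ):

```python
def true_rate(observed_rate: float, dead_time: float) -> float:
    """Recover the true input rate from the observed rate of a
    non-paralyzable counting chain with fixed dead time per event."""
    loss = 1.0 - observed_rate * dead_time
    if loss <= 0.0:
        raise ValueError("observed rate saturates the assumed dead time")
    return observed_rate / loss
```

For example, 200000 c/s observed with an assumed 1 µs dead time per event implies a true input rate of 250000 c/s (20% of pulses lost).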

  20. Predicting Short-Term Electricity Demand by Combining the Advantages of ARMA and XGBoost in Fog Computing Environment

    Directory of Open Access Journals (Sweden)

    Chuanbin Li

    2018-01-01

    Full Text Available With the rapid development of IoT, the disadvantages of the Cloud framework have been exposed, such as high latency, network congestion, and low reliability. Therefore, the Fog Computing framework has emerged, with an extended Fog Layer between the Cloud and terminals. In order to address real-time prediction of electricity demand, we propose an approach based on XGBoost and ARMA in a Fog Computing environment. Taking advantage of the Fog Computing framework, we first propose a prototype-based clustering algorithm to divide enterprise users into several categories based on their total electricity consumption; we then propose a model selection approach by analyzing users’ historical records of electricity consumption and identifying the most important features. Generally speaking, if the historical records pass the tests of stationarity and white noise, ARMA is used to model the user’s electricity consumption in time sequence; otherwise, if the historical records do not pass the tests, and some discrete features are the most important, such as weather and whether it is a weekend, XGBoost will be used. The experiment results show that our proposed approach combining the advantages of ARMA and XGBoost is more accurate than the classical models.
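    The dispatch rule — ARMA for stationary records, a tree model otherwise — can be sketched as below. The stationarity check here is a deliberately crude surrogate (comparing the two halves of the series), standing in for the formal tests the paper applies; the threshold and helper names are assumptions:

```python
import statistics

def looks_stationary(series, tol=0.2):
    """Crude surrogate for a formal stationarity test: split the series in
    half and require mean and standard deviation to agree within `tol`
    (relative). A real pipeline would use ADF / Ljung-Box tests instead."""
    half = len(series) // 2
    a, b = series[:half], series[half:]
    mean_ok = abs(statistics.mean(a) - statistics.mean(b)) <= tol * (abs(statistics.mean(series)) + 1e-9)
    sd_ok = abs(statistics.stdev(a) - statistics.stdev(b)) <= tol * (statistics.stdev(series) + 1e-9)
    return mean_ok and sd_ok

def choose_model(series):
    """Dispatch rule described in the abstract: ARMA for stationary records,
    a tree-based model (XGBoost) otherwise."""
    return "ARMA" if looks_stationary(series) else "XGBoost"
```

A flat, periodic consumption record is routed to ARMA, while a strongly trending one falls through to the tree model.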

  1. Estimating return periods of extreme values from relatively short time series of winds

    Science.gov (United States)

    Jonasson, Kristjan; Agustsson, Halfdan; Rognvaldsson, Olafur; Arfeuille, Gilles

    2013-04-01

    An important factor for determining the prospect of individual wind farm sites is the frequency of extreme winds at hub height. Here, extreme winds are defined as the value of the highest 10-minute averaged wind speed with a 50-year return period, i.e. an annual exceedance probability of 2% (Rodrigo, 2010). A frequently applied method to estimate winds in the lowest few hundred meters above ground is to extrapolate observed 10-meter winds logarithmically to higher altitudes. A recent study by Drechsel et al. (2012) showed, however, that this methodology is not as accurate as interpolating simulated results from the global ECMWF numerical weather prediction (NWP) model to the desired height. Observations of persistent low level jets near Colima in SW Mexico also show that the logarithmic approach can give highly inaccurate results for some regions (Arfeuille et al., 2012). To address these shortcomings of limited, and/or poorly representative, observations and extrapolations of winds, one can use NWP models to dynamically scale down relatively coarse resolution atmospheric analyses. In the case of limited computing resources one typically has to make a compromise between spatial resolution and the duration of the simulated period, both of which can limit the quality of the wind farm siting. A common method to estimate maximum winds is to fit an extreme value distribution (e.g. Gumbel, GEV or Pareto) to the maximum values of each year of available data, or the tail of these values. If data are only available for a short period, e.g. 10 or 15 years, then this will give a rather inaccurate estimate. It is possible to deal with this problem by utilizing monthly or weekly maxima, but this introduces new problems: seasonal variation, autocorrelation of neighboring values, and increased discrepancy between data and fitted distribution. 
We introduce a new method to estimate return periods of extreme values of winds at hub height from relatively short time series of winds, simulated
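    The conventional annual-maxima approach the abstract describes can be sketched with a method-of-moments Gumbel fit; the 50-year return level is then the quantile with a 2% annual exceedance probability. This is the standard baseline, not the new method of the abstract, and the wind maxima used below are invented:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def gumbel_return_level(annual_maxima, return_period=50.0):
    """Fit a Gumbel distribution to annual maxima by the method of moments
    and return the level with the given return period (for T = 50, the
    2% annual exceedance probability quantile)."""
    sd = statistics.stdev(annual_maxima)
    beta = sd * math.sqrt(6.0) / math.pi                       # scale
    mu = statistics.mean(annual_maxima) - EULER_GAMMA * beta   # location
    p_non_exceed = 1.0 - 1.0 / return_period
    return mu - beta * math.log(-math.log(p_non_exceed))

# Ten hypothetical annual maxima of 10-minute mean wind speed (m/s):
maxima = [24.1, 27.3, 22.8, 30.5, 26.0, 29.2, 25.4, 28.8, 23.9, 31.7]
```

With only 10 values the fitted parameters carry large sampling uncertainty, which is exactly the weakness of short records that the abstract sets out to address.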

  2. Short-time dynamics of lysozyme solutions with competing short-range attraction and long-range repulsion: Experiment and theory

    Science.gov (United States)

    Riest, Jonas; Nägele, Gerhard; Liu, Yun; Wagner, Norman J.; Godfrin, P. Douglas

    2018-02-01

    Recently, atypical static features of microstructural ordering in low-salinity lysozyme protein solutions have been extensively explored experimentally and explained theoretically based on a short-range attractive plus long-range repulsive (SALR) interaction potential. However, the protein dynamics and the relationship to the atypical SALR structure remain to be demonstrated. Here, the applicability of semi-analytic theoretical methods predicting diffusion properties and viscosity in isotropic particle suspensions to low-salinity lysozyme protein solutions is tested. Using the interaction potential parameters previously obtained from static structure factor measurements, our results of Monte Carlo simulations representing seven experimental lysozyme samples indicate that they exist either in dispersed fluid or random percolated states. The self-consistent Zerah-Hansen scheme is used to describe the static structure factor, S(q), which is the input to our calculation schemes for the short-time hydrodynamic function, H(q), and the zero-frequency viscosity η. The schemes account for hydrodynamic interactions included on an approximate level. Theoretical predictions for H(q) as a function of the wavenumber q quantitatively agree with experimental results at small protein concentrations obtained using neutron spin echo measurements. At higher concentrations, qualitative agreement is preserved although the calculated hydrodynamic functions are overestimated. We attribute the differences for higher concentrations and lower temperatures to translational-rotational diffusion coupling induced by the shape and interaction anisotropy of particles and clusters, patchiness of the lysozyme particle surfaces, and the intra-cluster dynamics, features not included in our simple globular particle model. The theoretical results for the solution viscosity, η, are in qualitative agreement with our experimental data even at higher concentrations. We demonstrate that semi

  3. A Statistical and Spectral Model for Representing Noisy Sounds with Short-Time Sinusoids

    Directory of Open Access Journals (Sweden)

    Myriam Desainte-Catherine

    2005-07-01

    Full Text Available We propose an original model for noise analysis, transformation, and synthesis: the CNSS model. Noisy sounds are represented with short-time sinusoids whose frequencies and phases are random variables. This spectral and statistical model represents information about the spectral density of frequencies. This perceptually relevant property is modeled by three mathematical parameters that define the distribution of the frequencies. This model also represents the spectral envelope. The mathematical parameters are defined and the analysis algorithms to extract these parameters from sounds are introduced. Then algorithms for generating sounds from the parameters of the model are presented. Applications of this model include tools for composers, psychoacoustic experiments, and pedagogy.
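    The central synthesis idea — one short-time frame of "noise" as a sum of sinusoids with random frequencies and phases — can be sketched directly. This is a minimal illustration of that idea only, not the CNSS analysis or spectral-envelope machinery; the frame length, partial count, and band below are assumptions:

```python
import math
import random

def noisy_frame(n_samples, n_partials, f_lo, f_hi, sr=44100, seed=0):
    """One short-time frame of noise: an average of sinusoids whose
    frequencies are drawn uniformly from [f_lo, f_hi] and whose phases
    are uniform on [0, 2*pi)."""
    rng = random.Random(seed)
    partials = [(rng.uniform(f_lo, f_hi), rng.uniform(0.0, 2 * math.pi))
                for _ in range(n_partials)]
    return [sum(math.sin(2 * math.pi * f * t / sr + ph) for f, ph in partials) / n_partials
            for t in range(n_samples)]
```

Redrawing the frequencies frame by frame from the modeled distribution, and shaping their amplitudes by the spectral envelope, is what turns this kernel into a full synthesis scheme.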

  4. Rotor-System Log-Decrement Identification Using Short-Time Fourier-Transform Filter

    Directory of Open Access Journals (Sweden)

    Qihang Li

    2015-01-01

    Full Text Available With the increase of centrifugal compressor capability, such as large-scale LNG and CO2 reinjection, stability margin evaluation is crucial to assure that the compressor works in the designed operating conditions in the field. Improving the precision of parameter identification of stability is essential and necessary as well. Based on the time-varying characteristics of the response vibration during the sine-swept process, a short-time Fourier transform (STFT) filter was introduced to increase the signal-to-noise ratio and improve the accuracy of the estimated stability parameters. A finite element model was established to simulate the sine-swept process, and the simulated vibration signals were used to study the filtering effect and demonstrate the feasibility of identifying the stability parameters using a Multiple-Input and Multiple-Output system identification method that combines the prediction error method and the instrumental variable method. Simulation results show that the identification method with the STFT filter markedly improves the estimation accuracy and makes the curves of the frequency response function clearer. An experiment was carried out on a test rig as well, indicating that the identification method is feasible for stability identification, and the experimental results show that the STFT filter works very well.
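    A bare-bones magnitude STFT — the building block behind such a filter — can be written with a Hann window and a direct DFT. This is a generic sketch (the paper's filter design is not reproduced); frame length and hop are arbitrary choices, and a real implementation would use an FFT:

```python
import cmath
import math

def stft(signal, frame_len=64, hop=32):
    """Magnitude STFT with a Hann window and a direct DFT
    (O(N^2) per frame; fine for a sketch)."""
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
              for n in range(frame_len)]
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        seg = [signal[start + n] * window[n] for n in range(frame_len)]
        spectrum = [abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                            for n in range(frame_len)))
                    for k in range(frame_len // 2)]
        frames.append(spectrum)
    return frames
```

During a sine sweep, only a narrow band of bins around the instantaneous excitation frequency carries signal; zeroing the remaining bins before reconstruction is what raises the signal-to-noise ratio.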

  5. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
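    The scatter/aggregate pattern underlying the cloud deployment can be illustrated with a toy Monte Carlo kernel split across a worker pool. This is only a shape-of-the-idea sketch: threads and a π estimator stand in for the MPI-coordinated EGS5 workers of the paper:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def mc_hits(n, seed):
    """One worker: count random points falling inside the unit quarter circle."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

def parallel_pi(total_samples=200_000, workers=4):
    """Master: scatter the workload across independent workers with distinct
    seeds, then aggregate the tallies -- the same master/worker pattern the
    cloud deployment uses on virtual nodes."""
    per_worker = total_samples // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        hits = sum(pool.map(mc_hits, [per_worker] * workers, range(workers)))
    return 4.0 * hits / (per_worker * workers)
```

Because the histories are independent, the wall-clock time of the real simulation scales inversely with the node count until the fixed setup overhead dominates, which matches the reported 47× speed-up on 100 nodes.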

  6. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  7. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  8. PIXAN: the Lucas Heights PIXE analysis computer package

    International Nuclear Information System (INIS)

    Clayton, E.

    1986-11-01

    To fully utilise the multielement capability and short measurement time of PIXE it is desirable to have an automated computer evaluation of the measured spectra. Because of the complex nature of PIXE spectra, a critical step in the analysis is the data reduction, in which the areas of characteristic peaks in the spectrum are evaluated. In this package the computer program BATTY is presented for such an analysis. The second step is to determine element concentrations, knowing the characteristic peak areas in the spectrum. This requires a knowledge of the expected X-ray yield for that element in the sample. The computer program THICK provides that information for both thick and thin PIXE samples. Together, these programs form the package PIXAN used at Lucas Heights for PIXE analysis
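    The data-reduction step — evaluating the area of a characteristic peak — can be illustrated with the simplest net-area estimator: gross counts in the peak region minus a linear background interpolated from side channels. This is a textbook sketch, not the algorithm used by BATTY; the channel ranges and spectrum below are invented:

```python
def net_peak_area(spectrum, lo, hi, bg_width=3):
    """Net characteristic-peak area: gross counts in channels [lo, hi]
    minus a flat background estimated from `bg_width` channels on each side."""
    gross = sum(spectrum[lo:hi + 1])
    left = sum(spectrum[lo - bg_width:lo]) / bg_width
    right = sum(spectrum[hi + 1:hi + 1 + bg_width]) / bg_width
    background = (left + right) / 2.0 * (hi - lo + 1)
    return gross - background
```

Converting the resulting peak area into an element concentration then requires the expected X-ray yield for that element and sample thickness, which is the role the THICK program plays in the package.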

  9. Television Viewing, Computer Use, Time Driving and All‐Cause Mortality: The SUN Cohort

    Science.gov (United States)

    Basterra‐Gortari, Francisco Javier; Bes‐Rastrollo, Maira; Gea, Alfredo; Núñez‐Córdoba, Jorge María; Toledo, Estefanía; Martínez‐González, Miguel Ángel

    2014-01-01

    Background Sedentary behaviors have been directly associated with all‐cause mortality. However, little is known about different types of sedentary behaviors in relation to overall mortality. Our objective was to assess the association between different sedentary behaviors and all‐cause mortality. Methods and Results In this prospective, dynamic cohort study (the SUN Project) 13 284 Spanish university graduates with a mean age of 37 years were followed up for a median of 8.2 years. Television, computer, and driving time were assessed at baseline. Poisson regression models were fitted to examine the association between each sedentary behavior and total mortality. All‐cause mortality incidence rate ratios (IRRs) per 2 hours per day were 1.40 (95% confidence interval (CI): 1.06 to 1.84) for television viewing, 0.96 (95% CI: 0.79 to 1.18) for computer use, and 1.14 (95% CI: 0.90 to 1.44) for driving, after adjustment for age, sex, smoking status, total energy intake, Mediterranean diet adherence, body mass index, and physical activity. The risk of mortality was twofold higher for participants reporting ≥3 h/day of television viewing than for those reporting less. Conclusions Television viewing was directly associated with all‐cause mortality. However, computer use and time spent driving were not significantly associated with higher mortality. Further cohort studies and trials designed to assess whether reductions in television viewing are able to reduce mortality are warranted. The lack of association between computer use or time spent driving and mortality needs further confirmation. PMID:24965030

  10. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the real-time, reliability and safety requirements of aerospace experiments, a single-center cloud computing application verification platform is constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads. For I/O-intensive workloads, it is recommended to use traditional physical machines.

  11. Analysis of genetic polymorphism of nine short tandem repeat loci in ...

    African Journals Online (AJOL)

    Yomi

    2012-03-15

    Mar 15, 2012 ... Key words: short tandem repeat, repeat motif, genetic polymorphism, Han population, forensic genetics. INTRODUCTION. Short tandem repeat (STR) is widely .... Data analysis. The exact test of Hardy-Weinberg equilibrium was conducted with. Arlequin version 3.5 software (Computational and Molecular.

  12. 21 CFR 10.20 - Submission of documents to Division of Dockets Management; computation of time; availability for...

    Science.gov (United States)

    2010-04-01

    ... Management; computation of time; availability for public disclosure. 10.20 Section 10.20 Food and Drugs FOOD... Management; computation of time; availability for public disclosure. (a) A submission to the Division of Dockets Management of a petition, comment, objection, notice, compilation of information, or any other...

  13. On the Option Effects of Short-Time Work Arrangements

    NARCIS (Netherlands)

    Huisman, Kuno; Thijssen, J.J.J.

    2018-01-01

    We analyse the short term work (STW) regulations that several OECD countries introduced after the 2007 financial crisis. We view these measures as a collection of real options and study the dynamic effect of STW on the endogenous liquidation decision of the firm. While STW delays a firm’s

  14. 25 CFR 26.30 - Does the Job Training Program provide part-time training or short-term training?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Does the Job Training Program provide part-time training or short-term training? 26.30 Section 26.30 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR HUMAN SERVICES JOB PLACEMENT AND TRAINING PROGRAM Training Services § 26.30 Does the Job Training...

  15. Construction of renormalized coefficient functions of the Feynman diagrams by means of a computer

    International Nuclear Information System (INIS)

    Tarasov, O.V.

    1978-01-01

    An algorithm and a short description of a computer program, written in SCHOONSCHIP, are given. The program is designed for constructing the integrands of renormalized coefficient functions of Feynman diagrams in scalar theories for an arbitrary subtraction point. For a given Feynman graph the computer fully realizes the R-operation of Bogolubov-Parasjuk and gives the result as an integral over Feynman parameters. With the help of the program, the construction time of the whole renormalized coefficient function is approximately 30 s on the CDC-6500 computer.

  16. High-Capacity Short-Range Optical Communication Links

    DEFF Research Database (Denmark)

    Tatarczak, Anna

    Over the last decade, we have observed a tremendous spread of end-user mobile devices. The user base of a mobile application can grow or shrink by millions per day. This situation creates a pressing need for highly scalable server infrastructure; a need nowadays satisfied through cloud computing...... offered by data centers. As the popularity of cloud computing soars, the demand for high-speed, short-range data center links grows. Vertical cavity surface emitting lasers (VCSEL) and multimode fibers (MMF) prove especially well-suited for such scenarios. VCSELs have high modulation bandwidths......, we achieve 10 Gbps over 400 m and then confirm the approach in an optimized system at 25 Gbps over 300 m. The techniques described in this thesis leverage additional degrees of freedom to better utilize the available resources of short-range links. The proposed schemes enable higher speeds and longer

  17. High-Temperature Short-Time Pasteurization System for Donor Milk in a Human Milk Bank Setting

    OpenAIRE

    Diana Escuder-Vieco; Irene Espinosa-Martos; Juan M. Rodríguez; Nieves Corzo; Antonia Montilla; Pablo Siegfried; Carmen R. Pallás-Alonso; Carmen R. Pallás-Alonso; Leónides Fernández

    2018-01-01

    Donor milk is the best alternative for the feeding of preterm newborns when mother's own milk is unavailable. For safety reasons, it is usually pasteurized by the Holder method (62.5°C for 30 min). Holder pasteurization results in a microbiological safe product but impairs the activity of many biologically active compounds such as immunoglobulins, enzymes, cytokines, growth factors, hormones or oxidative stress markers. High-temperature short-time (HTST) pasteurization has been proposed as an...

  18. Shorts due to diagnostic leads

    International Nuclear Information System (INIS)

    Ellis, J.F.; Lubell, M.S.; Pillsbury, R.D.; Shen, S.S.; Thome, R.J.; Walstrom, P.L.

    1985-01-01

    The superconducting toroidal field coils being tested in the Large Coil Test Facility (LCTF) are heavily instrumented. In the General Electric coil, a lead wire of an internal sensor became shorted across an estimated three or four turns of the pancake winding. This short occurred during the final stages of the winding fabrication and was not accessible for repair. Resistance, voltage gradient, and transient voltage decay measurements were performed to characterize the short and the magnetic damping of the large steel bobbin and outer structural ring. The 32-gage wire causing the short was estimated to be about 10 cm long, with a resistance of 55 mΩ. As a safety measure, we decided to burn out the shorted wire at room temperature before installing the coil in LCTF. Tests were made to determine the energy needed to vaporize a small wire. Computer calculations indicated that within the voltage limits set for the coil, it was not feasible to burn out the wire by rapidly dumping the coil from a low-current dc charge-up. We accomplished the burnout by applying 800 V at 3.25 A, and 60 Hz for about 1 s. Transient voltage decay measurements made after the burnout, compared with those made before the attempt, confirmed that the short had indeed been opened.
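    A back-of-envelope check makes the burnout energy budget plausible. The material constants, wire diameter, and temperature rise below are textbook values for copper assumed for illustration (the paper does not list them), and the calculation deliberately ignores heat conduction into the winding and where in the circuit the power is actually dissipated:

```python
# Rough constants for copper (assumed, not from the paper):
SPECIFIC_HEAT = 385.0        # J/(kg*K)
HEAT_FUSION = 2.05e5         # J/kg
HEAT_VAPORIZATION = 4.73e6   # J/kg
DENSITY = 8960.0             # kg/m^3
PI = 3.141592653589793

def wire_vaporization_energy(length_m, diameter_m, delta_t=2500.0):
    """Very rough energy to heat, melt and vaporize a copper wire."""
    area = PI * (diameter_m / 2.0) ** 2
    mass = DENSITY * area * length_m
    return mass * (SPECIFIC_HEAT * delta_t + HEAT_FUSION + HEAT_VAPORIZATION)

# ~10 cm of 32-gage wire (~0.20 mm diameter), vs. the applied 800 V * 3.25 A * 1 s:
needed = wire_vaporization_energy(0.10, 0.20e-3)   # order of 10^2 J
delivered = 800.0 * 3.25 * 1.0                     # 2600 J
```

On these assumptions the delivered energy exceeds the vaporization requirement by an order of magnitude, consistent with the reported success of the burnout.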

  19. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1990-01-01

    At the first workshop in 1985 we reported on the real-time dose computing method used at the Paks Nuclear Power Plant and on the telemetric system developed for the normalization of the computed data. At present, the computing method normalized to the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. In this connection we analyzed the reliability of the results obtained in this manner. The points of the analysis were: how the results are influenced by the choice of certain parameters that cannot be determined by direct methods, and how improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (131I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. A further source of errors may be that, when determining the level of gamma radiation, the radionuclide doses in the cloud and on the ground surface are measured together by the environmental monitoring stations, whereas these doses appear separately in the computations. At the Paks NPP it is the time integral of the airborne activity concentration of vapour-form 131I which is determined. This quantity includes neither the other physical and chemical forms of 131I nor the other isotopes of radioiodine. We gave numerical examples of the uncertainties due to the above factors. As a result, we arrived at the conclusion that, when accident-related measures are decided on the basis of the computing method, the dose uncertainties may reach one order of magnitude for points lying far from the monitoring stations. Different measures to make these uncertainties significantly lower are discussed.

  20. [Possibilities in the differential diagnosis of brain neoplasms using the long and short time sequences of proton magnetic resonance spectroscopy]

    NARCIS (Netherlands)

    Gajewicz, W.; Goraj, B.M.

    2004-01-01

    Currently, proton magnetic resonance spectroscopy (1H MRS) with the single voxel spectroscopy (SVS) technique uses long and/or short echo time sequences in order to provide complementary information. PURPOSE: The aim of the study was to compare the usefulness of STEAM (echo time, TE, 20

  1. Pricing decision model for new and remanufactured short-life cycle products with time-dependent demand

    Directory of Open Access Journals (Sweden)

    Shu San Gan

    2015-12-01

    Full Text Available In this study we develop a model that optimizes the prices for new and remanufactured short life-cycle products where demands are time-dependent and price-sensitive. While there have been very few published works that attempt to model remanufacturing decisions for products with short life cycles, we believe that there are many situations where remanufacturing short life-cycle products is rewarding economically as well as environmentally. The system that we model consists of a retailer, a manufacturer, and a collector of used product from the end customers. Two different scenarios are evaluated for the system. The first is the independent situation where each party attempts to maximize his/her own total profit, and the second is the joint-profit model where we optimize the combined total profit for all three members of the supply chain. The manufacturer acts as the Stackelberg leader in the independently optimized scenario, while in the other the intermediate prices are determined by a coordinated pricing policy. The results suggest that (i) reducing the price of new products during the decline phase does not give better profit for the whole system, (ii) the total profit obtained from optimizing each player separately is lower than the total profit of the integrated model, and (iii) the speed of change in demand influences the robustness of the prices as well as the total profit gained.
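
    The joint-profit scenario can be illustrated with a minimal sketch: a grid search over the two prices under a hypothetical linear, time-decaying demand. All parameters (demand intercepts, slopes, costs, decline factor) are invented for illustration and are not taken from the paper.

```python
# Hedged sketch of the integrated (joint-profit) pricing decision:
# demand D(p, t) = (a - b*p) * decline**t is an illustrative stand-in
# for the paper's time-dependent, price-sensitive demand model.

def demand(price, t, a=100.0, b=2.0, decline=0.9):
    return max(0.0, (a - b * price) * (decline ** t))

def total_profit(p_new, p_reman, horizon=5, c_new=10.0, c_reman=6.0):
    profit = 0.0
    for t in range(horizon):
        profit += (p_new - c_new) * demand(p_new, t)            # new product
        profit += (p_reman - c_reman) * demand(p_reman, t, a=60.0)  # reman
    return profit

# Exhaustive search over a coarse integer price grid.
best = max(
    ((pn, pr) for pn in range(11, 50) for pr in range(7, 40)),
    key=lambda pp: total_profit(*pp),
)
# With these linear demands the optima are the usual monopoly midpoints:
# p_new = (10 + 50) / 2 = 30 and p_reman = (6 + 30) / 2 = 18.
```

    For richer models (Stackelberg leadership, collection rates), the same grid-search skeleton can be nested, with the leader anticipating the followers' best responses.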

  2. An Efficient Integer Coding and Computing Method for Multiscale Time Segment

    Directory of Open Access Journals (Sweden)

    TONG Xiaochong

    2016-12-01

    Full Text Available This article focuses on the problems and status of current time-segment coding and proposes a new approach: multi-scale time segment integer coding (MTSIC). The approach uses the tree structure and size ordering inherent in the integers to reflect the relationships among multi-scale time segments (order, inclusion/containment, intersection, etc.), and thereby achieves a unified integer coding for multi-scale time. On this foundation, the research also studies computing methods for the time relationships of MTSIC, to support efficient segment-based calculation and query, and briefly discusses application scenarios and prospects of MTSIC. Tests indicated that MTSIC is convenient and reliable to implement, converts easily to and from traditional representations, and achieves very high efficiency in query and computation.
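
    One plausible reconstruction of such a scheme (hypothetical; the paper's exact MTSIC definition may differ) maps a segment at scale level `lvl` and position `pos` to the node index of a complete binary tree, so that containment becomes an ancestor test on integers:

```python
# Hypothetical sketch of multi-scale time segment integer coding:
# level 0 is the whole span; each level halves the segment duration.

def encode(lvl: int, pos: int) -> int:
    """Map (scale level, position within level) to a unique integer code."""
    assert 0 <= pos < 2 ** lvl
    return (1 << lvl) + pos           # heap-style binary tree index

def decode(code: int):
    lvl = code.bit_length() - 1
    return lvl, code - (1 << lvl)

def contains(a: int, b: int) -> bool:
    """True if segment a contains segment b (a is an ancestor of b)."""
    while b > a:
        b >>= 1                       # step up to the parent segment
    return a == b

# Whole span [0,1) -> 1; first half [0,0.5) -> 2; quarter [0.25,0.5) -> 5.
```

    Order and intersection tests can be built the same way by comparing codes after lifting both segments to a common scale level.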

  3. Simulation Studies of Diffusion-Release and Effusive-Flow of Short-Lived Radioactive Isotopes

    CERN Document Server

    Zhang, Yan; Kawai, Yoko

    2005-01-01

    Delay times associated with diffusion release from targets and effusive-flow transport of radioactive isotopes to ion sources are principal intensity limiters at ISOL-based radioactive ion beam facilities, and simulation studies with computer models are cost effective methods for designing targets and vapor transport systems with minimum delay times to avoid excessive decay losses of short lived ion species. A finite difference code, Diffuse II, was recently developed at the Oak Ridge National Laboratory to study diffusion-release of short-lived species from three principal target geometries. Simulation results are in close agreement with analytical solutions to Fick’s second equation. Complementary to the development of Diffuse II, the Monte-Carlo code, Effusion, was developed to address issues related to the design of fast vapor transport systems. Results, derived by using Effusion, are also found to closely agree with experimental measurements. In this presentation, the codes will be used in conc...

  4. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

    Analyzing the stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can also be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA-based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)
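
    The key ingredient, rigorously bounding the range of a function with interval arithmetic, can be shown in a deliberately tiny sketch (far simpler than the interval extensions used for PIE): evaluating a polynomial with intervals yields a guaranteed, if possibly loose, enclosure of its range.

```python
# Minimal interval-arithmetic sketch: every operation propagates lower and
# upper bounds, so the final interval is guaranteed to contain the true range.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        o = _coerce(other)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__

    def __mul__(self, other):
        o = _coerce(other)
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    __rmul__ = __mul__

def _coerce(x):
    return x if isinstance(x, Interval) else Interval(x, x)

def f(x):
    # f(x) = x^2 - 2x + 1 = (x - 1)^2; true range on [0, 1] is [0, 1]
    return x * x + (-2.0) * x + 1.0

enc = f(Interval(0.0, 1.0))
# enc is a rigorous enclosure: enc.lo <= min f and max f <= enc.hi on [0, 1].
```

    The enclosure here is [-1, 2], looser than the true range [0, 1] because naive interval evaluation ignores correlations between occurrences of `x`; Taylor-model methods of the kind extended in the thesis tighten exactly this.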

  5. Design considerations for computationally constrained two-way real-time video communication

    Science.gov (United States)

    Bivolarski, Lazar M.; Saunders, Steven E.; Ralston, John D.

    2009-08-01

    Today's video codecs have evolved primarily to meet the requirements of the motion picture and broadcast industries, where high-complexity studio encoding can be utilized to create highly-compressed master copies that are then broadcast one-way for playback using less-expensive, lower-complexity consumer devices for decoding and playback. Related standards activities have largely ignored the computational complexity and bandwidth constraints of wireless or Internet based real-time video communications using devices such as cell phones or webcams. Telecommunications industry efforts to develop and standardize video codecs for applications such as video telephony and video conferencing have not yielded image size, quality, and frame-rate performance that match today's consumer expectations and market requirements for Internet and mobile video services. This paper reviews the constraints and the corresponding video codec requirements imposed by real-time, 2-way mobile video applications. Several promising elements of a new mobile video codec architecture are identified, and more comprehensive computational complexity metrics and video quality metrics are proposed in order to support the design, testing, and standardization of these new mobile video codecs.

  6. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.

    Science.gov (United States)

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; Toledano, Doroteo T; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3 s). In this contribution we present an open-source, end-to-end LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3 s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made those former results hardly reproducible. Further, we extend those previous experiments by modeling unseen languages (out-of-set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we also analyze the effect of even more limited test data (from 2.25 s to 0.1 s), proving that with as little as 0.5 s an accuracy of over 50% can be achieved.

  7. Highly efficient indoor air purification using adsorption-enhanced-photocatalysis-based microporous TiO2 at short residence time.

    Science.gov (United States)

    Lv, Jinze; Zhu, Lizhong

    2013-01-01

    A short residence time is a key design parameter for the removal of organic pollutants in catalyst-based indoor air purification systems. In this study, we synthesized a series of TiO2 with different micropore volumes and studied their removal efficiency for indoor carbonyl pollutants at a short residence time. Our results indicated that the superior adsorption capability of TiO2 with micropores improved its performance in the photocatalytic degradation of cyclohexanone, while the photocatalytic removal of the pollutant successfully kept porous TiO2 from becoming saturated. When treated with 1 mg m(-3) cyclohexanone at a relative humidity of 18%, the adsorption amount on microporous TiO2 was 5.4-7.9 times higher than that on P25. Removal efficiency via photocatalysis followed the same order as the adsorption amount: TiO2-5 > TiO2-20 > TiO2-60 > TiO2-180 > P25. The advantage of microporous TiO2 over P25 became more pronounced when the residence time declined from 0.072 to 0.036 s. Moreover, as the concentration of cyclohexanone decreased from 1000 ppb to 500 ppb, removal efficiency by microporous TiO2 increased more rapidly than that by P25.

  8. Limitations of the time slide method of background estimation

    International Nuclear Information System (INIS)

    Was, Michal; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Robinet, Florent; Vavoulidis, Miltiadis

    2010-01-01

    Time shifting the output of gravitational wave detectors operating in coincidence is a convenient way of estimating the background in a search for short-duration signals. In this paper, we show how non-stationary data affect the background estimation precision. We present a method of measuring the fluctuations of the data and computing its effects on a coincident search. In particular, we show that for fluctuations of moderate amplitude, time slides larger than the fluctuation time scales can be used. We also recall how the false alarm variance saturates with the number of time shifts.
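
    The slide-and-recount procedure can be sketched with a toy model: circularly shifting one detector's event list by an unphysical lag destroys real coincidences, so the surviving count estimates the accidental background. All rates, window sizes and lags below are illustrative, not taken from an actual detector analysis.

```python
# Toy time-slide background estimate for a two-detector coincidence search.
import bisect
import random

random.seed(0)
T = 1000.0        # observation time (s)
W = 0.5           # coincidence window (s)

det1 = sorted(random.uniform(0, T) for _ in range(200))
det2 = sorted(random.uniform(0, T) for _ in range(200))

def coincidences(a, b, w):
    """Count events of a that have at least one partner in b within +/- w."""
    b = sorted(b)
    return sum(
        bisect.bisect_right(b, t + w) > bisect.bisect_left(b, t - w)
        for t in a
    )

def slide(events, lag, period):
    """Circularly shift an event list by `lag` (modulo the observation time)."""
    return sorted((t + lag) % period for t in events)

# Average the accidental count over many slides with lags >> W.
background = [coincidences(det1, slide(det2, lag, T), W)
              for lag in range(10, 500, 10)]
mean_bg = sum(background) / len(background)
```

    The paper's point about non-stationarity can be explored in the same sketch by making the event rate time-dependent: slides shorter than the fluctuation time scale then bias `mean_bg`.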

  9. Limitations of the time slide method of background estimation

    Energy Technology Data Exchange (ETDEWEB)

    Was, Michal; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Robinet, Florent; Vavoulidis, Miltiadis, E-mail: mwas@lal.in2p3.f [LAL, Universite Paris-Sud, CNRS/IN2P3, Orsay (France)

    2010-10-07

    Time shifting the output of gravitational wave detectors operating in coincidence is a convenient way of estimating the background in a search for short-duration signals. In this paper, we show how non-stationary data affect the background estimation precision. We present a method of measuring the fluctuations of the data and computing its effects on a coincident search. In particular, we show that for fluctuations of moderate amplitude, time slides larger than the fluctuation time scales can be used. We also recall how the false alarm variance saturates with the number of time shifts.

  10. Child-Computer Interaction: ICMI 2012 special session

    NARCIS (Netherlands)

    Nijholt, Antinus; Morency, L.P.; Bohus, L.; Aghajan, H.; Nijholt, Antinus; Cassell, J.; Epps, J.

    2012-01-01

    This is a short introduction to the special session on child-computer interaction at the International Conference on Multimodal Interaction 2012 (ICMI 2012). In human-computer interaction, users have become participants in the design process. This is no different for child-computer interaction.

  11. Computer-games for gravitational wave science outreach: Black Hole Pong and Space Time Quest

    International Nuclear Information System (INIS)

    Carbone, L; Bond, C; Brown, D; Brückner, F; Grover, K; Lodhia, D; Mingarelli, C M F; Fulda, P; Smith, R J E; Unwin, R; Vecchio, A; Wang, M; Whalley, L; Freise, A

    2012-01-01

    We have established a program aimed at developing computer applications and web applets to be used for educational purposes as well as gravitational wave outreach activities. These applications and applets teach gravitational wave physics and technology. The computer programs are generated in collaboration with undergraduates and summer students as part of our teaching activities, and are freely distributed on a dedicated website. As part of this program, we have developed two computer-games related to gravitational wave science: 'Black Hole Pong' and 'Space Time Quest'. In this article we present an overview of our computer related outreach activities and discuss the games and their educational aspects, and report on some positive feedback received.

  12. Computing and Visualizing Reachable Volumes for Maneuvering Satellites

    Science.gov (United States)

    Jiang, M.; de Vries, W.; Pertica, A.; Olivier, S.

    2011-09-01

    Detecting and predicting maneuvering satellites is an important problem for Space Situational Awareness. The spatial envelope of all possible locations within reach of such a maneuvering satellite is known as the Reachable Volume (RV). As soon as custody of a satellite is lost, calculating the RV and its subsequent time evolution is a critical component in the rapid recovery of the satellite. In this paper, we present a Monte Carlo approach to computing the RV for a given object. Essentially, our approach samples all possible trajectories by randomizing thrust-vectors, thrust magnitudes and time of burn. At any given instance, the distribution of the "point-cloud" of the virtual particles defines the RV. For short orbital time-scales, the temporal evolution of the point-cloud can result in complex, multi-reentrant manifolds. Visualization plays an important role in gaining insight and understanding into this complex and evolving manifold. In the second part of this paper, we focus on how to effectively visualize the large number of virtual trajectories and the computed RV. We present a real-time out-of-core rendering technique for visualizing the large number of virtual trajectories. We also examine different techniques for visualizing the computed volume of probability density distribution, including volume slicing, convex hull and isosurfacing. We compare and contrast these techniques in terms of computational cost and visualization effectiveness, and describe the main implementation issues encountered during our development process. Finally, we will present some of the results from our end-to-end system for computing and visualizing RVs using examples of maneuvering satellites.
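
    The sampling idea behind the Monte Carlo point cloud can be illustrated with a deliberately simplified sketch that ignores orbital dynamics (straight-line coasting in 2-D only); the thrust and time parameters are invented for illustration, not representative of a real satellite.

```python
# Toy Monte Carlo "reachable volume": randomize burn time, thrust direction
# and delta-v magnitude, then record the resulting displacement point cloud.
import math
import random

random.seed(1)

DV_MAX = 0.05      # illustrative max delta-v (km/s)
T_TOTAL = 100.0    # illustrative propagation time (s)

def sample_point():
    t_burn = random.uniform(0.0, T_TOTAL)          # randomized time of burn
    theta = random.uniform(0.0, 2.0 * math.pi)     # randomized thrust vector
    dv = random.uniform(0.0, DV_MAX)               # randomized magnitude
    coast = T_TOTAL - t_burn                       # coasting after the burn
    return (dv * math.cos(theta) * coast, dv * math.sin(theta) * coast)

cloud = [sample_point() for _ in range(5000)]
# In this linear toy model every displacement is bounded by DV_MAX * T_TOTAL.
```

    With real two-body dynamics the propagation step is replaced by an orbit propagator, and the cloud develops the multi-reentrant structure described above; the sampling loop itself is unchanged.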

  13. Computing and Visualizing Reachable Volumes for Maneuvering Satellites

    International Nuclear Information System (INIS)

    Jiang, M.; de Vries, W.H.; Pertica, A.J.; Olivier, S.S.

    2011-01-01

    Detecting and predicting maneuvering satellites is an important problem for Space Situational Awareness. The spatial envelope of all possible locations within reach of such a maneuvering satellite is known as the Reachable Volume (RV). As soon as custody of a satellite is lost, calculating the RV and its subsequent time evolution is a critical component in the rapid recovery of the satellite. In this paper, we present a Monte Carlo approach to computing the RV for a given object. Essentially, our approach samples all possible trajectories by randomizing thrust-vectors, thrust magnitudes and time of burn. At any given instance, the distribution of the 'point-cloud' of the virtual particles defines the RV. For short orbital time-scales, the temporal evolution of the point-cloud can result in complex, multi-reentrant manifolds. Visualization plays an important role in gaining insight and understanding into this complex and evolving manifold. In the second part of this paper, we focus on how to effectively visualize the large number of virtual trajectories and the computed RV. We present a real-time out-of-core rendering technique for visualizing the large number of virtual trajectories. We also examine different techniques for visualizing the computed volume of probability density distribution, including volume slicing, convex hull and isosurfacing. We compare and contrast these techniques in terms of computational cost and visualization effectiveness, and describe the main implementation issues encountered during our development process. Finally, we will present some of the results from our end-to-end system for computing and visualizing RVs using examples of maneuvering satellites.

  14. Double peak-induced distance error in short-time-Fourier-transform-Brillouin optical time domain reflectometers event detection and the recovery method.

    Science.gov (United States)

    Yu, Yifei; Luo, Linqing; Li, Bo; Guo, Linfeng; Yan, Jize; Soga, Kenichi

    2015-10-01

    The measured distance error caused by double peaks in the BOTDRs (Brillouin optical time domain reflectometers) system is a kind of Brillouin scattering spectrum (BSS) deformation, discussed and simulated for the first time in the paper, to the best of the authors' knowledge. Double peak, as a kind of Brillouin spectrum deformation, is important in the enhancement of spatial resolution, measurement accuracy, and crack detection. Due to the variances of the peak powers of the BSS along the fiber, the measured starting point of a step-shape frequency transition region is shifted and results in distance errors. Zero-padded short-time-Fourier-transform (STFT) can restore the transition-induced double peaks in the asymmetric and deformed BSS, thus offering more accurate and quicker measurements than the conventional Lorentz-fitting method. The recovering method based on the double-peak detection and corresponding BSS deformation can be applied to calculate the real starting point, which can improve the distance accuracy of the STFT-based BOTDR system.
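
    The role of zero padding can be isolated in a minimal sketch: padding a short record before the DFT evaluates the spectrum on a finer frequency grid, which sharpens peak localization. This is only an illustrative stand-in for the zero-padded STFT of the paper (a single tone, no Brillouin spectrum model; the sample rate and tone frequency are invented).

```python
# Zero-padded DFT peak localization on a short record.
import cmath
import math

fs, n = 100.0, 32                 # sample rate and record length (illustrative)
f0 = 12.3                         # a tone lying between the coarse DFT bins
x = [cmath.exp(2j * math.pi * f0 * k / fs) for k in range(n)]

def peak_freq(sig, pad_to):
    """Frequency of the largest DFT bin after zero-padding to `pad_to`."""
    sig = list(sig) + [0.0] * (pad_to - len(sig))
    m = len(sig)
    mags = [abs(sum(sig[k] * cmath.exp(-2j * math.pi * j * k / m)
                    for k in range(m)))
            for j in range(m)]
    return mags.index(max(mags)) * fs / m

coarse = peak_freq(x, 32)     # bin spacing fs/32  ~ 3.13 Hz
fine = peak_freq(x, 256)      # bin spacing fs/256 ~ 0.39 Hz
```

    Zero padding interpolates the existing spectrum rather than adding information, but that interpolation is exactly what lets the deformed double-peak structure be resolved and the true transition point recovered.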

  15. Kajian dan Implementasi Real TIME Operating System pada Single Board Computer Berbasis Arm

    OpenAIRE

    A, Wiedjaja; M, Handi; L, Jonathan; Christian, Benyamin; Kristofel, Luis

    2014-01-01

    An operating system is an important piece of software in a computer system. For personal and office use, a general-purpose operating system is sufficient. However, mission-critical applications such as nuclear power plants and braking systems in cars (auto braking systems), which need a high level of reliability, require an operating system that operates in real time. The study aims to assess the implementation of a Linux-based real-time operating system on an ARM-based Single Board Computer (SBC), namely the Pandaboard ES with ...

  16. Time-frequency analysis of pediatric murmurs

    Science.gov (United States)

    Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid

    1998-05-01

    Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart. Echocardiography, Ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large, and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing have yielded new technologies that, when applied to heart sounds, can provide low-cost screening for pathologic conditions. The short duration and transient nature of these signals require processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals from hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.

  17. Deterministic computation of functional integrals

    International Nuclear Information System (INIS)

    Lobanov, Yu.Yu.

    1995-09-01

    A new method of numerical integration in functional spaces is described. This method is based on the rigorous definition of a functional integral in complete separable metric space and on the use of approximation formulas which we constructed for this kind of integral. The method is applicable to the solution of some partial differential equations and to the calculation of various characteristics in quantum physics. No preliminary discretization of space and time is required in this method, and no simplifying assumptions such as semi-classical or mean field approximations, collective excitations, or the introduction of ''short-time'' propagators are necessary in our approach. The constructed approximation formulas satisfy the condition of being exact on a given class of functionals, namely polynomial functionals of a given degree. The employment of these formulas replaces the evaluation of a functional integral by the computation of an ''ordinary'' (Riemannian) integral of low dimension, thus allowing the use of the more preferable deterministic algorithms (normally Gaussian quadratures) in computations rather than the traditional stochastic (Monte Carlo) methods which are commonly used for the problem under consideration. The results of applying the method to the computation of the Green function of the Schroedinger equation in imaginary time, as well as the study of some models of Euclidean quantum mechanics, are presented. The comparison with results of other authors shows that our method gives significant (by an order of magnitude) economy of computer time and memory versus other known methods while providing the results with the same or better accuracy. The functional measure of the Gaussian type is considered and some of its particular cases, namely the conditional Wiener measure in quantum statistical mechanics and a functional measure in a Schwartz distribution space in two-dimensional quantum field theory, are studied in detail. Numerical examples demonstrating the
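
    The central idea, replacing a stochastic estimate of a Gaussian-measure integral by a low-order deterministic quadrature that is exact on polynomials of a given degree, can be shown in one dimension with 3-point Gauss-Hermite quadrature (exact for polynomials up to degree 5). This is an illustration of the principle only, not the paper's functional-space construction.

```python
# Deterministic evaluation of E[g(Z)] for standard normal Z via 3-point
# Gauss-Hermite quadrature (nodes/weights for the weight function exp(-x^2)).
import math

nodes = [-math.sqrt(1.5), 0.0, math.sqrt(1.5)]
weights = [math.sqrt(math.pi) / 6,
           2 * math.sqrt(math.pi) / 3,
           math.sqrt(math.pi) / 6]

def gauss_expectation(g):
    """E[g(Z)], Z ~ N(0,1), via the substitution z = sqrt(2) * x."""
    total = sum(w * g(math.sqrt(2.0) * x) for x, w in zip(nodes, weights))
    return total / math.sqrt(math.pi)

# Three deterministic evaluations reproduce the exact Gaussian moments
# E[Z^2] = 1 and E[Z^4] = 3, where Monte Carlo would need many samples.
```

    The approximation formulas of the paper play the same role in infinite dimensions: exactness on polynomial functionals of a given degree, with a low-dimensional Riemannian integral replacing the sampling.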

  18. Intense field stabilization in circular polarization: Three-dimensional time-dependent dynamics

    International Nuclear Information System (INIS)

    Choi, Dae-Il; Chism, Will

    2002-01-01

    We investigate the stabilization of hydrogen atoms in a circularly polarized laser field. We use a three-dimensional, time-dependent approach to study the quantum dynamics of hydrogen atoms subject to high-intensity, short-wavelength laser pulses. We find an enhanced survival probability as the field is increased under fixed envelope conditions. We also confirm wave packet behaviors previously seen in two-dimensional time-dependent computations.

  19. A short course in computational geometry and topology

    CERN Document Server

    Edelsbrunner, Herbert

    2014-01-01

    With the aim to bring the subject of Computational Geometry and Topology closer to the scientific audience, this book is written in thirteen ready-to-teach sections organized in four parts: tessellations, complexes, homology, persistence. To speak to the non-specialist, detailed formalisms are often avoided in favor of lively 2- and 3-dimensional illustrations. The book is warmly recommended to everybody who loves geometry and the fascinating world of shapes.

  20. Computational plasma physics

    International Nuclear Information System (INIS)

    Killeen, J.

    1975-08-01

    The behavior of a plasma confined by a magnetic field is simulated by a variety of numerical models. Some models used on a short time scale give detailed knowledge of the plasma on a microscopic scale, while other models used on much longer time scales compute macroscopic properties of the plasma dynamics. In the last two years there has been a substantial increase in the numerical modelling of fusion devices. The status of MHD, transport, equilibrium, stability, Vlasov, Fokker-Planck, and Hybrid codes is reviewed. These codes have already been essential in the design and understanding of low and high beta toroidal experiments and mirror systems. The design of the next generation of fusion experiments and fusion test reactors will require continual development of these numerical models in order to include the best available plasma physics description and also to increase the geometric complexity of the model. (auth)

  1. Computer-assisted training experiment used in the field of thermal energy production (EDF)

    International Nuclear Information System (INIS)

    Felgines, R.

    1982-01-01

    In 1981, the EDF carried out an experiment with computer-assisted training (EAO). This new approach, which continued until June 1982, involved about 700 employees, all of whom operated nuclear power stations. The different stages of this experiment and the lessons which can be drawn from it are given. The lessons were positive and make it possible to envisage complete coverage of all nuclear power stations by computer-assisted training within a very short space of time [fr

  2. SHORT DISSIPATION TIMES OF PROTO-PLANETARY DISKS: AN ARTIFACT OF SELECTION EFFECTS?

    International Nuclear Information System (INIS)

    Pfalzner, Susanne; Steinhausen, Manuel; Menten, Karl

    2014-01-01

    The frequency of disks around young stars, a key parameter for understanding planet formation, is most readily determined in young stellar clusters where many relatively coeval stars are located in close proximity. Observational studies seem to show that the disk frequency decreases rapidly with cluster age, with <10% of cluster stars retaining their disks for longer than 2-6 Myr. Given that at least half of all stars in the field seem to harbor one or more planets, this would imply extremely fast disk dispersal and rapid planet growth. Here we question the validity of this constraint by demonstrating that the short disk dissipation times inferred to date might have been heavily underestimated by selection effects. Critically, for ages >3 Myr only stars that originally populated the densest areas of very populous clusters, which are prone to disk erosion, are actually considered. This tiny sample may not be representative of the majority of stars. In fact, the higher disk fractions in co-moving groups indicate that it is likely that over 30% of all field stars retain their disks well beyond 10 Myr, leaving ample time for planet growth. Equally, our solar system, with a likely formation time >10 Myr, need no longer be an exception but may in fact be typical of planetary systems.

  3. Computing time-series suspended-sediment concentrations and loads from in-stream turbidity-sensor and streamflow data

    Science.gov (United States)

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Doug; Ziegler, Andrew C.

    2010-01-01

    Over the last decade, use of a method for computing suspended-sediment concentration and loads using turbidity sensors—primarily nephelometry, but also optical backscatter—has proliferated. Because an in-situ turbidity sensor is capable of measuring turbidity instantaneously, a turbidity time series can be recorded and related directly to time-varying suspended-sediment concentrations. Depending on the suspended-sediment characteristics of the measurement site, this method can be more reliable and, in many cases, a more accurate means for computing suspended-sediment concentrations and loads than traditional U.S. Geological Survey computational methods. Guidelines and procedures for estimating time series of suspended-sediment concentration and loading as a function of turbidity and streamflow data have been published in a U.S. Geological Survey Techniques and Methods Report, Book 3, Chapter C4. This paper is a summary of these guidelines and discusses some of the concepts, statistical procedures, and techniques used to maintain a multiyear suspended-sediment time series.
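
    The core of such a turbidity rating is a regression of log-transformed concentration on log-transformed turbidity, fitted from paired physical samples and then applied to the continuous sensor record. The sketch below uses invented calibration pairs and omits the bias corrections and uncertainty analysis of the actual USGS procedure.

```python
# Fit log(SSC) = intercept + slope * log(turbidity) from paired samples,
# then apply the rating to a turbidity time series (all data synthetic).
import math

# paired calibration samples: (turbidity in FNU, SSC in mg/L) -- illustrative
pairs = [(5, 12), (10, 26), (20, 48), (40, 110), (80, 210), (160, 430)]

xs = [math.log(t) for t, _ in pairs]
ys = [math.log(c) for _, c in pairs]
n = len(pairs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

def ssc_from_turbidity(turb):
    """Apply the fitted rating to a single turbidity reading."""
    return math.exp(intercept + slope * math.log(turb))

# continuous sensor record -> computed concentration time series
series = [8.0, 15.0, 120.0]
ssc = [ssc_from_turbidity(t) for t in series]
```

    Multiplying each computed concentration by the concurrent streamflow (with unit conversion) then yields the suspended-sediment load time series.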

  4. A review of metaheuristic scheduling techniques in cloud computing

    Directory of Open Access Journals (Sweden)

    Mala Kalra

    2015-11-01

    Full Text Available Cloud computing has become a buzzword in the area of high-performance distributed computing as it provides on-demand access to a shared pool of resources over the Internet in a self-service, dynamically scalable and metered manner. Cloud computing is still in its infancy, so to reap its full benefits, much research is required across a broad array of topics. One of the important research issues which needs attention for efficient performance is scheduling. The goal of scheduling is to map tasks to appropriate resources in a way that optimizes one or more objectives. Scheduling in cloud computing belongs to a category of problems known as NP-hard problems, due to the large solution space, and thus it takes a long time to find an optimal solution. There are no algorithms which can produce the optimal solution within polynomial time for these problems. In a cloud environment, it is preferable to find a suboptimal solution in a short period of time. Metaheuristic-based techniques have been proven to achieve near-optimal solutions within reasonable time for such problems. In this paper, we provide an extensive survey and comparative analysis of various scheduling algorithms for cloud and grid environments based on three popular metaheuristic techniques: Ant Colony Optimization (ACO), Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), and two novel techniques: the League Championship Algorithm (LCA) and the BAT algorithm.
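
    The flavor of these metaheuristics can be conveyed with a minimal genetic-algorithm sketch for task-to-VM scheduling that minimizes makespan. This is an illustrative toy (invented task lengths and VM speeds, elitist selection, one-point crossover), not a production cloud scheduler or any specific surveyed algorithm.

```python
# Toy GA for mapping tasks to virtual machines, minimizing makespan.
import random

random.seed(42)
task_len = [14, 7, 22, 9, 16, 5, 11, 18]   # illustrative task lengths (MI)
vm_speed = [2.0, 1.0, 1.5]                  # illustrative VM speeds (MIPS)

def makespan(assign):
    """Finish time of the busiest VM for a task->VM assignment."""
    load = [0.0] * len(vm_speed)
    for t, vm in zip(task_len, assign):
        load[vm] += t / vm_speed[vm]
    return max(load)

def evolve(pop_size=40, gens=60, pmut=0.2):
    pop = [[random.randrange(len(vm_speed)) for _ in task_len]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=makespan)                 # fitness = (negative) makespan
        elite = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(task_len))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < pmut:         # point mutation
                child[random.randrange(len(task_len))] = \
                    random.randrange(len(vm_speed))
            children.append(child)
        pop = elite + children
    return min(pop, key=makespan)

best = evolve()
```

    ACO and PSO differ only in how candidate assignments are generated (pheromone trails, velocity updates); the fitness evaluation and the NP-hard search space are the same.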

  5. Determination of gamma-ray exposure rate from short-lived fission products under criticality accident conditions

    International Nuclear Information System (INIS)

    Yanagisawa, Hiroshi; Ohno, Akio; Aizawa, Eijyu

    2002-01-01

    For the assessment of γ-ray doses from short-lived fission products (FPs) under criticality accident conditions, time-varying γ-ray exposure rates were experimentally determined in the Transient Experiment Critical Facility (TRACY). The data were obtained for reactivity insertions in the range of 1.50 to 2.93$. The experiments showed that short-lived FPs contributed 15 to 17% of the total exposure; hence, this contribution cannot be neglected in the assessment of γ-ray doses under criticality accident conditions. Computational analyses also indicated that γ-ray exposure rates from short-lived FPs calculated with the Monte Carlo code MCNP4B, using photon sources based on the latest FP decay data (the JENDL FP Decay Data File 2000), agreed well with the experimental results. The exposure rates were, however, greatly underestimated when the photon sources were obtained with the ORIGEN2 code. The underestimation is due to the lack of energy-dependent photon emission data for major short-lived FP nuclides in the photon database attached to the ORIGEN2 code. It was also confirmed that the underestimation arose at time lapses of 1,000 or less after the initial power burst. (author)

  6. Construction of a flash-photolysis apparatus having a short discharge time

    International Nuclear Information System (INIS)

    Devillers, C.

    1964-01-01

    Flash photolysis aims at reaching directly the primary mechanisms resulting from the action of light on absorbing matter. This makes it necessary to produce a flash as short and as bright as possible. Our main effort was directed towards reducing the duration of the flash by decreasing the self-inductance of the discharge circuit. A description of this circuit and a study of the characteristics of the apparatus are followed by a short description of the two analytical methods: flash spectrography and absorption spectrophotometry at a given wavelength. (author) [fr]

  7. Green computing: power optimisation of vfi-based real-time multiprocessor dataflow applications

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to low power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor

  8. In this issue: Time to replace doctors’ judgement with computers

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2015-11-01

    Full Text Available Informaticians continue to rise to the challenge, set by the English Health Minister, of trying to replace doctors' judgement with computers. This issue describes successes and barriers. However, whilst there is progress, it tends to be incremental, and there are grand challenges to be overcome before computers can replace clinicians. These grand challenges include: (1) improving usability so that technology can be more readily incorporated into clinical workflow; (2) rigorous new analytic methods that make use of the mass of available data, 'Big data', to create real-world evidence; (3) faster ways of meeting regulatory and legal requirements, including ensuring privacy; (4) provision of reimbursement models to fund innovative technology that can substitute for clinical time; and (5) recognition that innovations that improve quality also often increase cost. Informatics is more likely to support and augment clinical decision making than to replace clinicians.

  9. Time-of-Flight Sensors in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2009-01-01

    , including Computer Graphics, Computer Vision and Man Machine Interaction (MMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real...

  10. Short communication

    African Journals Online (AJOL)

    abp

    2017-09-04

    Sep 4, 2017 ... Face-to-face interviews were conducted using a standardized ... Short communication. Open Access ... clinic during the time of the study and were invited to participate in the study. .... consume them. This is another ...

  11. EXPLORING THE POTENTIAL OF SHORT-TIME FOURIER TRANSFORMS FOR ANALYZING SKIN CONDUCTANCE AND PUPILLOMETRY IN REAL-TIME APPLICATIONS

    International Nuclear Information System (INIS)

    Roger Lew; Brian P. Dyre; Steffen Werner; Jeffrey C. Joe; Brian Wotring; Tuan Tran

    2008-01-01

    The development of real-time predictors of mental workload is critical for the practical application of augmented cognition to human-machine systems. This paper explores a novel method based on a short-time Fourier transform (STFT) for analyzing galvanic skin conductance (SC) and pupillometry time-series data to extract estimates of mental workload with temporal bandwidth high enough to be useful for augmented cognition applications. We tested the method in the context of a process control task based on the DURESS simulation developed by Vincente and Pawlak (1994; ported to Java by Cosentino and Ross, 1999). SC, pupil dilation, blink rate, and visual scanning patterns were measured for four participants actively engaged in controlling the simulation. Fault events were introduced that required participants to diagnose errors and make control adjustments to keep the simulator operating within a target range. We were interested in whether the STFT of these measures would produce visible effects of the increase in mental workload and stress associated with these events. Graphical exploratory data analysis of the STFT showed visible increases in the power spectrum across a range of frequencies directly following fault events. We believe this approach shows potential as a relatively unobtrusive, low-cost, high-bandwidth measure of mental workload that could be particularly useful for the application of augmented cognition to human-machine systems.
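
The core of the approach, sliding a window along a physiological time series and watching band power change after an event, can be sketched with a plain windowed-FFT STFT. The window length, hop size, and the synthetic "skin conductance" signal below are assumptions for illustration, not the parameters used in the study.

```python
import numpy as np

def short_time_fourier(x, fs, win_len=256, hop=128):
    """Magnitude STFT via Hann-windowed FFT frames (numpy only).

    Returns (freqs, frame_times, |S|) with |S| shaped (n_freqs, n_frames).
    """
    window = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    frames = np.stack([x[i * hop:i * hop + win_len] * window
                       for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1)).T
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    frame_times = (np.arange(n_frames) * hop + win_len / 2) / fs
    return freqs, frame_times, spec

# Synthetic "skin conductance" trace: a 2 Hz oscillation starts at t = 30 s,
# standing in for an arousal response to a fault event.
fs = 32.0
t = np.arange(0, 60, 1 / fs)
x = 0.1 * np.random.default_rng(0).standard_normal(t.size)
x[t >= 30] += np.sin(2 * np.pi * 2.0 * t[t >= 30])
freqs, frame_times, spec = short_time_fourier(x, fs)
band = np.argmin(np.abs(freqs - 2.0))            # FFT bin nearest 2 Hz
early = spec[band, frame_times < 25].mean()
late = spec[band, frame_times > 35].mean()
print(late > 5 * early)  # band power jumps after the "event"
```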

  12. EXPLORING THE POTENTIAL OF SHORT-TIME FOURIER TRANSFORMS FOR ANALYZING SKIN CONDUCTANCE AND PUPILLOMETRY IN REAL-TIME APPLICATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Roger Lew; Brian P. Dyre; Steffen Werner; Jeffrey C. Joe; Brian Wotring; Tuan Tran

    2008-09-01

    The development of real-time predictors of mental workload is critical for the practical application of augmented cognition to human-machine systems. This paper explores a novel method based on a short-time Fourier transform (STFT) for analyzing galvanic skin conductance (SC) and pupillometry time-series data to extract estimates of mental workload with temporal bandwidth high enough to be useful for augmented cognition applications. We tested the method in the context of a process control task based on the DURESS simulation developed by Vincente and Pawlak (1994; ported to Java by Cosentino and Ross, 1999). SC, pupil dilation, blink rate, and visual scanning patterns were measured for four participants actively engaged in controlling the simulation. Fault events were introduced that required participants to diagnose errors and make control adjustments to keep the simulator operating within a target range. We were interested in whether the STFT of these measures would produce visible effects of the increase in mental workload and stress associated with these events. Graphical exploratory data analysis of the STFT showed visible increases in the power spectrum across a range of frequencies directly following fault events. We believe this approach shows potential as a relatively unobtrusive, low-cost, high-bandwidth measure of mental workload that could be particularly useful for the application of augmented cognition to human-machine systems.

  13. Individual and family environmental correlates of television and computer time in 10- to 12-year-old European children: the ENERGY-project.

    Science.gov (United States)

    Verloigne, Maïté; Van Lippevelde, Wendy; Bere, Elling; Manios, Yannis; Kovács, Éva; Grillenberger, Monika; Maes, Lea; Brug, Johannes; De Bourdeaudhuij, Ilse

    2015-09-18

    The aim was to investigate which individual and family environmental factors are related to television and computer time separately in 10- to 12-year-old children within and across five European countries (Belgium, Germany, Greece, Hungary, Norway). Data were used from the ENERGY-project. Children and one of their parents completed a questionnaire, including questions on screen time behaviours and related individual and family environmental factors. Family environmental factors included social, political, economic and physical environmental factors. Complete data were obtained from 2022 child-parent dyads (53.8% girls, mean child age 11.2 ± 0.8 years; mean parental age 40.5 ± 5.1 years). To examine the association between individual and family environmental factors (i.e. independent variables) and television/computer time (i.e. dependent variables) in each country, multilevel regression analyses were performed using MLwiN 2.22, adjusting for children's sex and age. In all countries, children reported more television and/or computer time if children and their parents thought that the maximum recommended level for watching television and/or using the computer was higher, and if children had a higher preference for television watching and/or computer use and a lower self-efficacy to control television watching and/or computer use. Most physical and economic environmental variables were not significantly associated with television or computer time. Slightly more individual factors were related to children's computer time and more parental social environmental factors to children's television time. We also found different correlates across countries: parental co-participation in television watching was significantly positively associated with children's television time in all countries, except for Greece. A higher level of parental television and computer time was only associated with a higher level of children's television and computer time in Hungary. Having rules

  14. Short-term memory in networks of dissociated cortical neurons.

    Science.gov (United States)

    Dranias, Mark R; Ju, Han; Rajaram, Ezhilarasan; VanDongen, Antonius M J

    2013-01-30

    Short-term memory refers to the ability to store small amounts of stimulus-specific information for a short period of time. It is supported by both fading and hidden memory processes. Fading memory relies on recurrent activity patterns in a neuronal network, whereas hidden memory is encoded using synaptic mechanisms, such as facilitation, which persist even when neurons fall silent. We have used a novel computational and optogenetic approach to investigate whether these same memory processes hypothesized to support pattern recognition and short-term memory in vivo, exist in vitro. Electrophysiological activity was recorded from primary cultures of dissociated rat cortical neurons plated on multielectrode arrays. Cultures were transfected with ChannelRhodopsin-2 and optically stimulated using random dot stimuli. The pattern of neuronal activity resulting from this stimulation was analyzed using classification algorithms that enabled the identification of stimulus-specific memories. Fading memories for different stimuli, encoded in ongoing neural activity, persisted and could be distinguished from each other for as long as 1 s after stimulation was terminated. Hidden memories were detected by altered responses of neurons to additional stimulation, and this effect persisted longer than 1 s. Interestingly, network bursts seem to eliminate hidden memories. These results are similar to those that have been reported from similar experiments in vivo and demonstrate that mechanisms of information processing and short-term memory can be studied using cultured neuronal networks, thereby setting the stage for therapeutic applications using this platform.

  15. Does exposure to GSM 900 MHz mobile phone radiation affect short-term memory of elementary school students?

    Science.gov (United States)

    Movvahedi, M M; Tavakkoli-Golpayegani, A; Mortazavi, S A R; Haghani, M; Razi, Z; Shojaie-Fard, M B; Zare, M; Mina, E; Mansourabadi, L; Nazari-Jahromi; Safari, A; Shokrpour, N; Mortazavi, S M J

    2014-05-01

    Nowadays, children are exposed to mobile phone radiation at a very early age. We have previously shown that a large proportion of children in the city of Shiraz, Iran, use mobile phones. Furthermore, we have shown that the visual reaction time (VRT) of university students was significantly affected by a 10 min real/sham exposure to electromagnetic fields emitted by a mobile phone: these exposures decreased the reaction time, which might lead to a better response to different hazards. We have also found that occupational exposure to radar radiation decreased the reaction time in radar workers. The purpose of this study was to investigate whether short-term exposure of elementary school students to radiofrequency (RF) radiation leads to changes in their reaction time and short-term memory. A total of 60 elementary school children aged 8 to 10 years, studying at a public elementary school in Shiraz, Iran, were enrolled in this study. Standardized computer-based tests of VRT and short-term memory (modified for children) were administered. The students were asked to perform some preliminary tests for orientation with the VRT test. After orientation, to reduce the random variation of measurements, each test was repeated ten times in both real and sham exposure phases. The time interval between the two subsequent sham and real exposure phases was 30 min. The mean ± standard deviation reaction times after a 10 min talk period and after a 10 min sham exposure (switched-off mobile) period were 249.0 ± 82.3 ms and 252.9 ± 68.2 ms (P = 0.629), respectively. On the other hand, the mean short-term memory scores after the talk and sham exposure periods were 1062.60 ± 305.39 and 1003.84 ± 339.68 (P = 0.030), respectively. To the best of our knowledge, this is the first study to show that short-term exposure of elementary school students to RF radiation leads to better short-term memory performance.

  16. Statistical Multiplexing of Computations in C-RAN with Tradeoffs in Latency and Energy

    DEFF Research Database (Denmark)

    Kalør, Anders Ellersgaard; Agurto Agurto, Mauricio Ignacio; Pratas, Nuno

    2017-01-01

    frame duration, then this may result in additional access latency and limit the energy savings. In this paper we investigate the tradeoff by considering two extreme time-scales for the resource multiplexing: (i) long-term, where the computational resources are adapted over periods much larger than...... the access frame durations; (ii) short-term, where the adaption is below the access frame duration.We develop a general C-RAN queuing model that models the access latency and show, for Poisson arrivals, that long-term multiplexing achieves savings comparable to short-term multiplexing, while offering low...

  17. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  18. Femtosecond timing-jitter between photo-cathode laser and ultra-short electron bunches by means of hybrid compression

    CERN Document Server

    Pompili, Riccardo; Bellaveglia, M; Biagioni, A; Castorina, G; Chiadroni, E; Cianchi, A; Croia, M; Di Giovenale, D; Ferrario, M; Filippi, F; Gallo, A; Gatti, G; Giorgianni, F; Giribono, A; Li, W; Lupi, S; Mostacci, A; Petrarca, M; Piersanti, L; Di Pirro, G; Romeo, S; Scifo, J; Shpakov, V; Vaccarezza, C; Villa, F

    2017-01-01

    The generation of ultra-short electron bunches with ultra-low timing-jitter relative to the photo-cathode (PC) laser has been experimentally proved for the first time at the SPARC_LAB test-facility (INFN-LNF, Frascati) exploiting a two-stage hybrid compression scheme. The first stage employs RF-based compression (velocity-bunching), which shortens the bunch and imprints an energy chirp on it. The second stage is performed in a non-isochronous dogleg line, where the compression is completed resulting in a final bunch duration below 90 fs (rms). At the same time, the beam arrival timing-jitter with respect to the PC laser has been measured to be lower than 20 fs (rms). The reported results have been validated with numerical simulations.

  19. Femtosecond timing-jitter between photo-cathode laser and ultra-short electron bunches by means of hybrid compression

    International Nuclear Information System (INIS)

    Pompili, R; Anania, M P; Bellaveglia, M; Biagioni, A; Castorina, G; Chiadroni, E; Croia, M; Giovenale, D Di; Ferrario, M; Gallo, A; Gatti, G; Cianchi, A; Filippi, F; Giorgianni, F; Giribono, A; Lupi, S; Mostacci, A; Petrarca, M; Piersanti, L; Li, W

    2016-01-01

    The generation of ultra-short electron bunches with ultra-low timing-jitter relative to the photo-cathode (PC) laser has been experimentally proved for the first time at the SPARC-LAB test-facility (INFN-LNF, Frascati) exploiting a two-stage hybrid compression scheme. The first stage employs RF-based compression (velocity-bunching), which shortens the bunch and imprints an energy chirp on it. The second stage is performed in a non-isochronous dogleg line, where the compression is completed resulting in a final bunch duration below 90 fs (rms). At the same time, the beam arrival timing-jitter with respect to the PC laser has been measured to be lower than 20 fs (rms). The reported results have been validated with numerical simulations. (paper)

  20. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    Full Text Available We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As it will become apparent, several alternatives capable of performing uncertainty quantification in a variety of cases exist, each one exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and help the potential readers to safely select the most suitable approach for their problem under consideration.

  1. The study of the possibility to use CAMEX chips in collider experiments with short bunch crossing time

    International Nuclear Information System (INIS)

    Aulchenko, V.M.; Chilingarov, A.G.; Serbo, V.V.; Titov, V.M.

    1993-01-01

    The possibility of using CAMEX chips in several systems of the KEDR detector at the e⁺e⁻ collider VEPP-4M was studied. The relatively short bunch crossing time at VEPP-4M (60 ns) leads to some problems with the use of CAMEX in the standard mode. Different ways to overcome these difficulties are investigated and compared. (orig.)

  2. Scalable data-driven short-term traffic prediction

    NARCIS (Netherlands)

    Friso, K.; Wismans, L. J.J.; Tijink, M. B.

    2017-01-01

    Short-term traffic prediction has a lot of potential for traffic management. However, most research has traditionally focused on either traffic models-which do not scale very well to large networks, computationally-or on data-driven methods for freeways, leaving out urban arterials completely. Urban

  3. Real-time computer treatment of THz passive device images with the high image quality

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not restricted to a single passive THz device: it can be applied to other such devices and to active THz imaging systems as well. We applied the code to computer processing of images captured by four passive THz imaging devices manufactured by different companies; images produced by different devices usually require different spatial filters. The current version of the code processes more than one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel image-processing algorithms. We developed original spatial filters that make it possible to see objects smaller than 2 cm. The imagery is produced by passive THz imaging devices which captured images of objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of a liquid explosive, an ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are a very promising solution for the security problem.

  4. Time domain numerical calculations of the short electron bunch wakefields in resistive structures

    Energy Technology Data Exchange (ETDEWEB)

    Tsakanian, Andranik

    2010-10-15

    The acceleration of electron bunches with very small longitudinal and transverse phase-space volume is one of the most pressing challenges for the future International Linear Collider and high-brightness X-ray free-electron lasers. Exact knowledge of the wake fields generated by ultra-short electron bunches during their interaction with surrounding structures is very important for preventing beam-quality degradation and optimizing facility performance. High-accuracy time-domain numerical calculations play a decisive role in the correct evaluation of wake fields in advanced accelerators. This thesis is devoted to the development of a new longitudinally dispersion-free 3D hybrid numerical scheme in the time domain for wake-field calculations of ultra-short bunches in structures with walls of finite conductivity. The basic approaches used in the thesis to solve the problem are the following. For materials with high but finite conductivity, the model of plane-wave reflection from a conducting half-space is used. It is shown that in the conducting half-space the field components perpendicular to the interface can be neglected. The electric tangential component on the surface contributes to the tangential magnetic field in the lossless area just before the boundary layer. For highly conducting media, the task reduces to a 1D electromagnetic problem in the metal, and the so-called 1D conducting-line model can be applied instead of a full 3D description. Further, a TE/TM ("transverse electric/transverse magnetic") splitting implicit numerical scheme, together with the 1D conducting-line model, is applied to develop a new longitudinally dispersion-free hybrid numerical scheme in the time domain. The stability of the new hybrid numerical scheme in vacuum, conductor and boundary cells is studied, and its convergence is analyzed by comparison with well-known analytical solutions. The wakefield calculations for a number of

  5. An assessment of the real-time application capabilities of the SIFT computer system

    Science.gov (United States)

    Butler, R. W.

    1982-01-01

    The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed static stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.

  6. Manual cross check of computed dose times for motorised wedged fields

    International Nuclear Information System (INIS)

    Porte, J.

    2001-01-01

    If a mass of tissue-equivalent material is exposed in turn to wedged and open radiation fields of the same size, for equal times, it is incorrect to assume that the resultant isodose pattern will be effectively that of a wedge having half the angle of the wedged field. Computer programs have been written to address the problem of creating an intermediate wedge field, commonly known as a motorized wedge. The total exposure time is apportioned between the open and wedged fields to produce a beam modification equivalent to that of a wedged field of a given wedge angle. (author)
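
A common rule-of-thumb for this apportioning (not necessarily the one used by the author's programs, and an approximation only) relates the effective wedge angle to the fraction f of beam-on time delivered through the wedge via tan(θ_eff) ≈ f·tan(θ_wedge):

```python
import math

def wedge_time_fractions(target_angle_deg, wedge_angle_deg=60.0):
    """Fraction of beam-on time for the open and wedged fields.

    Rule-of-thumb approximation tan(theta_eff) ~ f * tan(theta_wedge);
    clinical treatment-planning systems use measured beam data instead.
    """
    f = math.tan(math.radians(target_angle_deg)) / math.tan(
        math.radians(wedge_angle_deg))
    if not 0.0 <= f <= 1.0:
        raise ValueError("target angle must not exceed the physical wedge angle")
    return 1.0 - f, f

open_frac, wedge_frac = wedge_time_fractions(30.0, 60.0)
print(round(wedge_frac, 3))  # tan(30 deg)/tan(60 deg) = 1/3
```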

  7. ADAPTATION OF JOHNSON SEQUENCING ALGORITHM FOR JOB SCHEDULING TO MINIMISE THE AVERAGE WAITING TIME IN CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    SOUVIK PAL

    2016-09-01

    Full Text Available Cloud computing is an emerging paradigm of Internet-centric business computing in which Cloud Service Providers (CSPs) provide services to customers according to their needs. The key idea behind cloud computing is on-demand sharing of the resources available in the resource pool provided by a CSP, which implies a new emerging business model. Resources are provisioned when jobs arrive. Job scheduling and the minimization of waiting time are challenging issues in cloud computing: when a large number of jobs are requested, they must wait to be allocated to servers, which in turn may increase the queue length and the waiting time. This paper presents a system design and implementation based on the Johnson sequencing algorithm, which provides the optimal sequence; with that sequence, service times can be obtained. The waiting time and queue length can then be reduced using a multi-server, finite-capacity queueing model, which improves the job-scheduling model.
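
For reference, the classic two-machine form of Johnson's rule (jobs with the shorter first-stage time go to the front in ascending order, the rest to the back in descending order of second-stage time) can be sketched as follows. This illustrates the sequencing idea only; the paper adapts it to a cloud queueing setting, and the job times below are hypothetical.

```python
def johnson_sequence(jobs):
    """Johnson's rule for a two-machine flow shop.

    jobs: list of (stage1_time, stage2_time). Jobs with stage1 <= stage2
    go first, in ascending stage1 order; the rest go last, in descending
    stage2 order. The result minimizes makespan for two machines.
    """
    front = sorted((i for i, (a, b) in enumerate(jobs) if a <= b),
                   key=lambda i: jobs[i][0])
    back = sorted((i for i, (a, b) in enumerate(jobs) if a > b),
                  key=lambda i: jobs[i][1], reverse=True)
    return front + back

def makespan(jobs, order):
    """Finish time of the last job on machine 2 for the given order."""
    t1 = t2 = 0.0
    for i in order:
        t1 += jobs[i][0]
        t2 = max(t1, t2) + jobs[i][1]
    return t2

jobs = [(3, 2), (1, 4), (5, 1), (2, 3)]   # hypothetical service times
order = johnson_sequence(jobs)
print(order, makespan(jobs, order))  # [1, 3, 0, 2] 12.0
```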

  8. Dynamic Performance Optimization for Cloud Computing Using M/M/m Queueing System

    Directory of Open Access Journals (Sweden)

    Lizheng Guo

    2014-01-01

    Full Text Available The successful development of cloud computing has attracted more and more people and enterprises to use it. On one hand, cloud computing reduces costs; on the other hand, it improves efficiency. As users are largely concerned about the Quality of Service (QoS), performance optimization of cloud computing has become critical to its successful application. In order to optimize the performance of multiple requesters and services in cloud computing, we use queueing theory to derive equations for each parameter of the services in the data center. Then, by analyzing the performance parameters of the queueing system, we propose a synthesis optimization mode, function, and strategy. Lastly, we set up a simulation based on the synthesis optimization mode; we also compare and analyze the simulation results against classical optimization methods (shortest-service-time-first and first-in-first-out), which shows that the proposed model can optimize the average waiting time, average queue length, and number of customers.
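
The M/M/m quantities such an analysis optimizes (probability of waiting, mean waiting time, mean queue length) follow from the standard Erlang C formula. A minimal sketch with arbitrary example rates, not the paper's data-center parameters:

```python
import math

def mmm_metrics(lam, mu, m):
    """Erlang C metrics for an M/M/m queue.

    lam: arrival rate; mu: per-server service rate; m: number of servers.
    Returns (P(wait), mean wait in queue Wq, mean queue length Lq).
    """
    a = lam / mu                      # offered load in Erlangs
    rho = a / m
    if rho >= 1.0:
        raise ValueError("unstable queue: need lam < m * mu")
    head = sum(a ** k / math.factorial(k) for k in range(m))
    tail = a ** m / (math.factorial(m) * (1.0 - rho))
    p_wait = tail / (head + tail)     # Erlang C formula
    wq = p_wait / (m * mu - lam)      # mean waiting time in queue
    lq = lam * wq                     # Little's law
    return p_wait, wq, lq

# Example: 8 jobs/s arriving at 10 servers, each serving 1 job/s
p_wait, wq, lq = mmm_metrics(lam=8.0, mu=1.0, m=10)
print(round(p_wait, 3))  # roughly 0.41: about 41% of jobs must queue
```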

  9. Reservoir computer predictions for the Three Meter magnetic field time evolution

    Science.gov (United States)

    Perevalov, A.; Rojas, R.; Lathrop, D. P.; Shani, I.; Hunt, B. R.

    2017-12-01

    The source of the Earth's magnetic field is the turbulent flow of liquid metal in the outer core. Our experiment's goal is to create an Earth-like dynamo, to explore the mechanisms and to understand the dynamics of the magnetic and velocity fields. Since it is a complicated system, predicting the magnetic field is a challenging problem. We present results of mimicking the Three Meter experiment with a reservoir-computer deep-learning algorithm. The experiment consists of a three-meter-diameter outer sphere and a one-meter-diameter inner sphere, with the gap filled with liquid sodium. The spheres can rotate at up to 4 and 14 Hz respectively, giving a Reynolds number near 10⁸. Two external electromagnets apply magnetic fields, while an array of 31 external and 2 internal Hall sensors measures the resulting induced fields. We use this magnetic probe data to train a reservoir computer to predict the 3M time evolution and mimic waves in the experiment. Surprisingly accurate predictions can be made for several magnetic-dipole time scales. This shows that the behavior of such a complicated MHD system can be predicted. We gratefully acknowledge support from NSF EAR-1417148.
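
A reservoir computer in its simplest echo-state-network form is a fixed random recurrent network whose only trained part is a linear readout. The sketch below, with arbitrary sizes and a toy sine-prediction task, illustrates the idea; the actual 3M setup trains on multichannel Hall-sensor data.

```python
import numpy as np

def train_esn(u, y, n_res=100, seed=1, ridge=1e-6):
    """Tiny echo state network: fixed random reservoir, trained linear readout.

    u, y: 1-D input and target sequences of equal length. Sizes and
    scalings here are arbitrary; real setups tune spectral radius,
    input scaling, leak rate, and discard a washout period.
    """
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-0.5, 0.5, n_res)
    w = rng.uniform(-0.5, 0.5, (n_res, n_res))
    w *= 0.9 / np.abs(np.linalg.eigvals(w)).max()  # spectral radius 0.9
    states = np.zeros((len(u), n_res))
    x = np.zeros(n_res)
    for t, ut in enumerate(u):
        x = np.tanh(w @ x + w_in * ut)
        states[t] = x
    # Ridge-regression readout: the only trained weights
    w_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                            states.T @ y)
    return states, w_out

# Toy task: one-step-ahead prediction of a sine wave
t = np.arange(500)
u = np.sin(0.1 * t)
y = np.sin(0.1 * (t + 1))
states, w_out = train_esn(u, y)
err = np.sqrt(np.mean((states[100:] @ w_out - y[100:]) ** 2))  # skip transient
print(err < 0.05)
```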

  10. A stable computational scheme for stiff time-dependent constitutive equations

    International Nuclear Information System (INIS)

    Shih, C.F.; Delorenzi, H.G.; Miller, A.K.

    1977-01-01

    Viscoplasticity and creep-type constitutive equations are increasingly being employed in finite element codes for evaluating the deformation of high-temperature structural members. These constitutive equations frequently exhibit stiff regimes, which makes an analytical assessment of the structure very costly. A computational scheme for handling deformation in stiff regimes is proposed in this paper. Through the finite element discretization, the governing partial differential equations in the spatial (x) and time (t) variables are reduced to a system of nonlinear ordinary differential equations in the independent variable t. The constitutive equations are expanded in a Taylor series about selected values of t. The resulting system of differential equations is then integrated by an implicit scheme which employs a predictor technique to initiate the Newton-Raphson procedure. To examine the stability and accuracy of the computational scheme, a series of calculations was carried out for uniaxial specimens and thick-wall tubes subjected to mechanical and thermal loading. (Auth.)
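
The essence of such an implicit scheme (an explicit predictor followed by Newton-Raphson iteration at each time step) can be illustrated on a scalar stiff ODE. This is a simplified analogue, not the paper's finite-element formulation; the test problem and step counts are arbitrary.

```python
import math

def backward_euler(f, dfdy, y0, t0, t1, n_steps):
    """Backward Euler for scalar y' = f(t, y), Newton iteration per step.

    dfdy is the partial derivative df/dy used in the Newton Jacobian.
    """
    h = (t1 - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        t_new = t + h
        y_new = y + h * f(t, y)              # explicit (forward Euler) predictor
        for _ in range(50):                  # Newton-Raphson corrector
            g = y_new - y - h * f(t_new, y_new)
            step = g / (1.0 - h * dfdy(t_new, y_new))
            y_new -= step
            if abs(step) < 1e-13:
                break
        t, y = t_new, y_new
    return y

# Stiff relaxation toward a slow drive: y' = -1000 * (y - cos(t)), y(0) = 0
lam = 1000.0
y_end = backward_euler(lambda t, y: -lam * (y - math.cos(t)),
                       lambda t, y: -lam, 0.0, 0.0, 1.0, 100)
print(abs(y_end - math.cos(1.0)) < 1e-2)  # tracks the slow solution
```

An explicit method would need steps smaller than about 2/1000 to remain stable here; the implicit scheme stays stable with steps fifty times larger.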

  11. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  12. Microbial inactivation of paprika by a high-temperature short-time treatment. Influence on color properties.

    Science.gov (United States)

    Almela, Luis; Nieto-Sandoval, José M; Fernández López, José A

    2002-03-13

    High-temperature short-time (HTST) treatments have been used to destroy the bioburden of paprika. With this in mind, we have designed a device to treat samples of paprika with a gas whose temperature, pressure, and composition can be selected. Temperatures and treatment times ranged from 130 to 170 degrees C and 4 to 6 s, respectively. The survival of the most commonly found microorganisms in paprika and any alteration in extractable and superficial color were examined. Data showed that the optimum HTST conditions were 145 degrees C, 1.5 kg/cm2 of overpressure, 6 s operation time, and a thermal fluid of saturated steam. No microbial growth was detected during storage after thermal treatment. To minimize the color losses, treated (HTST) paprika samples should be kept under refrigeration.

  13. The use of short-echo-time 1H MRS for childhood cerebellar tumours prior to histopathological diagnosis

    International Nuclear Information System (INIS)

    Harris, Lisa M.; Peet, Andrew C.; Davies, Nigel; Natarajan, Kal; MacPherson, Lesley; Foster, Katharine; Lateef, Shaheen; Sgouros, Spyridon; Brundler, Marie-Anne; Arvanitis, Theodoros N.; Grundy, Richard G.

    2007-01-01

    Proton magnetic resonance spectroscopy (MRS) measures concentrations of metabolites in vivo and provides a powerful method for identifying tumours. MRS has not entered routine clinical use partly due to the difficulty of analysing the spectra. To create a straightforward method for interpreting short-echo-time MRS of childhood cerebellar tumours. Single-voxel MRS (1.5-T Siemens Symphony NUM4, TR/TE 1,500/30 ms) was performed at presentation in 30 children with cerebellar tumours. The MRS results were analysed for comparison with histological diagnosis. Peak heights for N-acetyl aspartate (NAA), creatine (Cr), choline (Cho) and myo-inositol (mIns) were determined and receiver operating characteristic curves used to select ratios that best discriminated between the tumour types. The method was implemented by a group of clinicians and scientists, blinded to the results. A total of 27 MRS studies met the quality control criteria. NAA/Cr >4.0 distinguished all but one of the astrocytomas from the other tumours. A combination of Cr/Cho <0.75 and mIns/NAA <2.1 separated all the medulloblastomas from the ependymomas. Peak height ratios from short-echo-time MRS can accurately predict the histopathology of childhood cerebellar tumours. (orig.)
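    The reported ratio thresholds amount to a two-stage decision rule, which can be sketched directly. The thresholds (NAA/Cr > 4.0; Cr/Cho < 0.75 and mIns/NAA < 2.1) come from the abstract, but which side of the second pair of thresholds corresponds to medulloblastoma versus ependymoma is not stated there, so that assignment is an assumption for illustration only.

    ```python
    def classify_cerebellar_tumour(naa, cr, cho, mins):
        """Toy decision rule from the reported peak-height ratio cut-offs.
        The medulloblastoma/ependymoma orientation is assumed, not given."""
        if naa / cr > 4.0:
            return "astrocytoma"
        if cr / cho < 0.75 and mins / naa < 2.1:
            return "medulloblastoma"          # assumed side of the thresholds
        return "ependymoma"

    # Hypothetical peak heights, chosen to exercise each branch.
    print(classify_cerebellar_tumour(naa=9.0, cr=2.0, cho=3.0, mins=4.0))
    print(classify_cerebellar_tumour(naa=2.0, cr=1.0, cho=2.0, mins=3.0))
    ```

    In practice such cut-offs would be applied only after the quality-control step the abstract describes, and with the blinded-reader validation it reports.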

  14. The impact of short night-time naps on performance, sleepiness and mood during a simulated night shift.

    Science.gov (United States)

    Centofanti, Stephanie A; Hilditch, Cassie J; Dorrian, Jillian; Banks, Siobhan

    2016-01-01

    Short naps on night shift are recommended in some industries. There is a paucity of evidence to verify the sustained recovery benefits of short naps in the last few hours of the night shift. Therefore, the current study aimed to investigate the sustained recovery benefits of 30 and 10-min nap opportunities during a simulated night shift. Thirty-one healthy participants (18F, 21-35 y) completed a 3-day, between-groups laboratory study with one baseline night (22:00-07:00 h time in bed), followed by one night awake (time awake from 07:00 h on day two through 10:00 h day three) with random allocation to: a 10-min nap opportunity ending at 04:00 h, a 30-min nap opportunity ending at 04:00 h or no nap (control). A neurobehavioral test bout was administered approximately every 2 h during wake periods. There were no significant differences between nap conditions for post-nap psychomotor vigilance performance after controlling for pre-nap scores (p > 0.05). The 30-min nap significantly improved subjective sleepiness compared to the 10-min nap and no-nap control (p effect.

  15. X-ray short-time lags in the Fe-K energy band produced by scattering clouds in active galactic nuclei

    Science.gov (United States)

    Mizumoto, Misaki; Done, Chris; Hagino, Kouichi; Ebisawa, Ken; Tsujimoto, Masahiro; Odaka, Hirokazu

    2018-05-01

    X-rays illuminating the accretion disc in active galactic nuclei give rise to an iron K line and its associated reflection spectrum which are lagged behind the continuum variability by the light-travel time from the source to the disc. The measured lag timescales in the iron band can be as short as ˜Rg/c, where Rg is the gravitational radius, which is often interpreted as evidence for a very small continuum source close to the event horizon of a rapidly spinning black hole. However, the short lags can also be produced by reflection from more distant material, because the primary photons with no time-delay dilute the time-lags caused by the reprocessed photons. We perform a Monte-Carlo simulation to calculate the dilution effect in the X-ray reverberation lags from a half-shell of neutral material placed at 100 Rg from the central source. This gives lags of ˜2 Rg/c, but the iron line is a distinctly narrow feature in the lag-energy plot, whereas the data often show a broader line. We show that both the short lag and the line broadening can be reproduced if the scattering material is outflowing at ˜0.1c. The velocity structure in the wind can also give shifts in the line profile in the lag-energy plot calculated at different frequencies. Hence we propose that the observed broad iron reverberation lags and shifts in profile as a function of frequency of variability can arise from a disc wind at fairly large distances from the X-ray source.
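    The dilution argument above has a simple Monte-Carlo core: if only a small fraction of photons is reprocessed at the distant shell, the measured lag is the intrinsic delay scaled down by that fraction. The sketch below is a deliberately stripped-down illustration of this dilution effect (the reflected fraction and geometry are assumed numbers, not the paper's radiative-transfer setup).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_photons = 1_000_000
    f_reflected = 0.02        # assumed fraction of photons reprocessed at the shell
    intrinsic_lag = 100.0     # delay of reprocessed photons, in units of Rg/c (assumed)

    # Direct photons arrive with zero delay; reflected ones with the intrinsic delay.
    reflected = rng.random(n_photons) < f_reflected
    delays = np.where(reflected, intrinsic_lag, 0.0)

    measured_lag = delays.mean()   # dilution: roughly f_reflected * intrinsic_lag
    print(measured_lag)            # of order a few Rg/c, despite a 100 Rg/c delay
    ```

    This is why a ~2 Rg/c lag need not imply a compact corona near the horizon: a weakly reflecting half-shell at 100 Rg gives the same diluted number, and the outflow velocity then shapes the line profile in the lag-energy plot.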

  16. A Cloud-Based Infrastructure for Near-Real-Time Processing and Dissemination of NPP Data

    Science.gov (United States)

    Evans, J. D.; Valente, E. G.; Chettri, S. S.

    2011-12-01

    We are building a scalable cloud-based infrastructure for generating and disseminating near-real-time data products from a variety of geospatial and meteorological data sources, including the new National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP). Our approach relies on linking Direct Broadcast and other data streams to a suite of scientific algorithms coordinated by NASA's International Polar-Orbiter Processing Package (IPOPP). The resulting data products are directly accessible to a wide variety of end-user applications, via industry-standard protocols such as OGC Web Services, Unidata Local Data Manager, or OPeNDAP, using open source software components. The processing chain employs on-demand computing resources from Amazon.com's Elastic Compute Cloud and NASA's Nebula cloud services. Our current prototype targets short-term weather forecasting, in collaboration with NASA's Short-term Prediction Research and Transition (SPoRT) program and the National Weather Service. Direct Broadcast is especially crucial for NPP, whose current ground segment is unlikely to deliver data quickly enough for short-term weather forecasters and other near-real-time users. Direct Broadcast also allows full local control over data handling, from the receiving antenna to end-user applications: this provides opportunities to streamline processes for data ingest, processing, and dissemination, and thus to make interpreted data products (Environmental Data Records) available to practitioners within minutes of data capture at the sensor. Cloud computing lets us grow and shrink computing resources to meet large and rapid fluctuations in data availability (twice daily for polar orbiters) - and similarly large fluctuations in demand from our target (near-real-time) users. This offers a compelling business case for cloud computing: the processing or dissemination systems can grow arbitrarily large to sustain near-real time data access despite surges in

  17. SPEEDI: a computer code system for the real-time prediction of radiation dose to the public due to an accidental release

    International Nuclear Information System (INIS)

    Imai, Kazuhiko; Chino, Masamichi; Ishikawa, Hirohiko

    1985-10-01

    SPEEDI, a computer code system for predicting environmental doses from radioactive materials accidentally released from a nuclear plant, has been developed to assist the organizations responsible for emergency planning. For realistic simulation, two models have been developed: one which statistically predicts the basic wind data and then calculates the three-dimensional mass-consistent wind field by interpolating these predicted data, and one which calculates the diffusion of released materials using a combined model of random-walk and PICK methods. These calculations are carried out in conversational mode with a computer so that the system can be used with ease in an emergency. SPEEDI also has versatile files, which make it easy to control the complicated flows of calculation. To attain a short computation time, a large-scale computer with a performance of 25 MIPS and a vector processor of up to 250 MFLOPS are used for the model calculations, giving quick responses. Simplified models are also prepared for calculation on the minicomputers widely used by local governments and research institutes, although the same precision of calculation as with the above models cannot be expected. The present report outlines the structure and functions of SPEEDI, the methods for prediction of the wind field, and the models for calculating the concentration of released materials in air and on the ground and the doses to the public. Some of the diffusion models have been compared with field experiments carried out as part of the SPEEDI development program. The report also discusses the reliability of the diffusion models on the basis of these comparisons, and shows that they can reasonably simulate diffusion in the internal boundary layer which commonly occurs near coastal regions. (J.P.N.)
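    The random-walk part of such a dispersion model can be sketched in a few lines: each particle is advected by the mean wind and given a Gaussian turbulent kick at every step. This is a minimal illustration of the technique, not SPEEDI's model; the wind vector, turbulence scale, and step sizes are assumed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_particles, n_steps, dt = 10_000, 60, 60.0   # one hour of 1-min steps (assumed)
    wind = np.array([3.0, 1.0])                   # mean wind components, m/s (assumed)
    sigma = 2.0                                   # turbulent velocity scale, m/s (assumed)

    # All particles released at the origin; advect + random-walk each step.
    pos = np.zeros((n_particles, 2))
    for _ in range(n_steps):
        pos += wind * dt + rng.normal(0.0, sigma * np.sqrt(dt), pos.shape)

    centre = pos.mean(axis=0)   # plume centre ~ wind * elapsed time
    spread = pos.std(axis=0)    # plume width grows as sigma * sqrt(elapsed time)
    print(centre, spread)
    ```

    Concentrations are then obtained by counting particles in grid cells; the same machinery extends to a spatially varying, mass-consistent wind field by looking the wind up at each particle's position.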

  18. Antimicrobial and antiviral effect of high-temperature short-time (HTST) pasteurization applied to human milk.

    Science.gov (United States)

    Terpstra, Fokke G; Rechtman, David J; Lee, Martin L; Hoeij, Klaske Van; Berg, Hijlkeline; Van Engelenberg, Frank A C; Van't Wout, Angelica B

    2007-03-01

    In the United States, concerns over the transmission of infectious diseases have led to donor human milk generally being subjected to pasteurization prior to distribution and use. The standard method used by North American milk banks is Holder pasteurization (63 degrees C for 30 minutes). The authors undertook an experiment to validate the effects of a high-temperature short-time (HTST) pasteurization process (72 degrees C for 16 seconds) on the bioburden of human milk. It was concluded that HTST is effective in the elimination of bacteria as well as of certain important pathogenic viruses.

  19. Real-time dynamics of lattice gauge theories with a few-qubit quantum computer

    Science.gov (United States)

    Martinez, Esteban A.; Muschik, Christine A.; Schindler, Philipp; Nigg, Daniel; Erhard, Alexander; Heyl, Markus; Hauke, Philipp; Dalmonte, Marcello; Monz, Thomas; Zoller, Peter; Blatt, Rainer

    2016-06-01

    Gauge theories are fundamental to our understanding of interactions between the elementary constituents of matter as mediated by gauge bosons. However, computing the real-time dynamics in gauge theories is a notorious challenge for classical computational methods. This has recently stimulated theoretical effort, using Feynman’s idea of a quantum simulator, to devise schemes for simulating such theories on engineered quantum-mechanical devices, with the difficulty that gauge invariance and the associated local conservation laws (Gauss laws) need to be implemented. Here we report the experimental demonstration of a digital quantum simulation of a lattice gauge theory, by realizing (1 + 1)-dimensional quantum electrodynamics (the Schwinger model) on a few-qubit trapped-ion quantum computer. We are interested in the real-time evolution of the Schwinger mechanism, describing the instability of the bare vacuum due to quantum fluctuations, which manifests itself in the spontaneous creation of electron-positron pairs. To make efficient use of our quantum resources, we map the original problem to a spin model by eliminating the gauge fields in favour of exotic long-range interactions, which can be directly and efficiently implemented on an ion trap architecture. We explore the Schwinger mechanism of particle-antiparticle generation by monitoring the mass production and the vacuum persistence amplitude. Moreover, we track the real-time evolution of entanglement in the system, which illustrates how particle creation and entanglement generation are directly related. Our work represents a first step towards quantum simulation of high-energy theories using atomic physics experiments—the long-term intention is to extend this approach to real-time quantum simulations of non-Abelian lattice gauge theories.

  20. Simulation of Rn-222 decay products concentration deposited on a filter. Description of radon1.pas computer program

    International Nuclear Information System (INIS)

    Machaj, B.

    1996-01-01

    A computer program is presented that simulates the activity distribution of 222Rn short-lived decay products deposited on a filter as a function of time, for any degree of radioactive equilibrium of the decay products. Deposition of the decay products is simulated by summing discrete samples every 1/10 min over sampling times from 1 to 10 min. The concentration (activity) of the decay products is computed at one-minute intervals in the range 1-100 min. The alpha concentration and the total activity of 218Po + 214Po produced are likewise computed in the range 1-100 min. (author). 10 refs, 4 figs
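    The core of such a simulation is a decay chain with a deposition source term that is switched off when sampling ends. The sketch below integrates the chain Po-218 → Pb-214 → Bi-214 with a simple Euler step (Po-214 follows Bi-214 essentially instantly and is omitted); the equal deposition rates and time steps are assumptions for illustration, not the parameters of radon1.pas.

    ```python
    import numpy as np

    LOG2 = np.log(2.0)
    # Decay constants (1/min) from half-lives: Po-218 3.05 min,
    # Pb-214 26.8 min, Bi-214 19.7 min.
    lam = LOG2 / np.array([3.05, 26.8, 19.7])

    def filter_activity(sample_min, total_min, dep_rate=(1.0, 1.0, 1.0), dt=0.01):
        """Activities (lambda * N) of each daughter on the filter vs time;
        constant deposition (atoms/min per nuclide, assumed equal) while sampling."""
        n = np.zeros(3)
        history = []
        for step in range(int(total_min / dt)):
            t = step * dt
            src = np.array(dep_rate) if t < sample_min else np.zeros(3)
            dn = src - lam * n          # deposition minus own decay
            dn[1] += lam[0] * n[0]      # Pb-214 fed by Po-218 decay
            dn[2] += lam[1] * n[1]      # Bi-214 fed by Pb-214 decay
            n = n + dn * dt
            history.append(lam * n)
        return np.array(history)

    act = filter_activity(sample_min=10.0, total_min=100.0)
    print(act[-1])   # activities long after sampling stops: much decayed
    ```

    The buildup during sampling and the subsequent multi-exponential decay are exactly what the program tabulates at one-minute intervals; different initial equilibrium degrees correspond to different relative deposition rates.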