WorldWideScience

Sample records for rapid computing times

  1. Computationally designed libraries for rapid enzyme stabilization

    NARCIS (Netherlands)

    Wijma, Hein J.; Floor, Robert J.; Jekel, Peter A.; Baker, David; Marrink, Siewert J.; Janssen, Dick B.

    The ability to engineer enzymes and other proteins to any desired stability would have wide-ranging applications. Here, we demonstrate that computational design of a library with chemically diverse stabilizing mutations allows the engineering of drastically stabilized and fully functional variants

  2. Missile signal processing common computer architecture for rapid technology upgrade

    Science.gov (United States)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Standardized development tools and third-party software upgrades are enabled, as well as rapid upgrade of processing components as improved algorithms are developed. The resulting weapon system will have superior processing capability over a custom approach at the time of deployment as a result of shorter development cycles and the use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system and can migrate between weapon system variants because of the simplicity of modification. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS) and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of an interceptor algorithm operating on this real-time platform are provided.

  3. Rapid computation of chemical equilibrium composition - An application to hydrocarbon combustion

    Science.gov (United States)

    Erickson, W. D.; Prabhu, R. K.

    1986-01-01

    A scheme for rapidly computing the chemical equilibrium composition of hydrocarbon combustion products is derived. A set of ten governing equations is reduced to a single equation that is solved by the Newton iteration method. Computation speeds are approximately 80 times faster than the often used free-energy minimization method. The general approach also has application to many other chemical systems.
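
    As an illustrative aside (not taken from the record), the scheme described reduces the system to a single nonlinear equation solved by Newton iteration. The minimal sketch below shows a generic Newton solver applied to a made-up scalar equation; the actual governing equation of Erickson and Prabhu is not reproduced.

    ```python
    import math

    def newton(f, dfdx, x0, tol=1e-10, max_iter=50):
        """Solve f(x) = 0 by Newton's method starting from x0."""
        x = x0
        for _ in range(max_iter):
            step = f(x) / dfdx(x)
            x -= step
            if abs(step) < tol:
                return x
        raise RuntimeError("Newton iteration did not converge")

    # Hypothetical example equation: x * exp(x) - 2 = 0
    root = newton(lambda x: x * math.exp(x) - 2,
                  lambda x: math.exp(x) * (1 + x),
                  x0=1.0)
    print(f"root = {root:.6f}")
    ```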

  4. Time-Predictable Computer Architecture

    Directory of Open Access Journals (Sweden)

    Schoeberl Martin

    2009-01-01

    Full Text Available Today's general-purpose processors are optimized for maximum throughput. Real-time systems need a processor with both a reasonable and a known worst-case execution time (WCET). Features such as pipelines with instruction dependencies, caches, branch prediction, and out-of-order execution complicate WCET analysis and lead to very conservative estimates. In this paper, we evaluate the issues of current architectures with respect to WCET analysis. Then, we propose solutions for a time-predictable computer architecture. The proposed architecture is evaluated with the implementation of some features in a Java processor. The resulting processor is a good target for WCET analysis and still performs well in the average case.

  5. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1102.27 Section 1102.27 Banks... for Proceedings § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...
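
    As an illustrative aside (not part of the regulation text), the counting rule summarized above excludes the day of the triggering act, includes the last day, and rolls the deadline forward when that last day falls on a Saturday or Sunday. A minimal sketch of that rule, ignoring holidays and any further provisions of the section:

    ```python
    from datetime import date, timedelta

    def deadline(event_day: date, period_days: int) -> date:
        """Count a period as sketched in the abstract: the day of the act is
        excluded, the last day is included, and a last day falling on a
        Saturday or Sunday rolls forward to the next weekday (holidays ignored)."""
        last = event_day + timedelta(days=period_days)  # day of the act not counted
        while last.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
            last += timedelta(days=1)
        return last

    print(deadline(date(2010, 1, 1), 30))  # hypothetical 30-day period
    ```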

  6. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Computing time. 622.21 Section 622.21 Banks and... Formal Hearings § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...

  7. A Matter of Computer Time

    Science.gov (United States)

    Celano, Donna; Neuman, Susan B.

    2010-01-01

    Many low-income children do not have the opportunity to develop the computer skills necessary to succeed in our technological economy. Their only access to computers and the Internet--school, afterschool programs, and community organizations--is woefully inadequate. Educators must work to close this knowledge gap and to ensure that low-income…

  8. 12 CFR 908.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 908.27 Section 908.27 Banks and... PRACTICE AND PROCEDURE IN HEARINGS ON THE RECORD General Rules § 908.27 Computing time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event...

  9. 12 CFR 1780.11 - Computing time.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Computing time. 1780.11 Section 1780.11 Banks... time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event that commences the designated period of time is not included. The last day so...

  10. Development of a rapid multi-line detector for industrial computed tomography

    International Nuclear Information System (INIS)

    Nachtrab, Frank; Firsching, Markus; Hofmann, Thomas; Uhlmann, Norman; Neubauer, Harald; Nowak, Arne

    2015-01-01

    In this paper we present the development of a rapid multi-row detector optimized for industrial computed tomography. With a high frame rate, high spatial resolution and the ability to use up to 450 kVp, it is particularly suitable for applications such as fast acquisition of large objects, inline CT or time-resolved 4D CT. (Contains PowerPoint slides). [de]

  11. A novel technique for presurgical nasoalveolar molding using computer-aided reverse engineering and rapid prototyping.

    Science.gov (United States)

    Yu, Quan; Gong, Xin; Wang, Guo-Min; Yu, Zhe-Yuan; Qian, Yu-Fen; Shen, Gang

    2011-01-01

    To establish a new method of presurgical nasoalveolar molding (NAM) using computer-aided reverse engineering and rapid prototyping technique in infants with unilateral cleft lip and palate (UCLP). Five infants (2 males and 3 females with mean age of 1.2 w) with complete UCLP were recruited. All patients were subjected to NAM before the cleft lip repair. The upper denture casts were recorded using a three-dimensional laser scanner within 2 weeks after birth in UCLP infants. A digital model was constructed and analyzed to simulate the NAM procedure with reverse engineering software. The digital geometrical data were exported to print the solid model with rapid prototyping system. The whole set of appliances was fabricated based on these solid models. Laser scanning and digital model construction simplified the NAM procedure and estimated the treatment objective. The appliances were fabricated based on the rapid prototyping technique, and for each patient, the complete set of appliances could be obtained at one time. By the end of presurgical NAM treatment, the cleft was narrowed, and the malformation of nasoalveolar segments was aligned normally. We have developed a novel technique of presurgical NAM based on a computer-aided design. The accurate digital denture model of UCLP infants could be obtained with laser scanning. The treatment design and appliance fabrication could be simplified with a computer-aided reverse engineering and rapid prototyping technique.

  12. General purpose computers in real time

    International Nuclear Information System (INIS)

    Biel, J.R.

    1989-01-01

    I see three main trends in the use of general purpose computers in real time. The first is more processing power. The second is the use of higher speed interconnects between computers (allowing more data to be delivered to the processors). The third is the use of larger programs running in the computers. Although there is still work that needs to be done, I believe that all indications are that the general purpose computing capacity needed online should be available for the SCC and LHC machines. 2 figs

  13. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  14. Computational System For Rapid CFD Analysis In Engineering

    Science.gov (United States)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  15. Rapid Time Response: A solution for Manufacturing Issue

    Directory of Open Access Journals (Sweden)

    Norazlin N.

    2017-01-01

    Full Text Available Response time in manufacturing has a major impact and contributes to many manufacturing issues. Two worst-case scenarios illustrate this. In 2009, Toyota made a massive vehicle recall driven by the complexity of 11 major models and covering over 9 million vehicles; the recall cost at least $2 billion in repairs and lost sales and cost the company 5% of its market share in the United States of America. The A380, meanwhile, was reported to have missed its new-production targets, leading to delayed market entry caused by weak product life cycle management (PLM). These cases signal to all industries the need to acquire and optimize facilities for better traceability within the shortest possible time. In Industry 4.0, traceability and response time become the factors for high-performance manufacturing, and a rapid response time can expedite the traceability process and strengthen the level of communication between man, machine and management. The round trip time (RTT) experiment gives varying response times between two different operating systems for intra- and inter-platform signals. If this rapid response time is adopted in a manufacturing process, the delays in traceability of every issue that leads to losses can be avoided.

  16. Instruction timing for the CDC 7600 computer

    International Nuclear Information System (INIS)

    Lipps, H.

    1975-01-01

    This report provides timing information for all instructions of the Control Data 7600 computer, except for instructions of type 01X, to enable the optimization of 7600 programs. The timing rules serve as background information for timing charts which are produced by a program (TIME76) of the CERN Program Library. The rules that co-ordinate the different sections of the CPU are stated in as much detail as is necessary to time the flow of instructions for a given sequence of code. Instruction fetch, instruction issue, and access to small core memory are treated at length, since details are not available from the computer manuals. Annotated timing charts are given for 24 examples, chosen to display the full range of timing considerations. (Author)

  17. Rapid mental computation system as a tool for algorithmic thinking of elementary school students development

    OpenAIRE

    Ziatdinov, Rushan; Musa, Sajid

    2013-01-01

    In this paper, we describe the possibilities of using a rapid mental computation system in elementary education. The system consists of a number of readily memorized operations that allow one to perform arithmetic computations very quickly. These operations are actually simple algorithms which can develop or improve the algorithmic thinking of pupils. Using a rapid mental computation system also forms a basis for the study of computer science in secondary school.
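
    As an illustrative aside, one classic rule of the kind such rapid mental computation systems teach is squaring a number that ends in 5; the specific rule below is an assumed example and is not necessarily drawn from the paper itself.

    ```python
    def square_ending_in_5(x: int) -> int:
        """Square a positive integer ending in 5 using the mental shortcut:
        if x = 10*n + 5, then x**2 = n*(n+1)*100 + 25."""
        assert x % 10 == 5
        n = x // 10
        return n * (n + 1) * 100 + 25

    assert square_ending_in_5(85) == 85 ** 2  # 7225
    ```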

  18. Fast algorithms for computing phylogenetic divergence time.

    Science.gov (United States)

    Crosby, Ralph W; Williams, Tiffani L

    2017-12-06

    The inference of species divergence time is a key step in most phylogenetic studies. Methods have been available for the last ten years to perform the inference, but the performance of the methods does not yet scale well to studies with hundreds of taxa and thousands of DNA base pairs. For example, a study of 349 primate taxa was estimated to require over 9 months of processing time. In this work, we present a new algorithm, AncestralAge, that significantly improves the performance of the divergence time process. As part of AncestralAge, we demonstrate a new method for the computation of phylogenetic likelihood, and our experiments show a 90% improvement in likelihood computation time on the aforementioned dataset of 349 primate taxa with over 60,000 DNA base pairs. Additionally, we show that our new method for the computation of the Bayesian prior on node ages reduces the running time for this computation on the 349 taxa dataset by 99%. Through the use of these new algorithms we open up the ability to perform divergence time inference on large phylogenetic studies.

  19. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol
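
    As an illustrative aside, a minimal sketch of querying an NTP server from Python. It assumes the third-party ntplib package and the public pool.ntp.org servers; neither is part of Mills' book or the NTP reference implementation.

    ```python
    # Sketch only: assumes `pip install ntplib` and network access to pool.ntp.org.
    import ntplib
    from time import ctime

    client = ntplib.NTPClient()
    response = client.request("pool.ntp.org", version=3)

    print("server time :", ctime(response.tx_time))   # transmit timestamp
    print("local offset:", response.offset, "s")      # estimated clock offset
    ```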

  20. Computational area measurement of orbital floor fractures: Reliability, accuracy and rapidity

    International Nuclear Information System (INIS)

    Schouman, Thomas; Courvoisier, Delphine S.; Imholz, Benoit; Van Issum, Christopher; Scolozzi, Paolo

    2012-01-01

    Objective: To evaluate the reliability, accuracy and rapidity of a specific computational method for assessing the orbital floor fracture area on a CT scan. Method: The area of the fracture, as well as that of the total orbital floor, was determined by computer on CT scans taken from ten patients. The ratio of the fracture's area to the orbital floor area was also calculated. The test–retest precision of the measurement calculations was estimated using the Intraclass Correlation Coefficient (ICC) and Dahlberg's formula to assess the agreement across observers and across measures. The time needed for the complete assessment was also evaluated. Results: The Intraclass Correlation Coefficient across observers was 0.92 [0.85;0.96], and the precision of the measures across observers was 4.9%, according to Dahlberg's formula. The mean time needed to make one measurement was 2 min and 39 s (range, 1 min and 32 s to 4 min and 37 s). Conclusion: This study demonstrated that (1) the area of the orbital floor fracture can be rapidly and reliably assessed by using a specific computer system directly on CT scan images; (2) this method has the potential of being routinely used to standardize the post-traumatic evaluation of orbital fractures
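
    As an illustrative aside, Dahlberg's error for duplicate measurements is d = sqrt(sum(d_i^2) / (2n)), where d_i are the differences between paired measurements. The sketch below applies it to made-up paired area values, not to the study's data.

    ```python
    import math

    def dahlberg_error(first, second):
        """Dahlberg's method error for paired repeat measurements:
        d = sqrt(sum((a_i - b_i)**2) / (2 * n))."""
        diffs = [a - b for a, b in zip(first, second)]
        return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))

    # Hypothetical repeated fracture-area measurements (cm^2) by two observers
    obs1 = [1.21, 0.94, 1.53, 0.78]
    obs2 = [1.18, 0.99, 1.49, 0.81]
    print(f"Dahlberg error = {dahlberg_error(obs1, obs2):.3f} cm^2")
    ```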

  1. Rapid prototyping of an EEG-based brain-computer interface (BCI).

    Science.gov (United States)

    Guger, C; Schlögl, A; Neuper, C; Walterspacher, D; Strein, T; Pfurtscheller, G

    2001-03-01

    The electroencephalogram (EEG) is modified by motor imagery and can be used by patients with severe motor impairments (e.g., late stage of amyotrophic lateral sclerosis) to communicate with their environment. Such a direct connection between the brain and the computer is known as an EEG-based brain-computer interface (BCI). This paper describes a new type of BCI system that uses rapid prototyping to enable a fast transition of various types of parameter estimation and classification algorithms to real-time implementation and testing. Rapid prototyping is possible by using Matlab, Simulink, and the Real-Time Workshop. It is shown how to automate real-time experiments and perform the interplay between on-line experiments and offline analysis. The system is able to process multiple EEG channels on-line and operates under Windows 95 in real-time on a standard PC without an additional digital signal processor (DSP) board. The BCI can be controlled over the Internet, LAN or modem. This BCI was tested on 3 subjects whose task it was to imagine either left or right hand movement. A classification accuracy between 70% and 95% could be achieved with two EEG channels after some sessions with feedback using an adaptive autoregressive (AAR) model and linear discriminant analysis (LDA).
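
    As an illustrative aside, the final classification stage described (linear discriminant analysis on per-trial features) can be sketched with scikit-learn; the feature matrix below is random placeholder data standing in for AAR coefficients, not real EEG.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Placeholder features, e.g., AAR coefficients per trial (n_trials x n_features)
    X_left = rng.normal(loc=-0.5, size=(60, 12))
    X_right = rng.normal(loc=+0.5, size=(60, 12))
    X = np.vstack([X_left, X_right])
    y = np.array([0] * 60 + [1] * 60)   # 0 = left imagery, 1 = right imagery

    lda = LinearDiscriminantAnalysis()
    lda.fit(X[::2], y[::2])             # train on every other trial
    print("held-out accuracy:", lda.score(X[1::2], y[1::2]))
    ```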

  2. Late-time dynamics of rapidly rotating black holes

    International Nuclear Information System (INIS)

    Glampedakis, K.; Andersson, N.

    2001-01-01

    We study the late-time behaviour of a dynamically perturbed rapidly rotating black hole. Considering an extreme Kerr black hole, we show that the large number of virtually undamped quasinormal modes (that exist for nonzero values of the azimuthal eigenvalue m) combine in such a way that the field (as observed at infinity) oscillates with an amplitude that decays as 1/t at late times. For a near extreme black hole, these modes, collectively, give rise to an exponentially decaying field which, however, is considerably 'long-lived'. Our analytic results are verified using numerical time-evolutions of the Teukolsky equation. Moreover, we argue that the physical mechanism behind the observed behaviour is the presence of a 'superradiance resonance cavity' immediately outside the black hole. We present this new feature in detail, and discuss whether it may be relevant for astrophysical black holes. (author)

  3. Rapid deuterium exchange-in time for probing conformational change

    International Nuclear Information System (INIS)

    Dharmasiri, K.; Smith, D.L.

    1995-01-01

    Isotopic exchange of protein backbone amide hydrogens has been used extensively as a sensitive probe of protein structure. One of the salient features of hydrogen exchange is the vast range of exchange rates in one protein. Isotopic exchange methods have been used to study structural features including protein folding and unfolding (1), functionally different forms of proteins (2), protein-protein complexation (3), and protein stability parameters. Many backbone amide protons that are surface accessible and are not involved in hydrogen bonding undergo rapid deuterium exchange. In order to study fast-exchanging amide protons, fast exchange-in times are necessary

  4. A novel brain-computer interface based on the rapid serial visual presentation paradigm.

    Science.gov (United States)

    Acqualagna, Laura; Treder, Matthias Sebastian; Schreuder, Martijn; Blankertz, Benjamin

    2010-01-01

    Most present-day visual brain computer interfaces (BCIs) suffer from the fact that they rely on eye movements, are slow-paced, or feature a small vocabulary. As a potential remedy, we explored a novel BCI paradigm consisting of a central rapid serial visual presentation (RSVP) of the stimuli. It has a large vocabulary and realizes a BCI system based on covert non-spatial selective visual attention. In an offline study, eight participants were presented sequences of rapid bursts of symbols. Two different speeds and two different color conditions were investigated. Robust early visual and P300 components were elicited time-locked to the presentation of the target. Offline classification revealed a mean accuracy of up to 90% for selecting the correct symbol out of 30 possibilities. The results suggest that RSVP-BCI is a promising new paradigm, also for patients with oculomotor impairments.

  5. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. "... this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book." (Hsun-Hsien Chang, Computing Reviews, March 2012) "My favorite chapters were on dynamic linear models and vector AR and vector ARMA models." (William Seaver, Technometrics, August 2011) "... a very modern entry to the field of time-series modelling, with a rich reference list of the current lit..."

  6. Real-Time Thevenin Impedance Computation

    DEFF Research Database (Denmark)

    Sommer, Stefan Horst; Jóhannsson, Hjörtur

    2013-01-01

    operating state, and strict time constraints are difficult to adhere to as the complexity of the grid increases. Several suggested approaches for real-time stability assessment require Thevenin impedances to be determined for the observed system conditions. By combining matrix factorization, graph reduction, and parallelization, we develop an algorithm for computing Thevenin impedances an order of magnitude faster than previous approaches. We test the factor-and-solve algorithm with data from several power grids of varying complexity, and we show how the algorithm allows real-time stability assessment of complex power...
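
    As an illustrative aside (a textbook relation rather than the factor-and-solve algorithm of the record), the Thevenin impedance seen from bus i can be read off as the i-th diagonal entry of the bus impedance matrix Zbus = inverse(Ybus), provided Ybus includes ground ties and is nonsingular. The 3-bus admittance matrix below is made up.

    ```python
    import numpy as np

    # Hypothetical 3-bus admittance matrix in per unit; the diagonal includes
    # shunt/ground ties so that Ybus is nonsingular (values are made up).
    Ybus = np.array([[ 11.0-32.0j,  -5.0+15.0j,  -5.0+15.0j],
                     [ -5.0+15.0j,  11.0-32.0j,  -5.0+15.0j],
                     [ -5.0+15.0j,  -5.0+15.0j,  11.0-32.0j]])

    Zbus = np.linalg.inv(Ybus)          # bus impedance matrix
    thevenin = np.diag(Zbus)            # Z_th at bus i is Zbus[i, i]
    for i, z in enumerate(thevenin):
        print(f"bus {i}: Z_th = {complex(z):.4f} pu")
    ```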

  7. Graphics processor efficiency for realization of rapid tabular computations

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Us, S.A.; Shestakov, M.V.

    2016-01-01

    Capabilities of graphics processing units (GPU) and central processing units (CPU) have been investigated for realization of fast-calculation algorithms with the use of tabulated functions. The realization of tabulated functions is exemplified by the GPU/CPU architecture-based processors. Comparison is made between the operating efficiencies of GPU and CPU, employed for tabular calculations at different conditions of use. Recommendations are formulated for the use of graphical and central processors to speed up scientific and engineering computations through the use of tabulated functions

  8. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  9. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    Science.gov (United States)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms with support to migrate them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain, and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  10. Quantitative evaluation of the disintegration of orally rapid disintegrating tablets by X-ray computed tomography.

    Science.gov (United States)

    Otsuka, Makoto; Yamanaka, Azusa; Uchino, Tomohiro; Otsuka, Kuniko; Sadamoto, Kiyomi; Ohshima, Hiroyuki

    2012-01-01

    To measure the rapid disintegration of Oral Disintegrating Tablets (ODT), a new test (XCT) was developed using X-ray computed tomography (X-ray CT). Placebo ODT, rapid disintegration candy (RDC) and Gaster®-D-Tablets (GAS) were used as model samples. All these ODTs were used to measure oral disintegration time (DT) in distilled water at 37±2°C by XCT. DTs were affected by the width of the mesh screens and the degree to which the tablet holder vibrated from air bubbles. An in-vivo tablet disintegration test was performed for RDC using 11 volunteers. DT by the in-vivo method was significantly longer than that using the conventional tester. The experimental conditions for XCT, such as the width of the mesh screen and the degree of vibration, were adjusted to be consistent with human DT values. Since DTs by the XCT method were almost the same as the human data, this method was able to quantitatively evaluate the rapid disintegration of ODT under the same conditions as inside the oral cavity. The DTs of four commercially available ODTs were comparatively evaluated by the XCT method, the conventional tablet disintegration test and the in-vivo method.

  11. Computing Refined Buneman Trees in Cubic Time

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Östlin, A.

    2003-01-01

    Reconstructing the evolutionary tree for a set of n species based on pairwise distances between the species is a fundamental problem in bioinformatics. Neighbor joining is a popular distance based tree reconstruction method. It always proposes fully resolved binary trees despite missing evidence in the underlying distance data. Distance based methods based on the theory of Buneman trees and refined Buneman trees avoid this problem by only proposing evolutionary trees whose edges satisfy a number of constraints. These trees might not be fully resolved but there is strong combinatorial evidence for each proposed edge. The currently best algorithm for computing the refined Buneman tree from a given distance measure has a running time of O(n^5) and a space consumption of O(n^4). In this paper, we present an algorithm with running time O(n^3) and space consumption O(n^2). The improved complexity of our...

  12. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  13. Real-Time Accumulative Computation Motion Detectors

    Directory of Open Access Journals (Sweden)

    Saturnino Maldonado-Bascón

    2009-12-01

    Full Text Available The neurally inspired accumulative computation (AC) method and its application to motion detection have been introduced in the past years. This paper revisits the fact that many researchers have explored the relationship between neural networks and finite state machines. Indeed, finite state machines constitute the best characterized computational model, whereas artificial neural networks have become a very successful tool for modeling and problem solving. The article shows how to reach real-time performance after using a model described as a finite state machine. This paper introduces two steps towards that direction: (a) A simplification of the general AC method is performed by formally transforming it into a finite state machine. (b) A hardware implementation in FPGA of such a designed AC module, as well as an 8-AC motion detector, providing promising performance results. We also offer two case studies of the use of AC motion detectors in surveillance applications, namely infrared-based people segmentation and color-based people tracking, respectively.
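
    As an illustrative aside, a minimal frame-differencing sketch of the charge/discharge idea behind accumulative computation; this is one reading of the method with made-up thresholds and constants, not the FPGA design of the paper.

    ```python
    import numpy as np

    def ac_motion_step(prev_frame, frame, charge,
                       diff_thresh=15, charge_up=255, discharge=32):
        """One accumulative-computation step: pixels whose gray-level change
        exceeds diff_thresh are recharged to charge_up; all others discharge
        toward zero. `charge` acts as a per-pixel short-term motion memory."""
        moving = np.abs(frame.astype(int) - prev_frame.astype(int)) > diff_thresh
        charge = np.where(moving, charge_up, np.maximum(charge - discharge, 0))
        return charge

    # Hypothetical two-frame example on a tiny 4x4 image
    f0 = np.zeros((4, 4), dtype=np.uint8)
    f1 = f0.copy()
    f1[1:3, 1:3] = 200                      # a bright moving blob
    charge = np.zeros((4, 4), dtype=int)
    charge = ac_motion_step(f0, f1, charge)
    print(charge)
    ```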

  14. Rapid genetic algorithm optimization of a mouse computational model: Benefits for anthropomorphization of neonatal mouse cardiomyocytes

    Directory of Open Access Journals (Sweden)

    Corina Teodora Bot

    2012-11-01

    Full Text Available While the mouse presents an invaluable experimental model organism in biology, its usefulness in cardiac arrhythmia research is limited in some aspects due to major electrophysiological differences between murine and human action potentials (APs). As previously described, these species-specific traits can be partly overcome by application of a cell-type transforming clamp (CTC) to anthropomorphize the murine cardiac AP. CTC is a hybrid experimental-computational dynamic clamp technique, in which a computationally calculated time-dependent current is inserted into a cell in real time, to compensate for the differences between sarcolemmal currents of that cell (e.g., murine) and the desired species (e.g., human). For effective CTC performance, mismatch between the measured cell and a mathematical model used to mimic the measured AP must be minimal. We have developed a genetic algorithm (GA) approach that rapidly tunes a mathematical model to reproduce the AP of the murine cardiac myocyte under study. Compared to a prior implementation that used a template-based model selection approach, we show that GA optimization to a cell-specific model results in a much better recapitulation of the desired AP morphology with CTC. This improvement was more pronounced when anthropomorphizing neonatal mouse cardiomyocytes to human-like APs than to guinea pig APs. CTC may be useful for a wide range of applications, from screening effects of pharmaceutical compounds on ion channel activity, to exploring variations in the mouse or human genome. Rapid GA optimization of a cell-specific mathematical model improves CTC performance and may therefore expand the applicability and usage of the CTC technique.
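
    As an illustrative aside, a minimal genetic-algorithm loop of the kind used to tune model parameters to a recorded trace; the "model" here is a toy exponential rather than a cardiomyocyte AP model, and all constants are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 200)
    target = 1.0 * np.exp(-3.0 * t)          # stand-in for the recorded trace

    def model(params):                       # toy surrogate for an AP model
        amp, rate = params
        return amp * np.exp(-rate * t)

    def fitness(params):                     # lower RMS error = better fit
        return -np.sqrt(np.mean((model(params) - target) ** 2))

    pop = rng.uniform([0.1, 0.5], [5.0, 10.0], size=(40, 2))   # initial population
    for _ in range(50):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-10:]]                # keep the 10 fittest
        idx = rng.integers(0, 10, size=30)                     # pick parents to copy
        children = parents[idx] + rng.normal(scale=0.1, size=(30, 2))  # mutate copies
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(p) for p in pop])]
    print("best (amp, rate):", best)
    ```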

  15. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  16. A rapid method for the computation of equilibrium chemical composition of air to 15000 K

    Science.gov (United States)

    Prabhu, Ramadas K.; Erickson, Wayne D.

    1988-01-01

    A rapid computational method has been developed to determine the chemical composition of equilibrium air to 15000 K. Eleven chemically reacting species, i.e., O2, N2, O, NO, N, NO+, e-, N+, O+, Ar, and Ar+ are included. The method involves combining algebraically seven nonlinear equilibrium equations and four linear elemental mass balance and charge neutrality equations. Computational speeds for determining the equilibrium chemical composition are significantly faster than the often used free energy minimization procedure. Data are also included from which the thermodynamic properties of air can be computed. A listing of the computer program together with a set of sample results are included.

  17. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  18. Computer-controlled system for rapid soil analysis of 226Ra

    International Nuclear Information System (INIS)

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the 226Ra concentration in soil samples using a 6 x 9-in. NaI(Tl) crystal containing a 3.25-in. deep by 3.5-in. diameter well. This gamma detection system is controlled by a mini-computer with a dual floppy disk storage medium. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer, and BASIC is used for data processing
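
    As an illustrative aside, the basic counting arithmetic behind such a rapid estimate is net count rate divided by detection efficiency, gamma emission probability, and sample mass. All numbers below are hypothetical and do not come from the ORNL system.

    ```python
    def activity_concentration(gross_counts, background_counts, live_time_s,
                               efficiency, gamma_yield, sample_mass_kg):
        """Estimate specific activity (Bq/kg) from a gamma-counting measurement:
        net count rate / (efficiency * emission probability * mass)."""
        net_rate = (gross_counts - background_counts) / live_time_s   # counts/s
        return net_rate / (efficiency * gamma_yield * sample_mass_kg)

    # Hypothetical numbers for illustration only
    print(activity_concentration(gross_counts=12500, background_counts=2500,
                                 live_time_s=600, efficiency=0.25,
                                 gamma_yield=0.35, sample_mass_kg=0.5),
          "Bq/kg")
    ```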

  19. 7 CFR 1.603 - How are time periods computed?

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false How are time periods computed? 1.603 Section 1.603... Licenses General Provisions § 1.603 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2...

  20. 50 CFR 221.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false How are time periods computed? 221.3... Provisions § 221.3 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2) The last day of the...

  1. 6 CFR 13.27 - Computation of time.

    Science.gov (United States)

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Computation of time. 13.27 Section 13.27 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY PROGRAM FRAUD CIVIL REMEDIES § 13.27 Computation of time. (a) In computing any period of time under this part or in an order issued...

  2. Computed tomographic demonstration of rapid changes in fatty infiltration of the liver

    International Nuclear Information System (INIS)

    Bashist, B.; Hecht, H.L.; Harely, W.D.

    1982-01-01

    Two alcoholic patients in whom computed tomography (CT) demonstrated reversal of fatty infiltration of the liver are described. The rapid reversibility of fatty infiltration can be useful in monitoring alcoholics with fatty livers. Focal fatty infiltration can mimic focal hepatic lesions and repeat scans can be utilized to assess changes in CT attenuation values when this condition is suspected

  3. Time evolution of the wave equation using rapid expansion method

    KAUST Repository

    Pestana, Reynam C.; Stoffa, Paul L.

    2010-01-01

    Forward modeling of seismic data and reverse time migration are based on the time evolution of wavefields. For the case of spatially varying velocity, we have worked on two approaches to evaluate the time evolution of seismic wavefields. An exact solution for the constant-velocity acoustic wave equation can be used to simulate the pressure response at any time. For a spatially varying velocity, a one-step method can be developed where no intermediate time responses are required. Using this approach, we have solved for the pressure response at intermediate times and have developed a recursive solution. The solution has a very high degree of accuracy and can be reduced to various finite-difference time-derivative methods, depending on the approximations used. Although the two approaches are closely related, each has advantages, depending on the problem being solved. © 2010 Society of Exploration Geophysicists.
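
    As an illustrative aside, a minimal 1D sketch of the exact constant-velocity time stepping that underlies this approach: in the wavenumber domain the acoustic wave equation gives p(t+dt) = 2 cos(v|k|dt) p(t) - p(t-dt). The grid and source below are made up, and the variable-velocity rapid-expansion machinery of the paper is not reproduced.

    ```python
    import numpy as np

    nx, dx, dt, v = 256, 10.0, 0.001, 2000.0      # hypothetical 1D grid (m, s, m/s)
    k = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)    # angular spatial wavenumbers

    p_prev = np.zeros(nx)
    p_cur = np.zeros(nx)
    p_cur[nx // 2] = 1.0                          # impulsive source at the center

    def step(p_cur, p_prev):
        """Exact constant-velocity update p(t+dt) = 2*cos(v*|k|*dt)*p(t) - p(t-dt),
        applied in the wavenumber domain (pseudospectral one-step operator)."""
        phat = np.fft.fft(p_cur)
        p_next = np.real(np.fft.ifft(2.0 * np.cos(v * np.abs(k) * dt) * phat)) - p_prev
        return p_next, p_cur

    for _ in range(500):                          # advance 0.5 s of propagation
        p_cur, p_prev = step(p_cur, p_prev)
    ```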

  4. Time evolution of the wave equation using rapid expansion method

    KAUST Repository

    Pestana, Reynam C.

    2010-07-01

    Forward modeling of seismic data and reverse time migration are based on the time evolution of wavefields. For the case of spatially varying velocity, we have worked on two approaches to evaluate the time evolution of seismic wavefields. An exact solution for the constant-velocity acoustic wave equation can be used to simulate the pressure response at any time. For a spatially varying velocity, a one-step method can be developed where no intermediate time responses are required. Using this approach, we have solved for the pressure response at intermediate times and have developed a recursive solution. The solution has a very high degree of accuracy and can be reduced to various finite-difference time-derivative methods, depending on the approximations used. Although the two approaches are closely related, each has advantages, depending on the problem being solved. © 2010 Society of Exploration Geophysicists.

  5. Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited)

    Science.gov (United States)

    Crowell, B. W.; Bock, Y.; Squibb, M. B.

    2010-12-01

    Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic), and because of this, are ideal for real-time monitoring of fault slip in the region. Real-time GPS networks provide the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks, but only measure accelerations or velocities, putting them at a supreme disadvantage for ascertaining the full extent of slip during a large earthquake in real-time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan’s GEONET consisting of about 1200 stations during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore Hokkaido Island and the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and utilized as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far away from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial size through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating using total displacement waveforms for real-time moment tensor inversions to look at spatiotemporal variations in slip.

  6. Rapid Reconstitution Packages (RRPs) implemented by integration of computational fluid dynamics (CFD) and 3D printed microfluidics.

    Science.gov (United States)

    Chi, Albert; Curi, Sebastian; Clayton, Kevin; Luciano, David; Klauber, Kameron; Alexander-Katz, Alfredo; D'hers, Sebastian; Elman, Noel M

    2014-08-01

    Rapid Reconstitution Packages (RRPs) are portable platforms that integrate microfluidics for rapid reconstitution of lyophilized drugs. Rapid reconstitution of lyophilized drugs using standard vials and syringes is an error-prone process. RRPs were designed using computational fluid dynamics (CFD) techniques to optimize fluidic structures for rapid mixing and integrating physical properties of targeted drugs and diluents. Devices were manufactured using stereo lithography 3D printing for micrometer structural precision and rapid prototyping. Tissue plasminogen activator (tPA) was selected as the initial model drug to test the RRPs as it is unstable in solution. tPA is a thrombolytic drug, stored in lyophilized form, required in emergency settings for which rapid reconstitution is of critical importance. RRP performance and drug stability were evaluated by high-performance liquid chromatography (HPLC) to characterize release kinetics. In addition, enzyme-linked immunosorbent assays (ELISAs) were performed to test for drug activity after the RRPs were exposed to various controlled temperature conditions. Experimental results showed that RRPs provided effective reconstitution of tPA that strongly correlated with CFD results. Simulation and experimental results show that release kinetics can be adjusted by tuning the device structural dimensions and diluent drug physical parameters. The design of RRPs can be tailored for a number of applications by taking into account physical parameters of the active pharmaceutical ingredients (APIs), excipients, and diluents. RRPs are portable platforms that can be utilized for reconstitution of emergency drugs in time-critical therapies.

  7. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    Science.gov (United States)

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  8. Rapid Response Teams: Is it Time to Reframe the Questions of Rapid Response Team Measurement?

    Science.gov (United States)

    Salvatierra, Gail G; Bindler, Ruth C; Daratha, Kenn B

    2016-11-01

    The purpose of this article is to present an overview of rapid response team (RRT) history in the United States, provide a review of prior RRT effectiveness research, and propose the reframing of four new questions of RRT measurement that are designed to better understand RRTs in the context of contemporary nursing practice as well as patient outcomes. RRTs were adopted in the United States because of their intuitive appeal, and despite a lack of evidence for their effectiveness. Subsequent studies used mortality and cardiac arrest rates to measure whether or not RRTs "work." Few studies have thoroughly examined the effect of RRTs on nurses and on nursing practice. An extensive literature review provided the background. Suppositions and four critical, unanswered questions arising from the literature are suggested. The results of RRT effectiveness, which have focused on patient-oriented outcomes, have been ambiguous, contradictory, and difficult to interpret. Additionally, they have not taken into account the multiple ways in which these teams have impacted nurses and nursing practice as well as patient outcomes. What happens in terms of RRT process and utilization is likely to have a major impact on nurses and nursing care on general medical and surgical wards. What that impact will be depends on what we can learn from measuring with an expanded yardstick, in order to answer the question, "Do RRTs work?" Evidence for the benefits of RRTs depends on proper framing of questions relating to their effectiveness, including the multiple ways RRTs contribute to nursing efficacy. © 2016 Sigma Theta Tau International.

  9. Real-time earthquake monitoring: Early warning and rapid response

    Science.gov (United States)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  10. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  11. Real-time data-intensive computing

    Energy Technology Data Exchange (ETDEWEB)

    Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander; MacDowell, Alastair A.; Padmore, Howard A.; Shapiro, David; Tamura, Nobumichi [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Beattie, Keith; Krishnan, Harinarayan; Patton, Simon J.; Perciano, Talita; Stromsness, Rune; Tull, Craig E.; Ushizima, Daniela [Computational Research Division, Lawrence Berkeley National Laboratory Berkeley CA 94720 (United States); Correa, Joaquin; Deslippe, Jack R. [National Energy Research Scientific Computing Center, Berkeley, CA 94720 (United States); Dart, Eli; Tierney, Brian L. [Energy Sciences Network, Berkeley, CA 94720 (United States); Daurer, Benedikt J.; Maia, Filipe R. N. C. [Uppsala University, Uppsala (Sweden); and others

    2016-07-27

    Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.

  12. Real time computer system with distributed microprocessors

    International Nuclear Information System (INIS)

    Heger, D.; Steusloff, H.; Syrbe, M.

    1979-01-01

    The usual centralized structure of computer systems, especially of process computer systems, cannot take sufficient advantage of the progress of very-large-scale integrated semiconductor technology with respect to increasing reliability and performance and to decreasing expenses, especially for the external periphery. This, together with the increasing demands on process control systems, has led the authors to examine the structure of such systems in general and to adapt it to the new surroundings. Computer systems with distributed, optical-fibre-coupled microprocessors allow very favourable problem-solving, with decentrally controlled bus lines and functional redundancy with automatic fault diagnosis and reconfiguration. A suitable programming system supports these hardware properties: PEARL for multicomputer systems, a dynamic loader, and processor and network operating systems. The necessary design principles are proved mainly theoretically and by value analysis. An optimal overall system of this new generation of process control systems was established, supported by results of 2 PDV projects (modular operating systems, input/output colour screen system as control panel), and tested by applying the system to the control of 28 pit furnaces of a steelworks. (orig.) [de]

  13. Spying on real-time computers to improve performance

    International Nuclear Information System (INIS)

    Taff, L.M.

    1975-01-01

    The sampled program-counter histogram, an established technique for shortening the execution times of programs, is described for a real-time computer. The use of a real-time clock allows particularly easy implementation. (Auth.)

  14. Computation of a long-time evolution in a Schroedinger system

    International Nuclear Information System (INIS)

    Girard, R.; Kroeger, H.; Labelle, P.; Bajzer, Z.

    1988-01-01

    We compare different techniques for the computation of a long-time evolution and the S matrix in a Schroedinger system. As an application we consider a two-nucleon system interacting via the Yamaguchi potential. We suggest computation of the time evolution for a very short time using Pade approximants, the long-time evolution being obtained by iterative squaring. Within the technique of strong approximation of Moller wave operators (SAM) we compare our calculation with computation of the time evolution in the eigenrepresentation of the Hamiltonian and with the standard Lippmann-Schwinger solution for the S matrix. We find numerical agreement between these alternative methods for time-evolution computation up to half the number of digits of internal machine precision, and fairly rapid convergence of both techniques towards the Lippmann-Schwinger solution
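
    As an illustrative aside, the "short-time propagator plus iterative squaring" idea can be sketched as follows: U(dt) is built once (here with scipy.linalg.expm, which internally uses a Pade approximant with scaling and squaring) and then squared repeatedly, since U(2^n dt) = U(dt)^(2^n). The 2x2 Hamiltonian is a made-up example, not the Yamaguchi-potential system of the paper.

    ```python
    import numpy as np
    from scipy.linalg import expm

    H = np.array([[1.0, 0.3],
                  [0.3, 2.0]])                 # hypothetical 2x2 Hamiltonian (hbar = 1)
    dt = 1e-3

    U = expm(-1j * H * dt)                     # short-time propagator (Pade-based)
    n_doublings = 10
    for _ in range(n_doublings):               # U(2^n * dt) = (U(dt))**(2^n)
        U = U @ U

    t_long = dt * 2 ** n_doublings
    U_exact = expm(-1j * H * t_long)
    print("max deviation from direct expm:", np.max(np.abs(U - U_exact)))
    ```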

  15. Effectiveness Analysis of a Part-Time Rapid Response System During Operation Versus Nonoperation.

    Science.gov (United States)

    Kim, Youlim; Lee, Dong Seon; Min, Hyunju; Choi, Yun Young; Lee, Eun Young; Song, Inae; Park, Jong Sun; Cho, Young-Jae; Jo, You Hwan; Yoon, Ho Il; Lee, Jae Ho; Lee, Choon-Taek; Do, Sang Hwan; Lee, Yeon Joo

    2017-06-01

    To evaluate the effect of a part-time rapid response system on the occurrence rate of cardiopulmonary arrest by comparing the times of rapid response system operation versus nonoperation. Retrospective cohort study. A 1,360-bed tertiary care hospital. Adult patients admitted to the general ward were screened. Data were collected over 36 months from rapid response system implementation (October 2012 to September 2015) and more than 45 months before rapid response system implementation (January 2009 to September 2012). None. The rapid response system operates from 7 AM to 10 PM on weekdays and from 7 AM to 12 PM on Saturdays. Primary outcomes were the difference in cardiopulmonary arrest incidence between the pre-rapid response system and post-rapid response system periods and whether the rapid response system operating time affects the cardiopulmonary arrest incidence. The overall cardiopulmonary arrest incidence (per 1,000 admissions) was 1.43. Although the number of admissions per month and the case-mix index increased (3,555.18 vs 4,564.72 admissions per month), the cardiopulmonary arrest incidence decreased during rapid response system operating times (0.82 vs 0.49/1,000 admissions; p = 0.001) but remained similar during rapid response system nonoperating times (0.77 vs 0.73/1,000 admissions; p = 0.729). The implementation of a part-time rapid response system reduced the cardiopulmonary arrest incidence, based on the reduction of cardiopulmonary arrest during rapid response system operating times. Further analysis of the cost effectiveness of a part-time rapid response system is needed.

  16. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report is Volume 2 of the three volume documentation of the Seismic Module of CARES and represents the User's Manual. 14 refs

  17. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report represents Volume 3 of the three-volume documentation of the Seismic Module of CARES. It presents three sample problems typically encountered in soil-structure interaction analyses. 14 refs., 36 figs., 2 tabs

  18. CARES (Computer Analysis for Rapid Evaluation of Structures) Version 1.0, seismic module

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulas, A.J.; Miller, C.A.; Costantino, C.J.

    1990-07-01

    During FY's 1988 and 1989, Brookhaven National Laboratory (BNL) developed the CARES system (Computer Analysis for Rapid Evaluation of Structures) for the US Nuclear Regulatory Commission (NRC). CARES is a PC software system which has been designed to perform structural response computations similar to those encountered in licensing reviews of nuclear power plant structures. The documentation of the Seismic Module of CARES consists of three volumes. This report represents Volume 1 of the three volume documentation of the Seismic Module of CARES. It concentrates on the theoretical basis of the system and presents modeling assumptions and limitations as well as solution schemes and algorithms of CARES. 31 refs., 6 figs

  19. Real-time patient survey data during routine clinical activities for rapid-cycle quality improvement.

    Science.gov (United States)

    Wofford, James Lucius; Campos, Claudia L; Jones, Robert E; Stevens, Sheila F

    2015-03-12

    Surveying patients is increasingly important for evaluating and improving health care delivery, but practical survey strategies during routine care activities have not been available. We examined the feasibility of conducting routine patient surveys in a primary care clinic using commercially available technology (Web-based survey creation, deployment on tablet computers, cloud-based management of survey data) to expedite and enhance several steps in data collection and management for rapid quality improvement cycles. We used a Web-based data management tool (survey creation, deployment on tablet computers, real-time data accumulation and display of survey results) to conduct four patient surveys during routine clinic sessions over a one-month period. Each survey consisted of three questions and focused on a specific patient care domain (dental care, waiting room experience, care access/continuity, Internet connectivity). Of the 727 available patients during clinic survey days, 316 patients (43.4%) attempted the survey, and 293 (40.3%) completed the survey. For the four 3-question surveys, the overall average time per survey was 40.4 seconds, with a range of 5.4 to 20.3 seconds for individual questions. Yes/No questions took less time than multiple choice questions (average 9.6 seconds versus 14.0 seconds). Average response time showed no clear pattern by question order or proctor strategy, but increased monotonically with the number of words in the question (8.0, 11.8, and 16.8 seconds for progressively longer questions). This technology-enabled data management system helped capture patient opinions and accelerate the turnaround of survey data, with minimal impact on a busy primary care clinic. This new model of patient survey data management is feasible and sustainable in a busy office setting, supports and engages clinicians in the quality improvement process, and harmonizes with the vision of a learning health care system.

  20. Rapid tomographic reconstruction based on machine learning for time-resolved combustion diagnostics

    Science.gov (United States)

    Yu, Tao; Cai, Weiwei; Liu, Yingzheng

    2018-04-01

    Optical tomography has recently attracted a surge of research effort due to progress in both imaging concepts and sensor and laser technologies. The high spatial and temporal resolutions achievable by these methods provide an unprecedented opportunity for diagnosis of complicated turbulent combustion. However, due to the high data throughput and the inefficiency of the prevailing iterative methods, the tomographic reconstructions, which are typically conducted off-line, are computationally formidable. In this work, we propose an efficient inversion method based on a machine learning algorithm, which can extract useful information from previous reconstructions and build efficient neural networks to serve as a surrogate model that rapidly predicts the reconstructions. The extreme learning machine is used here as an example for demonstration purposes due to its ease of implementation, fast learning speed, and good generalization performance. Extensive numerical studies were performed, and the results show that the new method can dramatically reduce the computational time compared with the classical iterative methods. This technique is expected to be an alternative to existing methods when sufficient training data are available. Although this work is discussed under the context of tomographic absorption spectroscopy, we expect it to be useful also to other high speed tomographic modalities such as volumetric laser-induced fluorescence and tomographic laser-induced incandescence which have been demonstrated for combustion diagnostics.
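    As an illustrative sketch only (not taken from the record above), the core of an extreme-learning-machine surrogate is a random, fixed hidden layer whose output weights are solved by least squares, trained on pairs of projection data and previously computed reconstructions. All array shapes and variable names below are hypothetical.

```python
import numpy as np

# Minimal extreme learning machine (ELM) regression sketch: random hidden layer,
# output weights solved by least squares. Inputs stand in for projection data,
# targets for reconstructions produced earlier by an iterative solver.
rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=512):
    """X: (n_samples, n_measurements); Y: (n_samples, n_pixels)."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (kept fixed)
    b = rng.standard_normal(n_hidden)                # random biases (kept fixed)
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)     # output weights by least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta                 # fast surrogate "reconstruction"

# Toy usage with synthetic placeholders for training data.
X_train = rng.random((200, 64))        # 200 projection sets, 64 measurements each
Y_train = rng.random((200, 32 * 32))   # matching 32x32 reconstructions
W, b, beta = elm_train(X_train, Y_train)
print(elm_predict(rng.random((1, 64)), W, b, beta).shape)  # (1, 1024)
```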

  1. Critical capacity, travel time delays and travel time distribution of rapid mass transit systems

    Science.gov (United States)

    Legara, Erika Fille; Monterola, Christopher; Lee, Kee Khoon; Hung, Gih Guang

    2014-07-01

    We set up a mechanistic agent-based model of a rapid mass transit system (RTS). Using empirical data from Singapore's anonymized smart fare card records, we validate our model by reconstructing actual travel demand and duration-of-travel statistics. We subsequently use this model to investigate two phenomena that are known to significantly affect the dynamics within the RTS: (1) overloading in trains and (2) overcrowding on the RTS platform. We demonstrate that by varying the loading capacity of trains, a tipping point emerges at which an exponential increase in the duration of travel time delays is observed. We also probe the impact on the rail system dynamics of three types of passenger growth distribution across stations: (i) Dirac delta, (ii) uniform and (iii) geometric, which is reminiscent of the effect of land use on transport. Under the assumption of a fixed loading capacity, we demonstrate the dependence of a given origin-destination (OD) pair on the flow volume of commuters in station platforms.

  2. 29 CFR 1921.22 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Computation of time. 1921.22 Section 1921.22 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR... WORKERS' COMPENSATION ACT Miscellaneous § 1921.22 Computation of time. Sundays and holidays shall be...

  3. 43 CFR 45.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false How are time periods computed? 45.3... IN FERC HYDROPOWER LICENSES General Provisions § 45.3 How are time periods computed? (a) General... run is not included. (2) The last day of the period is included. (i) If that day is a Saturday, Sunday...

  4. 5 CFR 890.101 - Definitions; time computations.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Definitions; time computations. 890.101....101 Definitions; time computations. (a) In this part, the terms annuitant, carrier, employee, employee... in section 8901 of title 5, United States Code, and supplement the following definitions: Appropriate...

  5. Time computations in anuran auditory systems

    Directory of Open Access Journals (Sweden)

    Gary J Rose

    2014-05-01

    Full Text Available Temporal computations are important in the acoustic communication of anurans. In many cases, calls between closely related species are nearly identical spectrally but differ markedly in temporal structure. Depending on the species, calls can differ in pulse duration, shape and/or rate (i.e., amplitude modulation), direction and rate of frequency modulation, and overall call duration. Also, behavioral studies have shown that anurans are able to discriminate between calls that differ in temporal structure. In the peripheral auditory system, temporal information is coded primarily in the spatiotemporal patterns of activity of auditory-nerve fibers. However, major transformations in the representation of temporal information occur in the central auditory system. In this review I summarize recent advances in understanding how temporal information is represented in the anuran midbrain, with particular emphasis on mechanisms that underlie selectivity for pulse duration and pulse rate (i.e., intervals between onsets of successive pulses). Two types of neurons have been identified that show selectivity for pulse rate: long-interval cells respond well to slow pulse rates but fail to spike or respond phasically to fast pulse rates; conversely, interval-counting neurons respond to intermediate or fast pulse rates, but only after a threshold number of pulses, presented at optimal intervals, have occurred. Duration-selectivity is manifest as short-pass, band-pass or long-pass tuning. Whole-cell patch recordings, in vivo, suggest that excitation and inhibition are integrated in diverse ways to generate temporal selectivity. In many cases, activity-related enhancement or depression of excitatory or inhibitory processes appear to contribute to selective responses.
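    As a purely illustrative aside (not part of the record above, and not the authors' biophysical model), the interval-counting behavior described can be caricatured by a rule that produces a response only after a threshold number of consecutive inter-pulse intervals fall within an optimal window, resetting on any off-interval. All parameter values below are hypothetical.

```python
# Toy rate-selectivity rule for an "interval-counting" cell: a spike is produced
# only after `threshold` consecutive inter-pulse intervals fall inside the
# optimal window; a single off-interval resets the count.

def interval_counting_response(pulse_times_ms, optimal=(18.0, 22.0), threshold=4):
    count, spikes = 0, []
    for prev, cur in zip(pulse_times_ms, pulse_times_ms[1:]):
        interval = cur - prev
        count = count + 1 if optimal[0] <= interval <= optimal[1] else 0
        if count >= threshold:
            spikes.append(cur)
    return spikes

fast_train = [i * 20.0 for i in range(10)]    # 50 pulses/s: optimal interval
slow_train = [i * 80.0 for i in range(10)]    # 12.5 pulses/s: too slow
print(len(interval_counting_response(fast_train)), len(interval_counting_response(slow_train)))
# -> 6 0  (responds only once enough optimal intervals have accumulated)
```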

  6. Rapid expansion and pseudo spectral implementation for reverse time migration in VTI media

    KAUST Repository

    Pestana, Reynam C; Ursin, Bjørn; Stoffa, Paul L

    2012-01-01

    In isotropic media, we use the scalar acoustic wave equation to perform reverse time migration (RTM) of the recorded pressure wavefield data. In anisotropic media, P- and SV-waves are coupled, and the elastic wave equation should be used for RTM. For computational efficiency, a pseudo-acoustic wave equation is often used. This may be solved using a coupled system of second-order partial differential equations. We solve these using a pseudo spectral method and the rapid expansion method (REM) for the explicit time marching. This method generates a degenerate SV-wave in addition to the P-wave arrivals of interest. To avoid this problem, the elastic wave equation for vertical transversely isotropic (VTI) media can be split into separate wave equations for P- and SV-waves. These separate wave equations are stable, and they can be effectively used to model and migrate seismic data in VTI media where |ε- δ| is small. The artifact for the SV-wave has also been removed. The independent pseudo-differential wave equations can be solved one for each mode using the pseudo spectral method for the spatial derivatives and the REM for the explicit time advance of the wavefield. We show numerically stable and high-resolution modeling and RTM results for the pure P-wave mode in VTI media. © 2012 Sinopec Geophysical Research Institute.
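    As an illustrative sketch (not code from the record above), the pseudo spectral evaluation of spatial derivatives mentioned here amounts to transforming to wavenumber space, multiplying by the appropriate power of ik, and transforming back; the rapid expansion method (REM) time marching itself is not reproduced.

```python
import numpy as np

# Pseudo-spectral second derivative on a periodic grid: transform, multiply by
# (ik)^2, inverse transform. This is only the spatial-derivative step; the REM
# time advance is not shown.

def d2_dx2(f, dx):
    n = f.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)   # angular wavenumbers
    return np.real(np.fft.ifft(-(k ** 2) * np.fft.fft(f)))

# Check against an analytic case: f = sin(x), f'' = -sin(x).
x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
err = np.max(np.abs(d2_dx2(np.sin(x), x[1] - x[0]) + np.sin(x)))
print(f"max error: {err:.2e}")   # close to machine precision
```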

  7. Rapid expansion and pseudo spectral implementation for reverse time migration in VTI media

    KAUST Repository

    Pestana, Reynam C

    2012-04-24

    In isotropic media, we use the scalar acoustic wave equation to perform reverse time migration (RTM) of the recorded pressure wavefield data. In anisotropic media, P- and SV-waves are coupled, and the elastic wave equation should be used for RTM. For computational efficiency, a pseudo-acoustic wave equation is often used. This may be solved using a coupled system of second-order partial differential equations. We solve these using a pseudo spectral method and the rapid expansion method (REM) for the explicit time marching. This method generates a degenerate SV-wave in addition to the P-wave arrivals of interest. To avoid this problem, the elastic wave equation for vertical transversely isotropic (VTI) media can be split into separate wave equations for P- and SV-waves. These separate wave equations are stable, and they can be effectively used to model and migrate seismic data in VTI media where |ε- δ| is small. The artifact for the SV-wave has also been removed. The independent pseudo-differential wave equations can be solved one for each mode using the pseudo spectral method for the spatial derivatives and the REM for the explicit time advance of the wavefield. We show numerically stable and high-resolution modeling and RTM results for the pure P-wave mode in VTI media. © 2012 Sinopec Geophysical Research Institute.

  8. Advanced real time radioscopy and computed tomography

    International Nuclear Information System (INIS)

    Sauerwein, Ch.; Nuding, W.; Grimm, R.; Wiacker, H.

    1996-01-01

    The paper describes three x-ray inspection systems. One radioscopic system is designed for the inspection of castings. The next integrates a radioscopic and a tomographic mode. The radioscopy has a high resolution camera and real time image processor. Radiation sources are a 450 kV industrial and a 200 kV microfocus tube. The third system is a tomographic system with 30 scintillation detectors for the inspection of nuclear waste containers. (author)

  9. Real time computer controlled weld skate

    Science.gov (United States)

    Wall, W. A., Jr.

    1977-01-01

    A real-time, adaptive-control, automatic welding system was developed. This system utilizes the general-case geometrical relationships between a weldment and a weld skate to precisely maintain constant weld speed and torch angle along a contoured workpiece. The system is compatible with the gas tungsten arc weld process or can be adapted to other weld processes. Heli-arc cutting and machine tool routing operations are possible applications.

  10. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    National Research Council Canada - National Science Library

    Skjellum, Anthony

    2004-01-01

    Polymorphous Computing Architectures (PCA) rapidly "morph" (reorganize) software and hardware configurations in order to achieve high performance on computation styles ranging from specialized streaming to general threaded applications...

  11. Development of rapid methods for relaxation time mapping and motion estimation using magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

    Gilani, Syed Irtiza Ali

    2008-09-15

    Recent technological developments in the field of magnetic resonance imaging have resulted in advanced techniques that can reduce the total time to acquire images. For applications such as relaxation time mapping, which enables improved visualisation of in vivo structures, rapid imaging techniques are highly desirable. TAPIR is a Look-Locker-based sequence for high-resolution, multislice T1 relaxation time mapping. Despite the high accuracy and precision of TAPIR, an improvement in the k-space sampling trajectory is desired to acquire data in clinically acceptable times. In this thesis, a new trajectory, termed line-sharing, is introduced for TAPIR that can potentially reduce the acquisition time by 40 %. Additionally, the line-sharing method was compared with the GRAPPA parallel imaging method. These methods were employed to reconstruct time-point images from the data acquired on a 4T high-field MR research scanner. Multislice, multipoint in vivo results obtained using these methods are presented. Despite improvement in acquisition speed, through line-sharing, for example, motion remains a problem and artefact-free data cannot always be obtained. Therefore, in this thesis, a rapid technique is introduced to estimate in-plane motion. The presented technique is based on calculating the in-plane motion parameters, i.e., translation and rotation, by registering the low-resolution MR images. The rotation estimation method is based on the pseudo-polar FFT, where the Fourier domain is composed of frequencies that reside in an oversampled set of non-angularly, equispaced points. The essence of the method is that unlike other Fourier-based registration schemes, the employed approach does not require any interpolation to calculate the pseudo-polar FFT grid coordinates. Translation parameters are estimated by the phase correlation method. However, instead of two-dimensional analysis of the phase correlation matrix, a low complexity subspace identification of the phase
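    As an illustrative sketch (not from the thesis itself), the phase correlation step for translation estimation works as follows: the normalized cross-power spectrum of two images differing by a shift has an inverse FFT that peaks at that shift. The image sizes and test shift below are arbitrary.

```python
import numpy as np

# Phase correlation: the normalized cross-power spectrum of two images that
# differ by a translation has an inverse FFT that peaks at the shift.

def phase_correlation(a, b):
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = A * np.conj(B)
    R /= np.maximum(np.abs(R), 1e-12)          # keep phase only
    corr = np.real(np.fft.ifft2(R))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts (circular convention).
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(1)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(5, -3), axis=(0, 1))
print(phase_correlation(shifted, img))   # -> (5, -3)
```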

  12. Development of rapid methods for relaxation time mapping and motion estimation using magnetic resonance imaging

    International Nuclear Information System (INIS)

    Gilani, Syed Irtiza Ali

    2008-09-01

    Recent technological developments in the field of magnetic resonance imaging have resulted in advanced techniques that can reduce the total time to acquire images. For applications such as relaxation time mapping, which enables improved visualisation of in vivo structures, rapid imaging techniques are highly desirable. TAPIR is a Look-Locker-based sequence for high-resolution, multislice T1 relaxation time mapping. Despite the high accuracy and precision of TAPIR, an improvement in the k-space sampling trajectory is desired to acquire data in clinically acceptable times. In this thesis, a new trajectory, termed line-sharing, is introduced for TAPIR that can potentially reduce the acquisition time by 40 %. Additionally, the line-sharing method was compared with the GRAPPA parallel imaging method. These methods were employed to reconstruct time-point images from the data acquired on a 4T high-field MR research scanner. Multislice, multipoint in vivo results obtained using these methods are presented. Despite improvement in acquisition speed, through line-sharing, for example, motion remains a problem and artefact-free data cannot always be obtained. Therefore, in this thesis, a rapid technique is introduced to estimate in-plane motion. The presented technique is based on calculating the in-plane motion parameters, i.e., translation and rotation, by registering the low-resolution MR images. The rotation estimation method is based on the pseudo-polar FFT, where the Fourier domain is composed of frequencies that reside in an oversampled set of non-angularly, equispaced points. The essence of the method is that unlike other Fourier-based registration schemes, the employed approach does not require any interpolation to calculate the pseudo-polar FFT grid coordinates. Translation parameters are estimated by the phase correlation method. However, instead of two-dimensional analysis of the phase correlation matrix, a low complexity subspace identification of the phase

  13. A computer-controlled system for rapid soil analysis of 226Ra

    International Nuclear Information System (INIS)

    Doane, R.W.; Berven, B.A.; Blair, M.S.

    1984-01-01

    A computer-controlled multichannel analysis system has been developed by the Radiological Survey Activities (RASA) Group at Oak Ridge National Laboratory (ORNL) for the Department of Energy (DOE) in support of the DOE's remedial action programs. The purpose of this system is to provide a rapid estimate of the 226Ra concentration in soil samples using a 6 x 9 inch NaI(Tl) crystal containing a 3.25 inch deep by 3.5 inch diameter well. This gamma detection system is controlled by a minicomputer with a dual floppy disk storage medium, line printer, and optional X-Y plotter. A two-chip interface was also designed at ORNL which handles all control signals generated from the computer keyboard. These computer-generated control signals are processed in machine language for rapid data transfer, and BASIC language is used for data processing. The computer system is a Commodore Business Machines (CBM) Model 8032 personal computer with CBM peripherals. Control and data signals are utilized via the parallel user's port to the interface unit. The analog-to-digital converter (ADC) is controlled in machine language, bootstrapped to high memory, and is addressed through the BASIC program. The BASIC program is designed to be ''user friendly'' and provides the operator with several modes of operation such as background and analysis acquisition. Any number of energy regions-of-interest (ROI) may be analyzed with automatic background subtraction. Also employed in the BASIC program are the 226Ra algorithms which utilize linear and polynomial regression equations for data conversion and look-up tables for radon equilibrating coefficients. The optional X-Y plotter may be used with two- or three-dimensional curve programs to enhance data analysis and presentation. A description of the system is presented and typical applications are discussed
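    As an illustrative sketch (in Python rather than the BASIC of the original system), region-of-interest analysis with automatic background subtraction reduces to summing channels in the ROI, scaling the background spectrum by the ratio of live times, and converting net counts to activity. The efficiency, equilibrium factor, and channel numbers below are placeholders, not values from the record.

```python
# Sketch of region-of-interest (ROI) analysis with automatic background
# subtraction. The calibration inputs converting net counts to activity are
# assumed placeholders.

def net_roi_counts(spectrum, background, roi, live_time_s, bkg_live_time_s):
    """spectrum/background: lists of channel counts; roi: (first, last) channel."""
    lo, hi = roi
    gross = sum(spectrum[lo:hi + 1])
    bkg = sum(background[lo:hi + 1]) * (live_time_s / bkg_live_time_s)
    return gross - bkg

def ra226_activity(net_counts, live_time_s, efficiency, equilibrium_factor):
    """Bq estimate from net counts; efficiency and equilibrium factor are assumed inputs."""
    return net_counts / (live_time_s * efficiency * equilibrium_factor)

# Toy numbers only.
spec = [10] * 1024
bkg = [8] * 1024
net = net_roi_counts(spec, bkg, roi=(580, 640), live_time_s=600, bkg_live_time_s=3600)
print(ra226_activity(net, live_time_s=600, efficiency=0.25, equilibrium_factor=0.9))
```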

  14. Preparing printed circuit boards for rapid turn-around time on a protomat plotter

    International Nuclear Information System (INIS)

    Hawtree, J.

    1998-01-01

    This document describes the use of the LPKF ProtoMat mill/drill unit circuit board Plotter, with the associated CAD/CAM software BoardMaster and CircuitCAM. At present its primary use here at Fermilab's Particle Physics Department is for rapid turnaround of prototype double-sided and single-sided copper clad printed circuit boards (PCBs). (The plotter is also capable of producing gravure films and engraving aluminum or plastic, although we have not used it for this.) It has the capability of making traces 0.004 inch wide with 0.004 inch spacings, which is appropriate for high density surface mount circuits as well as other through-mounted discrete and integrated components. One of the primary benefits of the plotter is the capability to produce double-sided drilled boards from CAD files in a few hours. However, to achieve this rapid turn-around time, some care must be taken in preparing the files. This document describes how to optimize the process of PCB fabrication. With proper preparation, researchers can often have a completed circuit board in a day's time instead of the week or two wait with usual procedures. It is assumed that the software and hardware are properly installed and that the machinist is acquainted with the Win95 operating system and the basics of the associated software. This paper does not describe its use with pen plotters, lasers or rubouts. The process of creating a PCB (printed circuit board) begins with the CAD (computer-aided design) software, usually PCAD or VeriBest. These files are then moved to CAM (computer-aided machining) software, where they are edited and converted into the proper format for running on the ProtoMat plotter. The plotter then performs the actual machining of the board. This document concentrates on the LPKF programs CircuitCam BASIS and BoardMaster for the CAM software. These programs run on a Windows 95 platform to run an LPKF ProtoMat 93s plotter

  15. Original Article. Evaluation of Rapid Detection of Nasopharyngeal Colonization with MRSA by Real-Time PCR

    Directory of Open Access Journals (Sweden)

    Kang Feng-feng

    2012-03-01

    Full Text Available Objective To investigate the clinical application of real-time PCR for rapid detection of methicillin-resistant Staphylococcus aureus (MRSA) directly from nasopharyngeal swab specimens.

  16. Rapid phenotyping of crop root systems in undisturbed field soils using X-ray computed tomography.

    Science.gov (United States)

    Pfeifer, Johannes; Kirchgessner, Norbert; Colombi, Tino; Walter, Achim

    2015-01-01

    X-ray computed tomography (CT) has become a powerful tool for root phenotyping. Compared to rather classical, destructive methods, CT offers various advantages. In pot experiments the growth and development of the same individual root can be followed over time, and in addition the unaltered configuration of the 3D root system architecture (RSA) interacting with a real field soil matrix can be studied. Yet the throughput, which is essential for a more widespread application of CT in basic research or breeding programs, suffers from the lack of rapid and standardized segmentation methods to extract root structures. With available methods, root segmentation is done to a large extent manually, as it requires extensive interactive parameter optimization and interpretation and therefore a lot of time. Based on commercially available software, this paper presents a protocol that is faster, more standardized and more versatile compared to existing segmentation methods, particularly if used to analyse field samples collected in situ. To the knowledge of the authors, this is the first study attempting to develop a comprehensive segmentation method suitable for comparatively large columns sampled in situ which contain complex, not necessarily connected root systems from multiple plants grown in undisturbed field soil. Root systems from several crops were sampled in situ and CT-volumes determined with the presented method were compared to root dry matter of washed root samples. A highly significant (P < 0.01) and strong correlation (R² = 0.84) was found, demonstrating the value of the presented method in the context of field research. Subsequent to segmentation, a method for the measurement of root thickness distribution has been used. Root thickness is a central RSA trait for various physiological research questions such as root growth in compacted soil or under oxygen deficient soil conditions, but hardly assessable in high throughput until today, due

  17. Recent achievements in real-time computational seismology in Taiwan

    Science.gov (United States)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

    Real-time computational seismology is now achievable, but it requires a tight connection between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and a moment tensor inversion (CMT) technique. The real-time online earthquake simulation service (ROS) is also ready to be opened to researchers and to public earthquake science education. Combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes after an earthquake occurs: RMT obtains the point-source information and ROS completes a 3D simulation in real time. For more information, please visit the real-time computational seismology (RCS) earthquake report webpage.

  18. Time-of-Flight Cameras in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2010-01-01

    Computer Graphics, Computer Vision and Human Machine Interaction (HMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real-time geometry...

  19. 29 CFR 4245.8 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Computation of time. 4245.8 Section 4245.8 Labor Regulations Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION INSOLVENCY, REORGANIZATION, TERMINATION, AND OTHER RULES APPLICABLE TO MULTIEMPLOYER PLANS NOTICE OF INSOLVENCY § 4245.8 Computation of...

  20. Noise-constrained switching times for heteroclinic computing

    Science.gov (United States)

    Neves, Fabio Schittler; Voit, Maximilian; Timme, Marc

    2017-03-01

    Heteroclinic computing offers a novel paradigm for universal computation by collective system dynamics. In such a paradigm, input signals are encoded as complex periodic orbits approaching specific sequences of saddle states. Without inputs, the relevant states together with the heteroclinic connections between them form a network of states—the heteroclinic network. Systems of pulse-coupled oscillators or spiking neurons naturally exhibit such heteroclinic networks of saddles, thereby providing a substrate for general analog computations. Several challenges need to be resolved before it becomes possible to effectively realize heteroclinic computing in hardware. The time scales on which computations are performed crucially depend on the switching times between saddles, which in turn are jointly controlled by the system's intrinsic dynamics and the level of external and measurement noise. The nonlinear dynamics of pulse-coupled systems often strongly deviate from that of time-continuously coupled (e.g., phase-coupled) systems. The factors impacting switching times in pulse-coupled systems are still not well understood. Here we systematically investigate switching times in dependence of the levels of noise and intrinsic dissipation in the system. We specifically reveal how local responses to pulses coact with external noise. Our findings confirm that, like in time-continuous phase-coupled systems, piecewise-continuous pulse-coupled systems exhibit switching times that transiently increase exponentially with the number of switches up to some order of magnitude set by the noise level. Complementarily, we show that switching times may constitute a good predictor for the computation reliability, indicating how often an input signal must be reiterated. By characterizing switching times between two saddles in conjunction with the reliability of a computation, our results provide a first step beyond the coding of input signal identities toward a complementary coding for

  1. The use of real-time polymerase chain reaction for rapid diagnosis of skeletal tuberculosis.

    Science.gov (United States)

    Kobayashi, Naomi; Fraser, Thomas G; Bauer, Thomas W; Joyce, Michael J; Hall, Gerri S; Tuohy, Marion J; Procop, Gary W

    2006-07-01

    We identified Mycobacterium tuberculosis DNA using real-time polymerase chain reaction on a specimen from an osteolytic lesion of a femoral condyle, in which the frozen section demonstrated granulomas. The process was much more rapid than is possible with culture. The rapid detection of M tuberculosis and the concomitant exclusion of granulomatous disease caused by nontuberculous mycobacteria or systemic fungi are necessary to appropriately treat skeletal tuberculosis. The detection and identification of M tuberculosis by culture may require several weeks using traditional methods. The real-time polymerase chain reaction method used has been shown to be rapid and reliable, and is able to detect and differentiate both tuberculous and nontuberculous mycobacteria. Real-time polymerase chain reaction may become a diagnostic standard for the evaluation of clinical specimens for the presence of mycobacteria; this case demonstrates the potential utility of this assay for the rapid diagnosis of skeletal tuberculosis.

  2. Relativistic Photoionization Computations with the Time Dependent Dirac Equation

    Science.gov (United States)

    2016-10-12

    Report documentation page (partially legible OCR): Relativistic Photoionization Computations with the Time Dependent Dirac Equation. Daniel F. Gordon and Bahman Hafizi, Naval Research Laboratory, 4555 Overlook Avenue, SW, Washington, DC 20375-5320. Report NRL/MR/6795--16-9698; 22 pages; unclassified/unlimited distribution. Subject terms: tunneling, photoionization. Ionization of inner shell electrons by laser

  3. Reading Time Allocation Strategies and Working Memory Using Rapid Serial Visual Presentation

    Science.gov (United States)

    Busler, Jessica N.; Lazarte, Alejandro A.

    2017-01-01

    Rapid serial visual presentation (RSVP) is a useful method for controlling the timing of text presentations and studying how readers' characteristics, such as working memory (WM) and reading strategies for time allocation, influence text recall. In the current study, a modified version of RSVP (Moving Window RSVP [MW-RSVP]) was used to induce…

  4. A rapid and direct real time PCR-based method for identification of Salmonella spp

    DEFF Research Database (Denmark)

    Rodriguez-Lazaro, D.; Hernández, Marta; Esteve, T.

    2003-01-01

    The aim of this work was the validation of a rapid, real-time PCR assay based on TaqMan® technology for the unequivocal identification of Salmonella spp. to be used directly on an agar-grown colony. A real-time PCR system targeting the Salmonella spp. invA gene was optimized and validated ...

  5. A Distributed Computing Network for Real-Time Systems.

    Science.gov (United States)

    1980-11-03

    Report documentation page (partially legible OCR): A Distributed Computing Network for Real-Time Systems. Author: Gordon E. Morson (name partially legible). Naval Underwater Systems Center, Newport, RI. Technical Document TD 5932, November 1980; unclassified.

  6. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty-five years ago Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in a fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  7. OVERVIEW OF DEVELOPMENT OF P-CARES: PROBABILISTIC COMPUTER ANALYSIS FOR RAPID EVALUATION OF STRUCTURES

    International Nuclear Information System (INIS)

    NIE, J.; XU, J.; COSTANTINO, C.; THOMAS, V.

    2007-01-01

    Brookhaven National Laboratory (BNL) undertook an effort to revise the CARES (Computer Analysis for Rapid Evaluation of Structures) program under the auspices of the US Nuclear Regulatory Commission (NRC). The CARES program provided the NRC staff a capability to quickly check the validity and/or accuracy of the soil-structure interaction (SSI) models and associated data received from various applicants. The aim of the current revision was to implement various probabilistic simulation algorithms in CARES (referred hereinafter as P-CARES [1]) for performing the probabilistic site response and soil-structure interaction (SSI) analyses. This paper provides an overview of the development process of P-CARES, including the various probabilistic simulation techniques used to incorporate the effect of site soil uncertainties into the seismic site response and SSI analyses and an improved graphical user interface (GUI)

  8. Computation of reactor control rod drop time under accident conditions

    International Nuclear Information System (INIS)

    Dou Yikang; Yao Weida; Yang Renan; Jiang Nanyan

    1998-01-01

    The computational method for reactor control rod drop time under accident conditions consists mainly in establishing forced-vibration equations for the components of the control rod drive line under the action of external forces, together with an equation of motion for the control rod moving in the vertical direction. The two kinds of equations are coupled by considering the impact effects between the control rod and the surrounding components. A finite difference method is adopted to discretize the vibration equations, and the Wilson-θ method is applied to the time-history problem. The nonlinearity caused by impact is treated iteratively with a modified Newton method. Experimental results are used to confirm the validity and reliability of the computational method. Theoretical and experimental test problems show that the computer program based on this method is applicable and reliable. The program can serve as an effective tool for design by analysis and for safety analysis of the relevant components

  9. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  10. Continuous-Time Symmetric Hopfield Nets are Computationally Universal

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Vol. 15, No. 3 (2003), pp. 693-733 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords: continuous-time Hopfield network * Liapunov function * analog computation * computational power * Turing universality Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  11. Cone-beam computed tomography evaluation of dentoskeletal changes after asymmetric rapid maxillary expansion.

    Science.gov (United States)

    Baka, Zeliha Muge; Akin, Mehmet; Ucar, Faruk Izzet; Ileri, Zehra

    2015-01-01

    The aims of this study were to quantitatively evaluate the changes in arch widths and buccolingual inclinations of the posterior teeth after asymmetric rapid maxillary expansion (ARME) and to compare the measurements between the crossbite and the noncrossbite sides with cone-beam computed tomography (CBCT). From our clinic archives, we selected the CBCT records of 30 patients with unilateral skeletal crossbite (13 boys, 14.2 ± 1.3 years old; 17 girls, 13.8 ± 1.3 years old) who underwent ARME treatment. A modified acrylic bonded rapid maxillary expansion appliance including an occlusal locking mechanism was used in all patients. CBCT records had been taken before ARME treatment and after a 3-month retention period. Fourteen angular and 80 linear measurements were taken for the maxilla and the mandible. Frontally clipped CBCT images were used for the evaluation. Paired sample and independent sample t tests were used for statistical comparisons. Comparisons of the before-treatment and after-retention measurements showed that the arch widths and buccolingual inclinations of the posterior teeth increased significantly on the crossbite side of the maxilla and on the noncrossbite side of the mandible. After ARME treatment, the crossbite side of the maxilla and the noncrossbite side of the mandible were more affected than were the opposite sides. Copyright © 2015. Published by Elsevier Inc.

  12. A computer-based matrix for rapid calculation of pulmonary hemodynamic parameters in congenital heart disease

    International Nuclear Information System (INIS)

    Lopes, Antonio Augusto; Miranda, Rogerio dos Anjos; Goncalves, Rilvani Cavalcante; Thomaz, Ana Maria

    2009-01-01

    In patients with congenital heart disease undergoing cardiac catheterization for hemodynamic purposes, parameter estimation by the indirect Fick method using a single predicted value of oxygen consumption has been a matter of criticism. We developed a computer-based routine for rapid estimation of replicate hemodynamic parameters using multiple predicted values of oxygen consumption. Using Microsoft Excel facilities, we constructed a matrix containing 5 models (equations) for prediction of oxygen consumption, and all additional formulas needed to obtain replicate estimates of hemodynamic parameters. By entering data from 65 patients with ventricular septal defects, aged 1 month to 8 years, it was possible to obtain multiple predictions for oxygen consumption, with clear between-age-group (P < .001) and between-method (P < .001) differences. Using these predictions in the individual patient, it was possible to obtain the upper and lower limits of a likely range for any given parameter, which made estimation more realistic. The organized matrix allows for rapid obtainment of replicate parameter estimates, without error due to exhaustive calculations. (author)
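    As an illustrative sketch of the underlying arithmetic (not the authors' spreadsheet), the indirect Fick calculation divides an assumed oxygen consumption by the oxygen content difference across each circuit; repeating it over several predicted VO2 values yields the kind of range described above. The predicted VO2 values, hemoglobin, and saturations below are hypothetical, and dissolved oxygen is neglected.

```python
# Indirect Fick sketch: pulmonary (Qp) and systemic (Qs) blood flow from oxygen
# contents, repeated over several predicted VO2 values to give a likely range.
# The predicted VO2 list and patient values are placeholders.

def o2_content(hb_g_dl, sat_fraction):
    """Oxygen content in mL O2 per liter of blood (1.36 mL O2 per g Hb, x10 for dL->L)."""
    return 1.36 * hb_g_dl * sat_fraction * 10.0

def fick_flows(vo2_ml_min, hb, sat_pv, sat_pa, sat_ao, sat_mv):
    qp = vo2_ml_min / (o2_content(hb, sat_pv) - o2_content(hb, sat_pa))  # pulmonary flow, L/min
    qs = vo2_ml_min / (o2_content(hb, sat_ao) - o2_content(hb, sat_mv))  # systemic flow, L/min
    return qp, qs, qp / qs

predicted_vo2 = [110.0, 125.0, 140.0]   # mL/min, hypothetical predictions
for vo2 in predicted_vo2:
    qp, qs, ratio = fick_flows(vo2, hb=12.0, sat_pv=0.98, sat_pa=0.85, sat_ao=0.97, sat_mv=0.72)
    print(f"VO2={vo2:.0f}: Qp={qp:.2f} Qs={qs:.2f} Qp/Qs={ratio:.2f}")
```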

  13. Heterogeneous real-time computing in radio astronomy

    Science.gov (United States)

    Ford, John M.; Demorest, Paul; Ransom, Scott

    2010-07-01

    Modern computer architectures suited for general purpose computing are often not the best choice for either I/O-bound or compute-bound problems. Sometimes the best choice is not to choose a single architecture, but to take advantage of the best characteristics of different computer architectures to solve your problems. This paper examines the tradeoffs between using computer systems based on the ubiquitous X86 Central Processing Units (CPUs), Field Programmable Gate Array (FPGA) based signal processors, and Graphical Processing Units (GPUs). We will show how a heterogeneous system can be produced that blends the best of each of these technologies into a real-time signal processing system. FPGAs tightly coupled to analog-to-digital converters connect the instrument to the telescope and supply the first level of computing to the system. These FPGAs are coupled to other FPGAs to continue to provide highly efficient processing power. Data is then packaged up and shipped over fast networks to a cluster of general purpose computers equipped with GPUs, which are used for floating-point intensive computation. Finally, the data is handled by the CPU and written to disk, or further processed. Each of the elements in the system has been chosen for its specific characteristics and the role it can play in creating a system that does the most for the least, in terms of power, space, and money.

  14. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

    Many computer networks have been studied, with different trends regarding the network architecture and the various protocols that govern data transfers and guarantee reliable communication among all nodes. A hierarchical network structure has been proposed to provide a simple and inexpensive way for the realization of a reliable real-time computer network. In such an architecture, all computers in the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of computer network can be considered a local area computer network (LACN) that can be used in a nuclear power plant control system, since such a plant has geographically dispersed subsystems. Network expansion would be straightforward, requiring only that each added computer (HOST) be connected to the common channel. All the nodes are designed around a microprocessor chip to provide the required intelligence. The node can be divided into two sections: a common section that interfaces with the serial data channel, and a private section that interfaces with the host computer. The private section naturally tends to have some variations in the hardware details to match the requirements of individual host computers. fig 7

  15. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings .01 second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy within the resolution of the computer clock as any official atomic time source.

  16. Computing return times or return periods with rare event algorithms

    Science.gov (United States)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events. So far those algorithms often compute probabilities, rather than return times. The approach we propose provides a computationally extremely efficient way to estimate numerically the return times of rare events for a dynamical system, saving several orders of magnitude in computational cost. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein–Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
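    As an illustrative sketch only (the exact estimator and parameters used in the paper may differ), a block-maximum return-time curve can be built by splitting a long trajectory into blocks of length T_b, taking each block maximum, and mapping the exceedance fraction q(a) to a return time via r(a) = -T_b / ln(1 - q(a)), which reduces to T_b / q(a) for small q. The Ornstein-Uhlenbeck parameters below are arbitrary.

```python
import numpy as np

# Block-maximum return-time estimator sketch: r(a) = -T_b / ln(1 - q(a)),
# where q(a) is the fraction of trajectory blocks whose maximum exceeds level a
# and T_b is the block duration.

def return_time_curve(series, dt, block_len):
    n_blocks = series.size // block_len
    maxima = series[:n_blocks * block_len].reshape(n_blocks, block_len).max(axis=1)
    levels = np.sort(maxima)                                # candidate levels a
    q = np.arange(n_blocks, 0, -1) / n_blocks               # exceedance fraction per level
    q = np.clip(q, 1.0 / n_blocks, 1.0 - 1e-12)
    return levels, -block_len * dt / np.log(1.0 - q)

# Example trajectory: Ornstein-Uhlenbeck process dX = -X dt + sqrt(2) dW.
rng = np.random.default_rng(2)
dt, n = 1e-2, 500_000
x = np.empty(n)
x[0] = 0.0
noise = rng.standard_normal(n - 1) * np.sqrt(2.0 * dt)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + noise[i]
levels, r = return_time_curve(x, dt, block_len=5_000)
print(levels[-3:], r[-3:])   # largest levels and their estimated return times
```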

  17. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed for a network connecting a parallel computing server and a client terminal. Using the system, a user at a client terminal can visualize the results of a CFD (computational fluid dynamics) simulation running on the parallel computer while the computation is actually in progress on the server. Using a GUI (graphical user interface) on the client terminal, the user is also able to change parameters of the analysis and visualization while the calculation is running. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. Therefore, the amount of data sent from the parallel computer to the client is much smaller than without compression, and the user sees images appear swiftly. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on the client PC provided a Web browser is installed on it. (author)

  18. Rapid, computer vision-enabled murine screening system identifies neuropharmacological potential of two new mechanisms

    Directory of Open Access Journals (Sweden)

    Steven L Roberds

    2011-09-01

    Full Text Available The lack of predictive in vitro models for behavioral phenotypes impedes rapid advancement in neuropharmacology and psychopharmacology. In vivo behavioral assays are more predictive of activity in human disorders, but such assays are often highly resource-intensive. Here we describe the successful application of a computer vision-enabled system to identify potential neuropharmacological activity of two new mechanisms. The analytical system was trained using multiple drugs that are used clinically to treat depression, schizophrenia, anxiety, and other psychiatric or behavioral disorders. During blinded testing the PDE10 inhibitor TP-10 produced a signature of activity suggesting potential antipsychotic activity. This finding is consistent with TP-10’s activity in multiple rodent models that is similar to that of clinically used antipsychotic drugs. The CK1ε inhibitor PF-670462 produced a signature consistent with anxiolytic activity and, at the highest dose tested, behavioral effects similar to that of opiate analgesics. Neither TP-10 nor PF-670462 was included in the training set. Thus, computer vision-based behavioral analysis can facilitate drug discovery by identifying neuropharmacological effects of compounds acting through new mechanisms.

  19. SU-F-J-102: Lower Esophagus Margin Implications Based On Rapid Computational Algorithm for SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Cardenas, M; Mazur, T; Li, H; Mutic, S; Bradley, J; Tsien, C; Green, O [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: To quantify inter-fraction esophagus variation. Methods: Computed tomography and daily on-treatment 0.3-T MRI data sets for 7 patients were analyzed using a novel Matlab-based (Mathworks, Natick, MA) rapid computational method. Rigid registration was performed from the cricoid to the gastro-esophageal junction. CT and MR-based contours were compared at slice intervals of 3 mm. Variation was quantified by “expansion,” defined as additional length in any radial direction from the CT contour to the MR contour. Expansion computations were performed with 360° of freedom in each axial slice. We partitioned expansions into left anterior, right anterior, right posterior, and left posterior quadrants (LA, RA, RP, and LP, respectively). Sample means were compared by analysis of variance (ANOVA) and Fisher's Protected Least Significant Difference test. Results: Fifteen fractions and 1121 axial slices from 7 patients undergoing SBRT for primary lung cancer (3) and metastatic lung disease (4) were analyzed, generating 41,970 measurements. Mean LA, RA, RP, and LP expansions were 4.30±0.05 mm, 3.71±0.05 mm, 3.17±0.07 mm, and 3.98±0.06 mm, respectively. 50.13% of all axial slices showed variation > 5 mm in one or more directions. Variation was greatest in the lower esophagus, with mean LA, RA, RP, and LP expansions of 5.98±0.09 mm, 4.59±0.09 mm, 4.04±0.16 mm, and 5.41±0.16 mm, respectively. The difference was significant compared to the mid and upper esophagus (p<.0001). The 95th percentiles of expansion for LA, RA, RP, and LP were 13.36 mm, 9.97 mm, 11.29 mm, and 12.19 mm, respectively. Conclusion: Analysis of on-treatment MR imaging of the lower esophagus during thoracic SBRT suggests margin expansions of 13.36 mm LA, 9.97 mm RA, 11.29 mm RP, and 12.19 mm LP would account for 95% of measurements. Our novel algorithm for rapid assessment of margin expansion for critical structures with 360° of freedom in each axial slice enables continuously adaptive patient-specific margins which may
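    As an illustrative sketch only (the record does not give implementation details), per-quadrant radial expansion between two contours in an axial slice can be computed by sampling both contours' radii from a common centroid over 360° and differencing them. The contour format, toy circles, and quadrant indexing below are assumptions; mapping quadrant indices to LA/RA/RP/LP depends on image orientation.

```python
import numpy as np

# Sketch: radial distance from a common centroid to each contour, sampled over
# 360 degrees; "expansion" is the MR radius minus the CT radius, grouped into
# four quadrants. Contours are (N, 2) arrays of (x, y) points.

def radial_profile(contour, center, angles):
    rel = contour - center
    theta = np.arctan2(rel[:, 1], rel[:, 0])
    r = np.hypot(rel[:, 0], rel[:, 1])
    order = np.argsort(theta)
    return np.interp(angles, theta[order], r[order], period=2 * np.pi)

def quadrant_expansions(ct_contour, mr_contour, n_angles=360):
    center = ct_contour.mean(axis=0)
    angles = np.linspace(-np.pi, np.pi, n_angles, endpoint=False)
    expansion = radial_profile(mr_contour, center, angles) - radial_profile(ct_contour, center, angles)
    quadrant = np.floor((angles + np.pi) / (np.pi / 2)).astype(int)  # 0..3
    return {q: expansion[quadrant == q] for q in range(4)}

# Toy example: a circle of radius 10 vs. one of radius 12 (uniform 2 mm expansion).
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
ct = np.c_[10 * np.cos(t), 10 * np.sin(t)]
mr = np.c_[12 * np.cos(t), 12 * np.sin(t)]
print({q: round(float(v.mean()), 2) for q, v in quadrant_expansions(ct, mr).items()})
```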

  20. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.

  1. Effect of the MCNP model definition on the computation time

    International Nuclear Information System (INIS)

    Šunka, Michal

    2017-01-01

    The presented work studies the influence of the method of defining the geometry in the MCNP transport code and its impact on the computational time, including the difficulty of preparing an input file describing the given geometry. Cases using different geometry definitions, including basic 2-dimensional and 3-dimensional objects and their combinations, were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)

  2. TV time but not computer time is associated with cardiometabolic risk in Dutch young adults.

    Science.gov (United States)

    Altenburg, Teatske M; de Kroon, Marlou L A; Renders, Carry M; Hirasing, Remy; Chinapaw, Mai J M

    2013-01-01

    TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the association of TV time and computer time separately with cardiometabolic biomarkers in young adults. Additionally, the mediating role of waist circumference (WC) is studied. Data from 634 Dutch young adults (18-28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose, triglycerides and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC to mediate the association of TV time or computer time with cardiometabolic biomarkers. We found a significantly positive association of TV time with cardiometabolic biomarkers. In addition, we found no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time.
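    As an illustrative sketch only (with simulated data, not the study's), the product-of-coefficient method estimates the indirect effect as the product of the exposure-to-mediator coefficient and the mediator-to-outcome coefficient adjusted for the exposure. Variable names and effect sizes below are made up.

```python
import numpy as np

# Product-of-coefficients sketch for a single mediator: a = effect of exposure
# (TV time) on the mediator (WC), b = effect of WC on the outcome adjusting for
# TV time; the indirect effect is a*b. All data below are simulated.

rng = np.random.default_rng(4)
n = 634
tv = rng.gamma(2.0, 1.0, n)                         # hours/day, simulated
wc = 80 + 1.5 * tv + rng.normal(0, 5, n)            # waist circumference, simulated
outcome = 1.0 + 0.2 * wc + 0.05 * tv + rng.normal(0, 2, n)

def ols(y, *cols):
    X = np.column_stack([np.ones_like(y), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                                      # [intercept, slopes...]

a = ols(wc, tv)[1]                 # exposure -> mediator
b = ols(outcome, tv, wc)[2]        # mediator -> outcome, adjusted for exposure
print(f"indirect effect a*b = {a * b:.3f}")          # ~0.3 with these simulated effects
```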

  3. Real-time computational photon-counting LiDAR

    Science.gov (United States)

    Edgar, Matthew; Johnson, Steven; Phillips, David; Padgett, Miles

    2018-03-01

    The availability of compact, low-cost, and high-speed MEMS-based spatial light modulators has generated widespread interest in alternative sampling strategies for imaging systems utilizing single-pixel detectors. The development of compressed sensing schemes for real-time computational imaging may have promising commercial applications for high-performance detectors, where the availability of focal plane arrays is expensive or otherwise limited. We discuss the research and development of a prototype light detection and ranging (LiDAR) system via direct time of flight, which utilizes a single high-sensitivity photon-counting detector and fast-timing electronics to recover millimeter accuracy three-dimensional images in real time. The development of low-cost real time computational LiDAR systems could have importance for applications in security, defense, and autonomous vehicles.
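    As an illustrative sketch only (bin width, timing jitter, and photon counts below are arbitrary, not the prototype's specifications), direct time-of-flight ranging histograms photon arrival times and converts the peak bin to distance via r = c·t/2.

```python
import numpy as np

# Direct time-of-flight sketch: histogram photon arrival times and convert the
# peak bin to range via r = c * t / 2.

C = 299_792_458.0          # speed of light, m/s
BIN = 100e-12              # 100 ps timing bins (illustrative)

def range_from_arrivals(arrival_times_s, n_bins=2000):
    counts, edges = np.histogram(arrival_times_s, bins=n_bins, range=(0.0, n_bins * BIN))
    peak = np.argmax(counts)
    t_peak = 0.5 * (edges[peak] + edges[peak + 1])
    return C * t_peak / 2.0

# Simulated return from a target 15 m away, 60 ps RMS jitter, plus uniform background.
rng = np.random.default_rng(3)
true_t = 2 * 15.0 / C
signal = rng.normal(true_t, 60e-12, size=500)
background = rng.uniform(0.0, 2000 * BIN, size=200)
print(f"estimated range: {range_from_arrivals(np.concatenate([signal, background])):.3f} m")
```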

  4. Imprecise results: Utilizing partial computations in real-time systems

    Science.gov (United States)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
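    As an illustrative sketch of the milestone approach (not code from the paper or from the Concord project), an iterative refinement periodically records its best result so far and returns the last recorded, imprecise value if the deadline arrives before completion; the series being refined here is just a stand-in for a long-running computation.

```python
import time

# Milestone-approach sketch: an iterative computation records its best result
# so far; if the deadline arrives before convergence, the caller gets the last
# recorded (imprecise) result instead of nothing.

def compute_with_milestones(deadline_s, max_terms=50_000_000):
    start = time.monotonic()
    milestone = 0.0                      # last recorded imprecise result
    acc = 0.0
    for k in range(max_terms):
        acc += (-1.0) ** k / (2 * k + 1) # Leibniz series for pi/4 (stand-in workload)
        if k % 10_000 == 0:
            milestone = 4.0 * acc        # record a milestone periodically
            if time.monotonic() - start > deadline_s:
                return milestone, False  # deadline reached: return imprecise result
    return 4.0 * acc, True               # completed normally: precise result

value, completed = compute_with_milestones(deadline_s=0.05)
print(f"pi ~= {value:.6f} (completed={completed})")
```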

  5. Real-Time GPS Monitoring for Earthquake Rapid Assessment in the San Francisco Bay Area

    Science.gov (United States)

    Guillemot, C.; Langbein, J. O.; Murray, J. R.

    2012-12-01

    The U.S. Geological Survey Earthquake Science Center has deployed a network of eight real-time Global Positioning System (GPS) stations in the San Francisco Bay area and is implementing software applications to continuously evaluate the status of the deformation within the network. Real-time monitoring of the station positions is expected to provide valuable information for rapidly estimating source parameters should a large earthquake occur in the San Francisco Bay area. Because earthquake response applications require robust data access, as a first step we have developed a suite of web-based applications which are now routinely used to monitor the network's operational status and data streaming performance. The web tools provide continuously updated displays of important telemetry parameters such as data latency and receive rates, as well as source voltage and temperature information within each instrument enclosure. Automated software on the backend uses the streaming performance data to mitigate the impact of outages, radio interference and bandwidth congestion on deformation monitoring operations. A separate set of software applications manages the recovery of lost data due to faulty communication links. Displacement estimates are computed in real-time for various combinations of USGS, Plate Boundary Observatory (PBO) and Bay Area Regional Deformation (BARD) network stations. We are currently comparing results from two software packages (one commercial and one open-source) used to process 1-Hz data on the fly and produce estimates of differential positions. The continuous monitoring of telemetry makes it possible to tune the network to minimize the impact of transient interruptions of the data flow, from one or more stations, on the estimated positions. Ongoing work is focused on using data streaming performance history to optimize the quality of the position, reduce drift and outliers by switching to the best set of stations within the network, and

  6. Rapid determination of long-lived artificial alpha radionuclides using time interval analysis

    International Nuclear Information System (INIS)

    Uezu, Yasuhiro; Koarashi, Jun; Sanada, Yukihisa; Hashimoto, Tetsuo

    2003-01-01

    It is important to monitor long-lived alpha radionuclides such as plutonium (238Pu, 239+240Pu) in the working areas and environment of nuclear fuel cycle facilities, because the potential cancer risk from alpha radiation is known to be higher than that from gamma radiation. Such monitoring therefore requires high sensitivity, high resolution and rapid determination in order to measure very low-level concentrations of plutonium isotopes, and natural radionuclides, including radon (222Rn or 220Rn) and their progenies, should be suppressed as far as possible. For this purpose, a discrimination method between Pu and the progenies of 222Rn or 220Rn using time interval analysis (TIA) was designed and developed; TIA subtracts short-lived radionuclides by calculating the time-interval distributions of successive alpha and beta decay events on millisecond or microsecond scales. In this system, alpha rays from 214Po, 216Po and 212Po can be extracted. The TIA measuring system consists of a silicon surface barrier detector (SSD), an amplifier, an analog-to-digital converter (ADC), a multi-channel analyzer (MCA), a high-resolution timer (TIMER), a multi-parameter collector and a personal computer. From the ADC, incident alpha and beta pulses are sent to the MCA and the TIMER simultaneously, and the pulses from both are synthesized by the multi-parameter collector; after measurement, the natural radionuclides are subtracted. Airborne particles were collected on a membrane filter for 60 minutes at 100 L/min, and small Pu particles were added to its surface. Alpha and beta rays were measured, and the natural radionuclides were subtracted by TIA within 5 times of 145 msec. As a result, the Pu hidden in the natural background could be recognized clearly. The lower limit of determination of 239Pu is calculated as 6x10^-9 Bq/cm^3, which satisfies the derived air concentration (DAC) of 239Pu (8x10^-9 Bq/cm^3).
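
    A rough sketch of the time-interval idea is shown below: events that are followed by another event within a short correlation window (characteristic of short-lived progeny chains such as 214Bi followed by 214Po) are flagged and subtracted, leaving the uncorrelated counts attributable to long-lived emitters such as Pu. The window length, event rates and function names are illustrative assumptions, not parameters of the system described above.

```python
import numpy as np

def tia_filter(event_times_s, window_s=1e-3):
    """Split decay-event times into 'correlated' events (an event followed by
    another within `window_s`, typical of short-lived progeny such as
    214Bi -> 214Po) and the remaining uncorrelated events."""
    t = np.sort(np.asarray(event_times_s))
    dt = np.diff(t)
    close = dt < window_s
    correlated = np.zeros(len(t), dtype=bool)
    correlated[:-1] |= close      # first member of a close pair
    correlated[1:] |= close       # second member of a close pair
    return t[~correlated], t[correlated]

# toy data: sparse 'Pu-like' events plus bursts of correlated beta-alpha pairs
rng = np.random.default_rng(1)
pu = np.cumsum(rng.exponential(5.0, 20))                # ~1 event per 5 s
rn_parents = np.cumsum(rng.exponential(2.0, 30))
rn_daughters = rn_parents + rng.exponential(2e-4, 30)   # follow within ~0.2 ms
uncorr, corr = tia_filter(np.concatenate([pu, rn_parents, rn_daughters]))
print(len(uncorr), "uncorrelated events,", len(corr), "correlated (subtracted)")
```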

  7. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers 10 to 120 hours of computational time when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
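
    To put the first challenge (the number of power flows to solve) in perspective, a quick back-of-the-envelope calculation based on the 10 to 120 hour figures quoted above gives the per-solve time budget; the arithmetic below is illustrative only.

```python
seconds_per_year = 365 * 24 * 3600            # 31,536,000 sequential power flows
for total_hours in (10, 120):                 # runtimes quoted in the report
    per_flow_ms = total_hours * 3600 * 1000 / seconds_per_year
    print(f"{total_hours:>3} h total  ->  {per_flow_ms:.2f} ms per power flow")
# Even at roughly 1-14 ms per solve, tens of millions of strictly sequential
# solutions dominate the runtime, so reducing either the number of power flows
# or the cost of each one is the central challenge.
```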

  8. Rapid expansion method (REM) for time‐stepping in reverse time migration (RTM)

    KAUST Repository

    Pestana, Reynam C.; Stoffa, Paul L.

    2009-01-01

    an analytical approximation for the Bessel function, where we assume that the time step is sufficiently small. From this derivation we find that if we consider only the first two Chebyshev polynomial terms in the rapid expansion method we can obtain the second

  9. Rapid and real-time detection technologies for emerging viruses of ...

    Indian Academy of Sciences (India)

    2008-10-17

    Oct 17, 2008 ... The development of technologies with rapid and sensitive detection capabilities and increased throughput has become crucial for responding to the greater number of threats posed by emerging and re-emerging viruses in the recent past. The conventional identification methods require time-consuming culturing ...

  10. Time is of essence; rapid identification of veterinary pathogens using MALDI TOF

    DEFF Research Database (Denmark)

    Nonnemann, Bettina; Dalsgaard, Inger; Pedersen, Karl

    Rapid and accurate identification of microbial pathogens is a cornerstone for timely and correct treatment of diseases of livestock and fish. The utility of the MALDI-TOF technique in the diagnostic laboratory is directly related to the quality of mass spectra and quantity of different microbial...

  11. New layer-based imaging and rapid prototyping techniques for computer-aided design and manufacture of custom dental restoration.

    Science.gov (United States)

    Lee, M-Y; Chang, C-C; Ku, Y C

    2008-01-01

    Fixed dental restoration by conventional methods greatly relies on the skill and experience of the dental technician. The quality and accuracy of the final product depends mostly on the technician's subjective judgment. In addition, the traditional manual operation involves many complex procedures, and is a time-consuming and labour-intensive job. Most importantly, no quantitative design and manufacturing information is preserved for future retrieval. In this paper, a new device for scanning the dental profile and reconstructing 3D digital information of a dental model based on a layer-based imaging technique, called abrasive computer tomography (ACT) was designed in-house and proposed for the design of custom dental restoration. The fixed partial dental restoration was then produced by rapid prototyping (RP) and computer numerical control (CNC) machining methods based on the ACT scanned digital information. A force feedback sculptor (FreeForm system, Sensible Technologies, Inc., Cambridge MA, USA), which comprises 3D Touch technology, was applied to modify the morphology and design of the fixed dental restoration. In addition, a comparison of conventional manual operation and digital manufacture using both RP and CNC machining technologies for fixed dental restoration production is presented. Finally, a digital custom fixed restoration manufacturing protocol integrating proposed layer-based dental profile scanning, computer-aided design, 3D force feedback feature modification and advanced fixed restoration manufacturing techniques is illustrated. The proposed method provides solid evidence that computer-aided design and manufacturing technologies may become a new avenue for custom-made fixed restoration design, analysis, and production in the 21st century.

  12. Real-time brain computer interface using imaginary movements

    DEFF Research Database (Denmark)

    El-Madani, Ahmad; Sørensen, Helge Bjarup Dissing; Kjær, Troels W.

    2015-01-01

    Background: Brain Computer Interface (BCI) is the method of transforming mental thoughts and imagination into actions. A real-time BCI system can improve the quality of life of patients with severe neuromuscular disorders by enabling them to communicate with the outside world. In this paper...

  13. GRAPHIC, time-sharing magnet design computer programs at Argonne

    International Nuclear Information System (INIS)

    Lari, R.J.

    1974-01-01

    This paper describes three magnet design computer programs in use at the Zero Gradient Synchrotron of Argonne National Laboratory. These programs are used in the time-sharing mode in conjunction with a Tektronix model 4012 graphic display terminal. The first program is called TRIM, the second MAGNET, and the third GFUN. (U.S.)

  14. Instructional Advice, Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2010-01-01

    Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

  15. Radiographic and computed tomographic demonstration of pseudotumor cerebri due to rapid weight gain in a child with pelvic rhabdomyosarcoma

    Energy Technology Data Exchange (ETDEWEB)

    Berdon, W.E.; Barker, D.H.; Barash, F.S.

    1982-06-01

    Rapid weight gain in a malnourished child can be associated with suture diastasis in the pattern of pseudotumor cerebri; this has been previously reported in deprivational dwarfism and cystic fibrosis. In a child with pelvic rhabdomyosarcoma, skull radiographs and cranial computed tomographic (CT) scans were available prior to a period of rapid weight gain induced by hyperalimentation. Suture diastasis developed and repeat CT scans showed this to be accompanied by smaller ventricles.

  16. Radiographic and computed tomographic demonstration of pseudotumor cerebri due to rapid weight gain in a child with pelvic rhabdomyosarcoma

    International Nuclear Information System (INIS)

    Berdon, W.E.; Barker, D.H.; Barash, F.S.

    1982-01-01

    Rapid weight gain in a malnourished child can be associated with suture diastasis in the pattern of pseudotumor cerebri; this has been previously reported in deprivational dwarfism and cystic fibrosis. In a child with pelvic rhabdomyosarcoma, skull radiographs and cranial computed tomographic (CT) scans were available prior to a period of rapid weight gain induced by hyperalimentation. Suture diastasis developed and repeat CT scans showed this to be accompanied by smaller ventricles

  17. [Introduction and some problems of the rapid time series laboratory reporting system].

    Science.gov (United States)

    Kanao, M; Yamashita, K; Kuwajima, M

    1999-09-01

    We introduced an on-line system for biochemical, hematological, serological, urinary, bacteriological, and emergency examinations and the associated office work, using a client-server system (NEC PC-LACS) built around centralized outpatient blood collection, centralized outpatient reception, and outpatient examination by reservation. Using this on-line system, results for 71 items in chemical serological, hematological, and urinary examinations are rapidly reported within 1 hour. Since the ordering system at our hospital has not yet been completed, we constructed a rapid time-series reporting system in which time-series data obtained on 5 serial occasions are printed on 2 sheets of A4 paper at the time of the final report. In each consultation room of the medical outpatient clinic, at the neuromedical outpatient clinic, and at the kidney center where examinations are frequently performed, terminal equipment and a printer for inquiry were installed for real-time output of time-series reports. Results are reported by FAX to the other outpatient clinics and wards, and subsequently, time-series reports are output at the clinical laboratory department. This system allowed rapid examination, especially preconsultation examination. It was also useful for reducing office work and for making effective use of examination data.

  18. Mapping land cover through time with the Rapid Land Cover Mapper—Documentation and user manual

    Science.gov (United States)

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2017-02-15

    The Rapid Land Cover Mapper is an Esri ArcGIS® Desktop add-in, which was created as an alternative to automated or semiautomated mapping methods. Based on a manual photo interpretation technique, the tool facilitates mapping over large areas and through time, and produces time-series raster maps and associated statistics that characterize the changing landscapes. The Rapid Land Cover Mapper add-in can be used with any imagery source to map various themes (for instance, land cover, soils, or forest) at any chosen mapping resolution. The user manual contains all essential information for the user to make full use of the Rapid Land Cover Mapper add-in. This manual includes a description of the add-in functions and capabilities, and step-by-step procedures for using the add-in. The Rapid Land Cover Mapper add-in was successfully used by the U.S. Geological Survey West Africa Land Use Dynamics team to accurately map land use and land cover in 17 West African countries through time (1975, 2000, and 2013).

  19. Neural Computations in a Dynamical System with Multiple Time Scales.

    Science.gov (United States)

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what computational benefit the brain gains from having such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered: persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination of features with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  20. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though microprocessors become more powerful each day. It is usual for Real Time Operating Systems for embedded systems to have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...
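
    For context, the Worst Case Response Time referred to above is classically computed for fixed-priority preemptive tasks by iterating the recurrence R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j to a fixed point. The sketch below implements that textbook analysis, not the reduced-cost method proposed in the paper; the example task set is invented.

```python
import math

def wcrt(tasks):
    """Classical response-time analysis for fixed-priority preemptive tasks.

    tasks: list of (C, T) pairs (worst-case execution time, period), ordered
           from highest to lowest priority; deadlines are assumed equal to periods.
    Returns the list of worst-case response times, or None if a task can
    miss its deadline.
    """
    results = []
    for i, (c_i, t_i) in enumerate(tasks):
        r = c_i
        while True:
            interference = sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
            r_next = c_i + interference
            if r_next > t_i:        # response time exceeds the deadline
                return None
            if r_next == r:         # fixed point reached
                break
            r = r_next
        results.append(r)
    return results

# example task set (C, T) in milliseconds, highest priority first
print(wcrt([(1, 4), (2, 6), (3, 12)]))   # expected: [1, 3, 10]
```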

  1. An Open Source Rapid Computer Aided Control System Design Toolchain Using Scilab, Scicos and RTAI Linux

    Science.gov (United States)

    Bouchpan-Lerust-Juéry, L.

    2007-08-01

    Current and next-generation on-board computer systems tend to implement real-time embedded control applications (e.g. Attitude and Orbit Control Subsystem (AOCS), Packet Utilization Standard (PUS), spacecraft autonomy . . . ) which must meet high standards of reliability and predictability as well as safety. Meeting all these requirements demands a considerable amount of effort and cost from the space software industry. The first part of this paper presents a free, open-source, integrated solution for developing RTAI applications, covering analysis, design, simulation and direct implementation using code generation based on open-source tools; the second part summarises the suggested approach, its results and the conclusions for further work.

  2. Comparison of Computational Approaches for Rapid Aerodynamic Assessment of Small UAVs

    Science.gov (United States)

    Shafer, Theresa C.; Lynch, C. Eric; Viken, Sally A.; Favaregh, Noah; Zeune, Cale; Williams, Nathan; Dansie, Jonathan

    2014-01-01

    Computational Fluid Dynamic (CFD) methods were used to determine the basic aerodynamic, performance, and stability and control characteristics of the unmanned air vehicle (UAV), Kahu. Accurate and timely prediction of the aerodynamic characteristics of small UAVs is an essential part of military system acquisition and air-worthiness evaluations. The forces and moments of the UAV were predicted using a variety of analytical methods for a range of configurations and conditions. The methods included Navier Stokes (N-S) flow solvers (USM3D, Kestrel and Cobalt) that take days to set up and hours to converge on a single solution; potential flow methods (PMARC, LSAERO, and XFLR5) that take hours to set up and minutes to compute; empirical methods (Datcom) that involve table lookups and produce a solution quickly; and handbook calculations. A preliminary aerodynamic database can be developed very efficiently by using a combination of computational tools. The database can be generated with low-order and empirical methods in linear regions, then replacing or adjusting the data as predictions from higher order methods are obtained. A comparison of results from all the data sources as well as experimental data obtained from a wind-tunnel test will be shown and the methods will be evaluated on their utility during each portion of the flight envelope.

  3. Computational complexity of time-dependent density functional theory

    International Nuclear Information System (INIS)

    Whitfield, J D; Yung, M-H; Tempel, D G; Aspuru-Guzik, A; Boixo, S

    2014-01-01

    Time-dependent density functional theory (TDDFT) is rapidly emerging as a premier method for solving dynamical many-body problems in physics and chemistry. The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn–Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn–Sham system can be efficiently obtained given the time-dependent density. We introduce a V-representability parameter which diverges at the boundary of the existence domain and serves to quantify the numerical difficulty of constructing the Kohn-Sham potential. For bounded values of V-representability, we present a polynomial time quantum algorithm to generate the time-dependent Kohn–Sham potential with controllable error bounds. (paper)

  4. Efficient quantum algorithm for computing n-time correlation functions.

    Science.gov (United States)

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  5. Distributed computing for real-time petroleum reservoir monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Ayodele, O. R. [University of Alberta, Edmonton, AB (Canada)

    2004-05-01

    Computer software architecture is presented to illustrate how the concept of distributed computing can be applied to real-time reservoir monitoring processes, permitting the continuous monitoring of the dynamic behaviour of petroleum reservoirs at much shorter intervals. The paper describes the fundamental technologies driving distributed computing, namely Java 2 Platform Enterprise edition (J2EE) by Sun Microsystems, and the Microsoft Dot-Net (Microsoft.Net) initiative, and explains the challenges involved in distributed computing. These are: (1) availability of permanently placed downhole equipment to acquire and transmit seismic data; (2) availability of high bandwidth to transmit the data; (3) security considerations; (4) adaptation of existing legacy codes to run on networks as downloads on demand; and (5) credibility issues concerning data security over the Internet. Other applications of distributed computing in the petroleum industry are also considered, specifically MWD, LWD and SWD (measurement-while-drilling, logging-while-drilling, and simulation-while-drilling), and drill-string vibration monitoring. 23 refs., 1 fig.

  6. Real-time Tsunami Inundation Prediction Using High Performance Computers

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2014-12-01

    Recently, off-shore tsunami observation stations based on cabled ocean-bottom pressure gauges are actively being deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines for disaster mitigation purposes. To realize the full benefit of these observations, real-time analysis techniques that make effective use of the data are necessary. A representative study is Tsushima et al. (2009), which proposed a method for instant tsunami source prediction based on the acquired tsunami waveform data; as time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green functions of linear long wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation and improved the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although the computational load of solving the non-linear shallow water equations for inundation predictions is large, it has become tractable through recent developments in high performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite difference method with nested staggered grids whose resolution ranges from 405 m to 5 m, with a resolution ratio of 1/3 between nested domains. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction up to 2 hours after the
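
    The leap-frog, staggered-grid scheme mentioned above can be illustrated with a minimal one-dimensional linear shallow-water sketch (the actual system solves the non-linear equations on nested two-dimensional grids); the depth, domain size and step counts below are arbitrary toy values, not settings from the study.

```python
import numpy as np

# minimal 1-D linear shallow-water solver on a staggered leap-frog grid
g, H = 9.81, 4000.0            # gravity (m/s^2), uniform ocean depth (m)
L, nx = 400e3, 400             # domain length (m), number of surface cells
dx = L / nx
dt = 0.5 * dx / np.sqrt(g * H) # CFL-limited time step

eta = 2.0 * np.exp(-((np.arange(nx) * dx - L / 2) / 20e3) ** 2)  # initial hump (m)
u = np.zeros(nx + 1)           # velocities on staggered cell faces

for step in range(600):
    # continuity: update the free surface from the velocity divergence
    eta -= dt * H * (u[1:] - u[:-1]) / dx
    # momentum: update interior face velocities from the surface gradient
    u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
    # u[0] and u[-1] stay 0: reflective walls

print(f"max surface elevation after {600 * dt:.0f} s: {eta.max():.2f} m")
```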

  7. Soft Real-Time PID Control on a VME Computer

    Science.gov (United States)

    Karayan, Vahag; Sander, Stanley; Cageao, Richard

    2007-01-01

    microPID (uPID) is a computer program for real-time proportional + integral + derivative (PID) control of a translation stage in a Fourier-transform ultraviolet spectrometer. microPID implements a PID control loop over a position profile at a sampling rate of 8 kHz (sampling period 125 microseconds). The software runs in a stripped-down Linux operating system on a VersaModule Eurocard (VME) computer operating in a real-time priority queue, using an embedded controller, a 16-bit digital-to-analog converter (D/A) board, and a laser-positioning board (LPB). microPID consists of three main parts: (1) VME device-driver routines, (2) software that administers a custom protocol for serial communication with a control computer, and (3) a loop section that obtains the current position from an LPB-driver routine, calculates the ideal position from the profile, and calculates a new voltage command by use of an embedded PID routine, all within each sampling period. The voltage command is sent to the D/A board to control the stage. microPID uses special kernel headers to obtain microsecond timing resolution. Inasmuch as microPID implements a single-threaded process and all other processes are disabled, the Linux operating system acts as a soft real-time system.
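
    As a minimal sketch of the control loop described above (not the microPID source), each sampling period computes an error between the profiled and measured positions, updates integral and derivative terms, and clamps the resulting voltage command to the D/A range; the gains, plant model and limits are invented for illustration.

```python
class PID:
    """Discrete PID controller: one update per fixed sampling period."""

    def __init__(self, kp, ki, kd, dt, out_limits=(-10.0, 10.0)):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0
        self.lo, self.hi = out_limits            # e.g. D/A voltage range

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        command = (self.kp * error
                   + self.ki * self.integral
                   + self.kd * derivative)
        return min(max(command, self.lo), self.hi)   # clamp to the D/A range

# 8 kHz loop: the ideal position comes from a profile, the measurement from a sensor
pid = PID(kp=5.0, ki=200.0, kd=0.01, dt=125e-6)
position = 0.0
for n in range(8000):                        # simulate 1 second
    target = 1.0                             # step profile for illustration
    voltage = pid.update(target, position)
    position += 0.002 * voltage              # toy plant response
print(f"position after 1 s: {position:.3f}")
```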

  8. Rapid lung MRI in children with pulmonary infections: Time to change our diagnostic algorithms.

    Science.gov (United States)

    Sodhi, Kushaljit Singh; Khandelwal, Niranjan; Saxena, Akshay Kumar; Singh, Meenu; Agarwal, Ritesh; Bhatia, Anmol; Lee, Edward Y

    2016-05-01

    To determine the diagnostic utility of a new rapid MRI protocol, as compared with computed tomography (CT) for the detection of various pulmonary and mediastinal abnormalities in children with suspected pulmonary infections. Seventy-five children (age range of 5 to 15 years) with clinically suspected pulmonary infections were enrolled in this prospective study, which was approved by the institutional ethics committee. All patients underwent thoracic MRI (1.5T) and CT (64 detector) scan within 48 h of each other. The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of MRI were evaluated with CT as a standard of reference. Inter-observer agreement was measured with the kappa coefficient. MRI with a new rapid MRI protocol demonstrated sensitivity, specificity, PPV, and NPV of 100% for detecting pulmonary consolidation, nodules (>3 mm), cyst/cavity, hyperinflation, pleural effusion, and lymph nodes. The kappa-test showed almost perfect agreement between MRI and multidetector CT (MDCT) in detecting thoracic abnormalities (k = 0.9). No statistically significant difference was observed between MRI and MDCT for detecting thoracic abnormalities by the McNemar test (P = 0.125). Rapid lung MRI was found to be comparable to MDCT for detecting thoracic abnormalities in pediatric patients with clinically suspected pulmonary infections. It has a great potential as the first line cross-sectional imaging modality of choice in this patient population. However, further studies will be helpful for confirmation of our findings. © 2015 Wiley Periodicals, Inc.

  9. Effects of computing time delay on real-time control systems

    Science.gov (United States)

    Shin, Kang G.; Cui, Xianzhong

    1988-01-01

    The reliability of a real-time digital control system depends not only on the reliability of the hardware and software used, but also on the speed in executing control algorithms. The latter is due to the negative effects of computing time delay on control system performance. For a given sampling interval, the effects of computing time delay are classified into the delay problem and the loss problem. Analysis of these two problems is presented as a means of evaluating real-time control systems. As an example, both the self-tuning predicted (STP) control and Proportional-Integral-Derivative (PID) control are applied to the problem of tracking robot trajectories, and their respective effects of computing time delay on control performance are comparatively evaluated. For this example, the STP (PID) controller is shown to outperform the PID (STP) controller in coping with the delay (loss) problem.

  10. Thermal Conductivities in Solids from First Principles: Accurate Computations and Rapid Estimates

    Science.gov (United States)

    Carbogno, Christian; Scheffler, Matthias

    In spite of significant research efforts, a first-principles determination of the thermal conductivity κ at high temperatures has remained elusive. Boltzmann transport techniques that account for anharmonicity perturbatively become inaccurate under such conditions. Ab initio molecular dynamics (MD) techniques using the Green-Kubo (GK) formalism capture the full anharmonicity, but can become prohibitively costly to converge in time and size. We developed a formalism that accelerates such GK simulations by several orders of magnitude and thus enables their application within the limited time and length scales accessible in ab initio MD. For this purpose, we determine the effective harmonic potential occurring during the MD, and the associated temperature-dependent phonon properties and lifetimes. Interpolation in reciprocal and frequency space then allows extrapolation to the macroscopic scale. For both force-field and ab initio MD, we validate this approach by computing κ for Si and ZrO2, two materials known for their particularly harmonic and anharmonic character, respectively. Finally, we demonstrate how these techniques facilitate reasonable estimates of κ from existing MD calculations at virtually no additional computational cost.
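
    For reference, one standard isotropic form of the Green-Kubo relation underlying the approach above expresses κ as the time integral of the equilibrium heat-flux autocorrelation function; here J is the total heat current of a simulation cell of volume V, k_B the Boltzmann constant, T the temperature, and the angle brackets denote an equilibrium (MD time) average. The notation is generic, not specific to this work.

```latex
% Green-Kubo relation for the thermal conductivity (isotropic form)
\kappa \;=\; \frac{1}{3\,V\,k_{\mathrm{B}}T^{2}}
  \int_{0}^{\infty} \bigl\langle \mathbf{J}(0)\cdot\mathbf{J}(t) \bigr\rangle \,\mathrm{d}t
```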

  11. Integrating Remote Sensing Data, Hybrid-Cloud Computing, and Event Notifications for Advanced Rapid Imaging & Analysis (Invited)

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making from better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets also presents a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3-years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.

  12. Sorting on STAR. [CDC computer algorithm timing comparison

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
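
    The complexity gap discussed above can be made concrete by counting the compare-exchange operations in Batcher's odd-even mergesort network and comparing them with N log2 N, the order of comparisons a Quicksort-class algorithm performs. The sketch below is a generic illustration, not the STAR code.

```python
import math

def batcher_comparators(n):
    """Count compare-exchange operations of Batcher's odd-even mergesort
    on an array of length n (n must be a power of two)."""
    ops = 0

    def merge(lo, length, r):
        nonlocal ops
        step = r * 2
        if step < length:
            merge(lo, length, step)            # even subsequence
            merge(lo + r, length, step)        # odd subsequence
            ops += len(range(lo + r, lo + length - r, step))
        else:
            ops += 1

    def sort(lo, length):
        if length > 1:
            half = length // 2
            sort(lo, half)
            sort(lo + half, half)
            merge(lo, length, 1)

    sort(0, n)
    return ops

for n in (256, 1024, 4096, 16384):
    print(n, batcher_comparators(n), round(n * math.log2(n)))
```

    Because each stage of comparators can be executed as a vector operation, the larger operation count of the network sort is offset on vector hardware, which is the effect the timing study above observes.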

  13. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  14. Fuji apple storage time rapid determination method using Vis/NIR spectroscopy

    Science.gov (United States)

    Liu, Fuqi; Tang, Xuxiang

    2015-01-01

    Fuji apple storage time rapid determination method using visible/near-infrared (Vis/NIR) spectroscopy was studied in this paper. Vis/NIR diffuse reflection spectroscopy responses to samples were measured for 6 days. Spectroscopy data were processed by stochastic resonance (SR). Principal component analysis (PCA) was utilized to analyze original spectroscopy data and SNR eigen value. Results demonstrated that PCA could not totally discriminate Fuji apples using original spectroscopy data. Signal-to-noise ratio (SNR) spectrum clearly classified all apple samples. PCA using SNR spectrum successfully discriminated apple samples. Therefore, Vis/NIR spectroscopy was effective for Fuji apple storage time rapid discrimination. The proposed method is also promising in condition safety control and management for food and environmental laboratories. PMID:25874818
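
    The PCA step mentioned above can be sketched generically as an SVD of the mean-centred spectral matrix followed by projection onto the leading components; the synthetic spectra below are invented stand-ins and do not reproduce the stochastic-resonance preprocessing used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic Vis/NIR-like spectra: 6 storage days x 20 samples x 256 wavelengths
wavelengths = np.linspace(400, 1000, 256)
spectra, day_labels = [], []
for day in range(6):
    peak = 650 + 15 * day                        # peak drifts with storage time
    base = np.exp(-((wavelengths - peak) / 60.0) ** 2)
    spectra.append(base + 0.02 * rng.standard_normal((20, 256)))
    day_labels += [day] * 20
X = np.vstack(spectra)

# PCA via SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                           # first two principal components
explained = (S[:2] ** 2) / (S ** 2).sum()

print("variance explained by PC1, PC2:", np.round(explained, 3))
for day in range(6):
    m = scores[np.array(day_labels) == day].mean(axis=0)
    print(f"day {day}: mean PC scores = {np.round(m, 2)}")
```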

  15. Developments in time-resolved high pressure x-ray diffraction using rapid compression and decompression

    International Nuclear Information System (INIS)

    Smith, Jesse S.; Sinogeikin, Stanislav V.; Lin, Chuanlong; Rod, Eric; Bai, Ligang; Shen, Guoyin

    2015-01-01

    Complementary advances in high pressure research apparatus and techniques make it possible to carry out time-resolved high pressure research using what would customarily be considered static high pressure apparatus. This work specifically explores time-resolved high pressure x-ray diffraction with rapid compression and/or decompression of a sample in a diamond anvil cell. Key aspects of the synchrotron beamline and ancillary equipment are presented, including source considerations, rapid (de)compression apparatus, high frequency imaging detectors, and software suitable for processing large volumes of data. A number of examples are presented, including fast equation of state measurements, compression rate dependent synthesis of metastable states in silicon and germanium, and ultrahigh compression rates using a piezoelectric driven diamond anvil cell

  16. Rapid growth, early maturation and short generation time in African annual fishes

    Czech Academy of Sciences Publication Activity Database

    Blažek, Radim; Polačik, Matej; Reichard, Martin

    2013-01-01

    Roč. 4, č. 24 (2013), s. 24 ISSN 2041-9139 R&D Projects: GA ČR(CZ) GAP506/11/0112 Institutional support: RVO:68081766 Keywords : extreme life history * annual fish * explosive growth * rapid maturation * generation time * killifish * diapause * vertebrate * reaction norm * Savanna Subject RIV: EG - Zoology Impact factor: 3.104, year: 2013 http://www.evodevojournal.com/content/4/1/24

  17. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.

  18. A heterogeneous hierarchical architecture for real-time computing

    Energy Technology Data Exchange (ETDEWEB)

    Skroch, D.A.; Fornaro, R.J.

    1988-12-01

    The need for high-speed data acquisition and control algorithms has prompted continued research in the area of multiprocessor systems and related programming techniques. The result presented here is a unique hardware and software architecture for high-speed real-time computer systems. The implementation of a prototype of this architecture has required the integration of architecture, operating systems and programming languages into a cohesive unit. This report describes a Heterogeneous Hierarchical Architecture for Real-Time (H^2 ART) and system software for program loading and interprocessor communication.

  19. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it performs a detailed presentation of all well-known algorithms, elucidating on their merits or weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  20. Replacing Heavily Damaged Teeth by Third Molar Autotransplantation With the Use of Cone-Beam Computed Tomography and Rapid Prototyping.

    Science.gov (United States)

    Verweij, Jop P; Anssari Moin, David; Wismeijer, Daniel; van Merkesteyn, J P Richard

    2017-09-01

    This article describes the autotransplantation of third molars to replace heavily damaged premolars and molars. Specifically, this article reports on the use of preoperative cone-beam computed tomographic planning and 3-dimensional (3D) printed replicas of donor teeth to prepare artificial tooth sockets. In the present case, an 18-year-old patient underwent autotransplantation of 3 third molars to replace 1 premolar and 2 molars that were heavily damaged after trauma. Approximately 1 year after the traumatic incident, autotransplantation with the help of 3D planning and rapid prototyping was performed. The right maxillary third molar replaced the right maxillary first premolar. The 2 mandibular wisdom teeth replaced the left mandibular first and second molars. During the surgical procedure, artificial tooth sockets were prepared with the help of 3D printed donor tooth copies to prevent iatrogenic damage to the actual donor teeth. These replicas of the donor teeth were designed based on the preoperative cone-beam computed tomogram and manufactured with the help of 3D printing techniques. The use of a replica of the donor tooth resulted in a predictable and straightforward procedure, with extra-alveolar times shorter than 2 minutes for all transplantations. The transplanted teeth were placed in infraocclusion and fixed with a suture splint. Postoperative follow-up showed physiologic integration of the transplanted teeth and a successful outcome for all transplants. In conclusion, this technique facilitates a straightforward and predictable procedure for autotransplantation of third molars. The use of printed analogues of the donor teeth decreases the risk of iatrogenic damage and the extra-alveolar time of the transplanted tooth is minimized. This facilitates a successful outcome. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computer called a time-delay reservoir, which is constructed from the sampling of the solution of a time-delay differential equation, and show its good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting factual daily market realized volatilities computed with intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality of individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.
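
    The paper's reservoirs are built from a sampled time-delay differential equation; the sketch below instead uses the simpler, standard echo-state-network variant (a fixed random recurrent layer with a ridge-regression readout) on a toy one-step-ahead forecasting task, purely to illustrate the reservoir-computing paradigm. All sizes and constants are arbitrary choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy input series: a noisy sine wave; the task is one-step-ahead prediction
t = np.arange(3000)
u = np.sin(0.1 * t) + 0.05 * rng.standard_normal(t.size)

# fixed random reservoir (standard echo state network, not a time-delay reservoir)
N = 200
W_in = 0.5 * rng.uniform(-1, 1, N)
W = rng.uniform(-1, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # scale spectral radius to 0.9

def run_reservoir(inputs):
    x = np.zeros(N)
    states = np.empty((len(inputs), N))
    for k, u_k in enumerate(inputs):
        x = np.tanh(W @ x + W_in * u_k)         # standard tanh state update
        states[k] = x
    return states

X = run_reservoir(u[:-1])                       # states driven by u[0..T-2]
y = u[1:]                                       # targets: the next input value

# ridge-regression readout trained on early samples (warm-up discarded)
warm, train_end, lam = 100, 2000, 1e-6
A, b = X[warm:train_end], y[warm:train_end]
W_out = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ b)

pred = X[train_end:] @ W_out
rmse = np.sqrt(np.mean((pred - y[train_end:]) ** 2))
print(f"test RMSE: {rmse:.3f}")
```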

  2. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks, which can be trained to perform specific tasks to mainly solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to set a complete, scalable compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We show usability in solving real use cases from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.

  3. P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures

    International Nuclear Information System (INIS)

    2008-01-01

    1 - Description of program or function: P-CARES 2.0.0 (Probabilistic Computer Analysis for Rapid Evaluation of Structures) was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis. 2 - Methods: P-CARES is an update of the CARES program developed at Brookhaven National Laboratory during the 1980's. A major improvement is the enhanced analysis capability in which a probabilistic algorithm has been implemented to perform the probabilistic site response and soil-structure interaction (SSI) analyses. This is accomplished using several sampling techniques such as the Latin Hypercube sampling (LHC), engineering LHC, the Fekete Point Set method, and also the traditional Monte Carlo simulation. This new feature enhances the site response and SSI analysis such that the effect of uncertainty in local site soil properties can now be quantified. Another major addition to P-CARES is a graphical user interface (GUI) which significantly improves the performance of P-Cares in terms of the inter-relations among different functions of the program, and facilitates the input/output processing and execution management. It also provides many user friendly features that would allow an analyst to quickly develop insights from the analysis results. 3 - Restrictions on the complexity of the problem: None noted
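
    One of the sampling schemes named above, Latin Hypercube sampling, can be sketched generically as one stratified draw per equal-probability bin per variable, with the strata shuffled independently for every variable and then mapped through the inverse CDFs of the input distributions. The sketch below is generic; the distributions and parameters are illustrative and are not taken from P-CARES.

```python
import numpy as np
from scipy import stats

def latin_hypercube(n_samples, n_vars, rng):
    """Return an (n_samples, n_vars) array of LHS points in (0, 1)."""
    # one uniform draw inside each of n_samples equal-probability strata
    cut = (np.arange(n_samples) + rng.uniform(size=(n_vars, n_samples))) / n_samples
    # shuffle the strata independently for every variable (one row per variable)
    for row in cut:
        rng.shuffle(row)
    return cut.T

rng = np.random.default_rng(42)
u = latin_hypercube(100, 2, rng)

# map the uniform LHS points through inverse CDFs of the input distributions,
# e.g. a lognormal shear-wave velocity and a normal damping ratio (illustrative)
vs = stats.lognorm(s=0.3, scale=300.0).ppf(u[:, 0])      # m/s
damping = stats.norm(loc=0.05, scale=0.01).ppf(u[:, 1])
print(vs.mean(), damping.mean())
```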

  4. Rapid diagnosis of sepsis with TaqMan-Based multiplex real-time PCR.

    Science.gov (United States)

    Liu, Chang-Feng; Shi, Xin-Ping; Chen, Yun; Jin, Ye; Zhang, Bing

    2018-02-01

    The survival rate of septic patients depends mainly on a rapid and reliable diagnosis, so a rapid, broad-range, specific and sensitive quantitative diagnostic test is urgently needed. We therefore developed TaqMan-based multiplex real-time PCR assays to identify bloodstream pathogens within a few hours. Primers and TaqMan probes were designed to be complementary to conserved regions in the 16S rDNA gene of different kinds of bacteria. To evaluate accuracy, sensitivity, and specificity, known bacterial samples (standard strains and whole blood samples) were tested by TaqMan-based multiplex real-time PCR. In addition, 30 blood samples taken from patients with clinical symptoms of sepsis were tested by TaqMan-based multiplex real-time PCR and by blood culture. The mean positive-detection frequency for multiplex real-time PCR was 96% at a concentration of 100 CFU/mL, and 100% at concentrations greater than 1000 CFU/mL. All the known blood samples and standard strains were detected as positive by TaqMan-based multiplex PCR, and no PCR products were detected when DNAs from other bacteria were used in the multiplex assay. Among the 30 patients with clinical symptoms of sepsis, 18 patients were confirmed positive by multiplex real-time PCR and seven patients were confirmed positive by blood culture. The TaqMan-based multiplex real-time PCR assay, with high sensitivity, specificity and a broad detection range, is a rapid and accurate method for detecting the bacterial pathogens of sepsis and shows promise for use in its diagnosis. © 2017 Wiley Periodicals, Inc.

  5. Climate Data Provenance Tracking for Just-In-Time Computation

    Science.gov (United States)

    Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.

    2016-12-01

    The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.

  6. A note on computing average state occupation times

    Directory of Open Access Journals (Sweden)

    Jan Beyersmann

    2014-05-01

    Objective: This review discusses how biometricians would probably compute or estimate expected waiting times, if they had the data. Methods: Our framework is a time-inhomogeneous Markov multistate model, where all transition hazards are allowed to be time-varying. We assume that the cumulative transition hazards are given. That is, they are either known, as in a simulation, determined by expert guesses, or obtained via some method of statistical estimation. Our basic tool is product integration, which transforms the transition hazards into the matrix of transition probabilities. Product integration enjoys a rich mathematical theory, which has successfully been used to study probabilistic and statistical aspects of multistate models. Our emphasis will be on practical implementation of product integration, which allows us to numerically approximate the transition probabilities. Average state occupation times and other quantities of interest may then be derived from the transition probabilities.
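
    A minimal numerical sketch of the product-integration step described above: given increments dA(s) of the cumulative transition hazards on a time grid, the transition probability matrix is accumulated as P(0,t) = prod over s of (I + dA(s)), and expected state occupation times follow by summing the occupation probabilities over time. The three-state illness-death model and the constant hazards below are illustrative assumptions (the review allows arbitrary time-varying hazards).

```python
import numpy as np

# illness-death model: states 0 = healthy, 1 = ill, 2 = dead
# constant transition hazards (per year), chosen only for illustration
h01, h02, h12 = 0.10, 0.02, 0.25

grid = np.linspace(0, 10, 2001)          # time grid (years)
dt = np.diff(grid)

P = np.eye(3)                            # P(0, 0)
occupation = np.zeros(3)                 # expected years spent in each state
for step in dt:
    dA = np.array([[-(h01 + h02) * step,  h01 * step,  h02 * step],
                   [0.0,                 -h12 * step,  h12 * step],
                   [0.0,                  0.0,         0.0       ]])
    P = P @ (np.eye(3) + dA)             # product integral P(0,t) = prod(I + dA)
    occupation += P[0] * step            # occupation probabilities, starting healthy

print("P(0, 10 years) from state 0:", np.round(P[0], 3))
print("expected years in healthy/ill/dead over 10 years:", np.round(occupation, 2))
```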

  7. Design and implementation of a rapid-mixer flow cell for time-resolved infrared microspectroscopy

    International Nuclear Information System (INIS)

    Marinkovic, Nebojsa S.; Adzic, Aleksandar R.; Sullivan, Michael; Kovacs, Kevin; Miller, Lisa M.; Rousseau, Denis L.; Yeh, Syun-Ru; Chance, Mark R.

    2000-01-01

    A rapid mixer for the analysis of reactions in the millisecond and submillisecond time domains by Fourier-transform infrared microspectroscopy has been constructed. The cell was tested by examination of cytochrome-c folding kinetics. The device allows collection of full infrared spectral data on millisecond and faster time scales subsequent to chemical jump reaction initiation. The data quality is sufficiently good such that spectral fitting techniques could be applied to analysis of the data. Thus, this method provides an advantage over kinetic measurements at single wavelengths using infrared laser or diode sources, particularly where band overlap exists

  8. Rapid-mixing studies on the time-scale of radiation damage in cells

    International Nuclear Information System (INIS)

    Adams, G.E.; Michael, B.D.; Asquith, J.C.; Shenoy, M.A.; Watts, M.E.; Whillans, D.W.

    1975-01-01

    Rapid mixing studies were performed to determine the time scale of radiation damage in cells. There is evidence that the sensitizing effects of oxygen and other chemical dose-modifying agents on the response of cells to ionizing radiation involve fast free-radical processes. Fast response technique studies in bacterial systems have shown that extremely fast processes occur when the bacteria are exposed to oxygen or other dose-modifying agents during irradiation. The time scales observed were consistent with the involvement of fast free-radical reactions in the expression of these effects

  9. Real-time risk assessment in seismic early warning and rapid response: a feasibility study in Bishkek (Kyrgyzstan)

    Science.gov (United States)

    Picozzi, M.; Bindi, D.; Pittore, M.; Kieling, K.; Parolai, S.

    2013-04-01

    Earthquake early warning systems (EEWS) are considered to be an effective, pragmatic, and viable tool for seismic risk reduction in cities. While standard EEWS approaches focus on the real-time estimation of an earthquake's location and magnitude, innovative developments in EEWS include the capacity for the rapid assessment of damage. Clearly, for all public authorities that are engaged in coordinating emergency activities during and soon after earthquakes, real-time information about the potential damage distribution within a city is invaluable. In this work, we present a first attempt to design an early warning and rapid response procedure for real-time risk assessment. In particular, the procedure uses typical real-time information (i.e., P-wave arrival times and early waveforms) derived from a regional seismic network for locating and evaluating the size of an earthquake, information which in turn is exploited for extracting a risk map representing the potential distribution of damage from a dataset of predicted scenarios compiled for the target city. A feasibility study of the procedure is presented for the city of Bishkek, the capital of Kyrgyzstan, which is surrounded by the Kyrgyz seismic network by mimicking the ground motion associated with two historical events that occurred close to Bishkek, namely the 1911 Kemin ( M = 8.2; ±0.2) and the 1885 Belovodsk ( M = 6.9; ±0.5) earthquakes. Various methodologies from previous studies were considered when planning the implementation of the early warning and rapid response procedure for real-time risk assessment: the Satriano et al. (Bull Seismol Soc Am 98(3):1482-1494, 2008) approach to real-time earthquake location; the Caprio et al. (Geophys Res Lett 38:L02301, 2011) approach for estimating moment magnitude in real time; the EXSIM method for ground motion simulation (Motazedian and Atkinson, Bull Seismol Soc Am 95:995-1010, 2005); the Sokolov (Earthquake Spectra 161: 679-694, 2002) approach for estimating

  10. Computer network time synchronization the network time protocol on earth and in space

    CERN Document Server

    Mills, David L

    2010-01-01

    Carefully coordinated, reliable, and accurate time synchronization is vital to a wide spectrum of fields-from air and ground traffic control, to buying and selling goods and services, to TV network programming. Ill-gotten time could even lead to the unimaginable and cause DNS caches to expire, leaving the entire Internet to implode on the root servers.Written by the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol on Earth and in Space, Second Edition addresses the technological infrastructure of time dissemination, distrib

  11. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.

  12. In this issue: Time to replace doctors’ judgement with computers

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2015-11-01

    Full Text Available Informaticians continue to rise to the challenge, set by the English Health Minister, of trying to replace doctors' judgement with computers. This issue describes successes and where there are barriers. However, whilst there is progress, it tends to be incremental, and there are grand challenges to be overcome before computers can replace clinicians. These grand challenges include: (1) improving usability so that it is possible to more readily incorporate technology into clinical workflow; (2) rigorous new analytic methods that make use of the mass of available data, 'Big data', to create real-world evidence; (3) faster ways of meeting regulatory and legal requirements, including ensuring privacy; (4) provision of reimbursement models to fund innovative technology that can substitute for clinical time; and (5) recognition that innovations that improve quality also often increase cost. Informatics is more likely to support and augment clinical decision making than to replace clinicians.

  13. Space-Time Trellis Coded 8PSK Schemes for Rapid Rayleigh Fading Channels

    Directory of Open Access Journals (Sweden)

    Salam A. Zummo

    2002-05-01

    Full Text Available This paper presents the design of 8PSK space-time (ST) trellis codes suitable for rapid fading channels. The proposed codes utilize the design criteria of ST codes over rapid fading channels. Two different approaches have been used. The first approach maximizes the symbol-wise Hamming distance (HD) between signals leaving from or entering the same encoder state. In the second approach, set partitioning based on maximizing the sum of squared Euclidean distances (SSED) between the ST signals is performed; then, the branch-wise HD is maximized. The proposed codes were simulated over independent and correlated Rayleigh fading channels. Coding gains up to 4 dB have been observed over other ST trellis codes of the same complexity.
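
    The two design metrics named above are simple to compute; the sketch below (Python) evaluates the symbol-wise Hamming distance and the sum of squared Euclidean distances for a pair of 8PSK symbol sequences. The mapping and the example sequences are illustrative assumptions, not codewords from the paper.

        import numpy as np

        # Map 8PSK symbol indices (0..7) to unit-circle constellation points.
        def psk8(indices):
            return np.exp(1j * 2 * np.pi * np.asarray(indices) / 8)

        def symbolwise_hamming_distance(seq_a, seq_b):
            """Number of symbol positions in which two sequences differ."""
            return int(np.sum(np.asarray(seq_a) != np.asarray(seq_b)))

        def sum_squared_euclidean_distance(seq_a, seq_b):
            """Sum of squared Euclidean distances between the mapped 8PSK symbols."""
            return float(np.sum(np.abs(psk8(seq_a) - psk8(seq_b)) ** 2))

        # Illustrative pair of symbol sequences (not taken from the paper).
        a = [0, 3, 5, 1]
        b = [0, 7, 5, 2]
        print(symbolwise_hamming_distance(a, b))        # 2 differing positions
        print(sum_squared_euclidean_distance(a, b))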

  14. Rapid evolution in insect pests: the importance of space and time in population genomics studies.

    Science.gov (United States)

    Pélissié, Benjamin; Crossley, Michael S; Cohen, Zachary Paul; Schoville, Sean D

    2018-04-01

    Pest species in agroecosystems often exhibit patterns of rapid evolution to environmental and human-imposed selection pressures. Although the role of adaptive processes is well accepted, few insect pests have been studied in detail and most research has focused on selection at insecticide resistance candidate genes. Emerging genomic datasets provide opportunities to detect and quantify selection in insect pest populations, and address long-standing questions about mechanisms underlying rapid evolutionary change. We examine the strengths of recent studies that stratify population samples both in space (along environmental gradients and comparing ancestral vs. derived populations) and in time (using chronological sampling, museum specimens and comparative phylogenomics), resulting in critical insights on evolutionary processes, and providing new directions for studying pests in agroecosystems. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

    Full Text Available Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, given response-time guarantees from the servers, to decide which tasks should be offloaded so that the results arrive in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the error made by the algorithm against its complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
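
    As a rough illustration of the offloading decision being made, the sketch below brute-forces the choice for a tiny frame of tasks under a deliberately simplified model (a local execution time, an upload time that occupies the CPU, and a guaranteed server response time per task). The model and the numbers are our assumptions for illustration, not the paper's formulation or its linear-time/pseudo-polynomial algorithms.

        from itertools import product

        # Toy model (illustrative only): each task has a local execution time,
        # an upload time that occupies the CPU, and a guaranteed server response
        # time measured from the end of its upload. The frame is feasible if all
        # local work and all offloaded results complete by the common deadline.
        tasks = [
            # (local_time, upload_time, server_response_guarantee)
            (5.0, 1.0, 3.0),
            (8.0, 2.0, 4.0),
            (3.0, 0.5, 6.0),
        ]
        deadline = 12.0

        def frame_finish_time(offload_flags):
            cpu_time = 0.0        # when the CPU becomes free
            latest = 0.0          # latest completion over all tasks
            for (local, upload, response), offload in zip(tasks, offload_flags):
                if offload:
                    cpu_time += upload                      # CPU busy only for the upload
                    latest = max(latest, cpu_time + response)
                else:
                    cpu_time += local                       # CPU executes the task itself
                    latest = max(latest, cpu_time)
            return latest

        best = min(product([False, True], repeat=len(tasks)), key=frame_finish_time)
        print(best, frame_finish_time(best), frame_finish_time(best) <= deadline)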

  16. Real time simulation of large systems on mini-computer

    International Nuclear Information System (INIS)

    Nakhle, Michel; Roux, Pierre.

    1979-01-01

    Most simulation languages will only accept an explicit formulation of differential equations, and logical variables hold no special status therein. The integration step of the usual methods is limited by the smallest time constant of the submitted model. The NEPTUNIX 2 simulation software has a language that accepts implicit equations and an integration method whose variable step is not limited by the time constants of the model. This, together with extensive optimization of the time and memory resources of the generated code, makes NEPTUNIX 2 a basic tool for simulation on mini-computers. Since the logical variables are specific entities under centralized control, correct processing of discontinuities and synchronization with a real process are feasible. NEPTUNIX 2 is the industrial version of NEPTUNIX 1 [fr]

  17. Rapid expansion method (REM) for time‐stepping in reverse time migration (RTM)

    KAUST Repository

    Pestana, Reynam C.

    2009-01-01

    We show that the wave equation solution using a conventional finite‐difference scheme, derived commonly by the Taylor series approach, can be derived directly from the rapid expansion method (REM). After some mathematical manipulation we consider an analytical approximation for the Bessel function where we assume that the time step is sufficiently small. From this derivation we find that if we consider only the first two Chebyshev polynomials terms in the rapid expansion method we can obtain the second order time finite‐difference scheme that is frequently used in more conventional finite‐difference implementations. We then show that if we use more terms from the REM we can obtain a more accurate time integration of the wave field. Consequently, we have demonstrated that the REM is more accurate than the usual finite‐difference schemes and it provides a wave equation solution which allows us to march in large time steps without numerical dispersion and is numerically stable. We illustrate the method with post and pre stack migration results.
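
    As a brief worked illustration of the truncation argument (the notation here is ours, not quoted from the paper): writing the constant-density acoustic wave equation as ∂²p/∂t² = c²∇²p and setting L² = −c²∇², the exact two-step time evolution and its small-Δt expansion are

        p(t+\Delta t) + p(t-\Delta t) = 2\cos(L\,\Delta t)\,p(t), \qquad
        2\cos(L\,\Delta t) \approx 2 - \Delta t^{2}L^{2} = 2 + \Delta t^{2}c^{2}\nabla^{2},

    which gives the familiar second-order scheme

        p(t+\Delta t) \approx 2\,p(t) - p(t-\Delta t) + \Delta t^{2}c^{2}\nabla^{2}p(t).

    Retaining further Chebyshev-Bessel terms of the expansion of cos(LΔt) is what allows the larger, dispersion-free time steps reported above.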

  18. Response time distributions in rapid chess: A large-scale decision making experiment

    Directory of Open Access Journals (Sweden)

    Mariano Sigman

    2010-10-01

    Full Text Available Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively since current chess algorithms estimate precisely the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times and position value in rapid chess games. We measured robust emergent statistical observables: (1) response time (RT) distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which is yet an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.

  19. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  20. Ergonomic assessment of musculoskeletal disorders risk among the computer users by Rapid Upper Limb Assessment method

    Directory of Open Access Journals (Sweden)

    Ehsanollah Habibi

    2016-01-01

    Conclusion: This study showed that the frequency of musculoskeletal problems in the neck, back, elbow, and wrist was generally high among our subjects, and that ergonomic interventions such as redesigning computer workstations, educating users about ergonomic principles for computer work, and reducing working hours spent at the computer should be carried out.

  1. FRANTIC: a computer code for time dependent unavailability analysis

    International Nuclear Information System (INIS)

    Vesely, W.E.; Goldberg, F.F.

    1977-03-01

    The FRANTIC computer code evaluates the time dependent and average unavailability for any general system model. The code is written in FORTRAN IV for the IBM 370 computer. Non-repairable components, monitored components, and periodically tested components are handled. One unique feature of FRANTIC is the detailed, time dependent modeling of periodic testing which includes the effects of test downtimes, test overrides, detection inefficiencies, and test-caused failures. The exponential distribution is used for the component failure times and periodic equations are developed for the testing and repair contributions. Human errors and common mode failures can be included by assigning an appropriate constant probability for the contributors. The output from FRANTIC consists of tables and plots of the system unavailability along with a breakdown of the unavailability contributions. Sensitivity studies can be simply performed and a wide range of tables and plots can be obtained for reporting purposes. The FRANTIC code represents a first step in the development of an approach that can be of direct value in future system evaluations. Modifications resulting from use of the code, along with the development of reliability data based on operating reactor experience, can be expected to provide increased confidence in its use and potential application to the licensing process
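
    To make the kind of time-dependent contribution concrete, here is a minimal sketch (Python) of the instantaneous and average unavailability of a single periodically tested component with exponential failure times, a fixed test interval and a test downtime. The model and parameter values are illustrative assumptions only; they are not FRANTIC's actual input format or its full treatment of test overrides, detection inefficiencies and test-caused failures.

        import math

        # Instantaneous unavailability q(t) of one periodically tested component:
        # exponential failure rate lam, test interval T, test downtime tau at the
        # start of each interval (illustrative model only).
        def unavailability(t, lam, T, tau):
            t_in_cycle = t % T
            if t_in_cycle < tau:
                return 1.0                               # component down for testing
            since_test = t_in_cycle - tau
            return 1.0 - math.exp(-lam * since_test)     # latent failure since last test

        # Average unavailability over one test interval (simple numerical average).
        def average_unavailability(lam, T, tau, steps=10000):
            return sum(unavailability(i * T / steps, lam, T, tau) for i in range(steps)) / steps

        print(average_unavailability(lam=1e-4, T=720.0, tau=2.0))  # roughly tau/T + lam*T/2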

  2. Autotransplantation of immature third molars using a computer-aided rapid prototyping model: a report of 4 cases.

    Science.gov (United States)

    Jang, Ji-Hyun; Lee, Seung-Jong; Kim, Euiseong

    2013-11-01

    Autotransplantation of immature teeth can be an option for premature tooth loss in young patients as an alternative to immediate replacement with fixed or implant-supported prostheses. The present case series reports 4 successful autotransplantation cases using computer-aided rapid prototyping (CARP) models with immature third molars. The compromised upper and lower molars (n = 4) of patients aged 15-21 years were replaced by transplanted third molars using CARP models. Postoperatively, pulp vitality and root development were examined clinically and radiographically. The patient follow-up period was 2-7.5 years after surgery. The long-term follow-up showed that all of the transplants were asymptomatic and functional. Radiographic examination indicated that the apices developed continuously and the root length and thickness increased. The final follow-up examination revealed that all of the transplants maintained their vitality, and the apices were fully developed with normal periodontal ligaments and trabecular bony patterns. Based on long-term follow-up observations, our 4 cases of autotransplantation of immature teeth using CARP models resulted in favorable prognoses. The CARP model assisted in minimizing the extraoral time and possible injury to the Hertwig epithelial root sheath of the transplanted tooth. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  3. Variable dead time counters: 2. A computer simulation

    International Nuclear Information System (INIS)

    Hooton, B.W.; Lees, E.W.

    1980-09-01

    A computer model has been developed to give a pulse train which simulates that generated by a variable dead time counter (VDC) used in safeguards determination of Pu mass. The model is applied to two algorithms generally used for VDC analysis. It is used to determine their limitations at high counting rates and to investigate the effects of random neutrons from (α,n) reactions. Both algorithms are found to be deficient for use with masses of ²⁴⁰Pu greater than 100 g, and one commonly used algorithm is shown, by use of the model and also by theory, to yield a result which is dependent on the random neutron intensity. (author)

  4. Status of IGS Ultra-Rapid Products for Real-Time Applications

    Science.gov (United States)

    Ray, J.; Griffiths, J.

    2008-12-01

    Since November 2000 the International GNSS Service (IGS) has produced Ultra-rapid (IGU) products for near real-time and real-time applications. They include GPS orbits, satellite clocks, and Earth rotation parameters for a sliding 48-hr period. The first day of each update is based on the most recent GPS observational data from the IGS hourly tracking network. At the time of release, these observed products have an initial latency of 3 hr. The second day of each update consists of predictions. So the predictions between about 3 and 9 hr into the second half are relevant for true real-time uses. Originally updated twice daily, the IGU products since April 2004 have been issued four times per day, at 3, 9, 15, and 21 UTC. Up to seven Analysis Centers (ACs) contribute to the IGU combinations: Astronomical Institute of the University of Berne (AIUB), European Space Operations Center (ESOC), Geodetic Observatory Pecny (GOP), GeoForschungsZentrum (GFZ) Potsdam, Natural Resources Canada (NRC), Scripps Institution of Oceanography (SIO), and U.S. Naval Observatory (USNO). This redundancy affords a high measure of reliability and enhanced orbit accuracy. IGU orbit precision has improved markedly since late 2007. This is due to a combination of factors: decommissioning of the old, poorly behaved PRN29 in October 2007; upgraded procedures implemented by GOP around the same time, by SIO in spring 2008, and by USNO in June 2008; better handling of maneuvered satellites at the combination level starting June 2008; and stricter AC rejection criteria since July 2008. As a consequence, the weighted 1D RMS residual of the IGU orbit predictions over their first 6 hr is currently about 20 to 30 mm (after a Helmert transformation) compared to the IGS Rapid orbits, averaged over the constellation. The median residual is about 15 to 20 mm. When extended to the full 24 hr prediction period, the IGU orbit errors approximately double. Systematic rotational offsets are probably more important than

  5. [Real-time PCR in rapid diagnosis of Aeromonas hydrophila necrotizing soft tissue infections].

    Science.gov (United States)

    Kohayagawa, Yoshitaka; Izumi, Yoko; Ushita, Misuzu; Niinou, Norio; Koshizaki, Masayuki; Yamamori, Yuji; Kaneko, Sakae; Fukushima, Hiroshi

    2009-11-01

    We report a case of rapidly progressive necrotizing soft tissue infection and sepsis followed by the patient's death. We suspected Vibrio vulnificus infection because the patient's underlying disease was cirrhosis and the course was extremely rapid. No microbe had been detected at the time of death. We extracted DNA from a blood culture bottle. SYBR green I real-time PCR was conducted but could not detect V. vulnificus vvh in the DNA sample. Aeromonas hydrophila was cultured and identified in blood and necrotized tissue samples. Real-time PCR was conducted to detect A. hydrophila ahh1, AHCYTOEN and aerA in the DNA sample extracted from the blood culture bottle and in an isolated necrotized tissue strain, but only ahh1 was positive. The high mortality of necrotizing soft tissue infections makes it crucial to detect V. vulnificus and A. hydrophila quickly. We found real-time PCR for vvh, ahh1, AHCYTOEN, and aerA useful in detecting V. vulnificus and A. hydrophila in necrotizing soft tissue infections.

  6. Response time distributions in rapid chess: a large-scale decision making experiment.

    Science.gov (United States)

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively since current chess algorithms estimate precisely the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position value in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game, (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which is yet an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.
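
    The closing claim, that winning likelihood can be estimated from a weighted combination of remaining times and position evaluation, could be realized, for instance, by a logistic combination; the sketch below uses made-up weights purely to illustrate the idea and is not the authors' fitted model.

        import math

        # Illustrative only: a logistic combination of the time advantage (seconds)
        # and the engine evaluation (pawns, positive = better for the player).
        # The weights are placeholders, not values fitted in the study.
        W_TIME, W_EVAL, BIAS = 0.01, 0.9, 0.0

        def winning_likelihood(my_remaining_s, opp_remaining_s, eval_pawns):
            z = BIAS + W_TIME * (my_remaining_s - opp_remaining_s) + W_EVAL * eval_pawns
            return 1.0 / (1.0 + math.exp(-z))

        print(winning_likelihood(my_remaining_s=120, opp_remaining_s=60, eval_pawns=0.5))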

  7. Early Flood Detection for Rapid Humanitarian Response: Harnessing Near Real-Time Satellite and Twitter Signals

    Directory of Open Access Journals (Sweden)

    Brenden Jongman

    2015-10-01

    Full Text Available Humanitarian organizations have a crucial role in response and relief efforts after floods. The effectiveness of disaster response is contingent on accurate and timely information regarding the location, timing and impacts of the event. Here we show how two near-real-time data sources, satellite observations of water coverage and flood-related social media activity from Twitter, can be used to support rapid disaster response, using case studies in the Philippines and Pakistan. For these countries we analyze information from disaster response organizations, the Global Flood Detection System (GFDS) satellite flood signal, and flood-related Twitter activity analysis. The results demonstrate that these sources of near-real-time information can be used to gain a quicker understanding of the location, the timing, as well as the causes and impacts of floods. In terms of location, we produce daily impact maps based on both satellite information and social media, which can dynamically and rapidly outline the affected area during a disaster. In terms of timing, the results show that GFDS and/or Twitter signals flagging ongoing or upcoming flooding are regularly available one to several days before the event was reported to humanitarian organizations. In terms of event understanding, we show that both GFDS and social media can be used to detect and understand unexpected or controversial flood events, for example due to the sudden opening of hydropower dams or the breaching of flood protection. The performance of the GFDS and Twitter data for early detection and location mapping is mixed, depending on specific hydrological circumstances (GFDS) and social media penetration (Twitter). Further research is needed to improve the interpretation of the GFDS signal in different situations, and to improve the pre-processing of social media data for operational use.

  8. Rapidly re-computable EEG (electroencephalography) forward models for realistic head shapes

    International Nuclear Information System (INIS)

    Ermer, J.J.; Mosher, J.C.; Baillet, S.; Leahy, R.M.

    2001-01-01

    Solution of the EEG source localization (inverse) problem utilizing model-based methods typically requires a significant number of forward model evaluations. For subspace based inverse methods like MUSIC (6), the total number of forward model evaluations can often approach an order of 10³ or 10⁴. Techniques based on least-squares minimization may require significantly more evaluations. The observed set of measurements over an M-sensor array is often expressed as a linear forward spatio-temporal model of the form: F = GQ + N (1) where the observed forward field F (M-sensors x N-time samples) can be expressed in terms of the forward model G, a set of dipole moment(s) Q (3xP-dipoles x N-time samples) and additive noise N. Because of their simplicity, ease of computation, and relatively good accuracy, multi-layer spherical models (7) (or fast approximations described in (1), (7)) have traditionally been the 'forward model of choice' for approximating the human head. However, approximation of the human head via a spherical model does have several key drawbacks. By its very shape, the use of a spherical model distorts the true distribution of passive currents in the skull cavity. Spherical models also require that the sensor positions be projected onto the fitted sphere (Fig. 1), resulting in a distortion of the true sensor-dipole spatial geometry (and ultimately the computed surface potential). The use of a single 'best-fitted' sphere has the added drawback of incomplete coverage of the inner skull region, often ignoring areas such as the frontal cortex. In practice, this problem is typically countered by fitting additional sphere(s) to those region(s) not covered by the primary sphere. The use of these additional spheres results in added complication to the forward model. Using high-resolution spatial information obtained via X-ray CT or MR imaging, a realistic head model can be formed by tessellating the head into a set of contiguous regions (typically the scalp
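
    To make the notation concrete, the sketch below builds a toy instance of the spatio-temporal model F = GQ + N and shows where the large number of forward evaluations comes from: an inverse scan such as MUSIC evaluates a gain matrix for every candidate dipole location on a grid. The gain function here is a random stand-in, not a spherical or realistic head model.

        import numpy as np

        rng = np.random.default_rng(0)
        M, N, P = 64, 200, 2          # sensors, time samples, dipoles

        def forward_gain(location):
            """Stand-in for a head-model forward computation: M x 3 gain of one dipole.
            A real implementation would evaluate a spherical or realistic head model."""
            local_rng = np.random.default_rng(abs(hash(location)) % (2**32))
            return local_rng.standard_normal((M, 3))

        # Toy source configuration: P dipoles, each with a 3-component moment time series.
        locations = [(0.00, 0.00, 0.07), (0.02, -0.03, 0.06)]
        Q = rng.standard_normal((3 * P, N))
        G = np.hstack([forward_gain(loc) for loc in locations])   # gain matrix, M x 3P
        F = G @ Q + 0.1 * rng.standard_normal((M, N))             # F = GQ + N

        # Subspace scans evaluate the forward model over a grid of candidate
        # locations; this loop is where thousands of evaluations come from.
        grid = [(x, y, 0.07) for x in np.linspace(-0.05, 0.05, 30)
                             for y in np.linspace(-0.05, 0.05, 30)]
        scanned_gains = [forward_gain(loc) for loc in grid]        # 900 forward evaluations
        print(F.shape, len(scanned_gains))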

  9. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  10. Accessible high performance computing solutions for near real-time image processing for time critical applications

    Science.gov (United States)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming common place and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: 1. critical information can be provided faster and 2. more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs and (2) a CUDA enabled GPU workstation. The reference platform is a dual CPU-quad core workstation and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort is presented.
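
    For readers unfamiliar with GLCM texture measures, the sketch below computes a rotation-averaged co-occurrence matrix and its contrast statistic for a single image window using plain NumPy. It illustrates the family of statistics PANTEX builds on; it is not the published PANTEX implementation, and the toy windows are stand-ins for satellite image chips.

        import numpy as np

        def glcm_contrast(window, offsets=((0, 1), (1, 0), (1, 1), (1, -1)), levels=16):
            """Rotation-averaged GLCM contrast of one quantized image window."""
            q = np.floor(window.astype(float) / (window.max() + 1e-9) * levels)
            q = q.clip(0, levels - 1).astype(int)
            rows, cols = q.shape
            P = np.zeros((levels, levels))
            for dy, dx in offsets:                      # accumulate symmetric co-occurrences
                for y in range(rows):
                    for x in range(cols):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < rows and 0 <= xx < cols:
                            P[q[y, x], q[yy, xx]] += 1
                            P[q[yy, xx], q[y, x]] += 1
            P /= P.sum()
            i, j = np.indices(P.shape)
            return float(np.sum(P * (i - j) ** 2))      # GLCM "contrast" statistic

        # Built-up areas tend to show higher texture contrast than homogeneous fields.
        rng = np.random.default_rng(1)
        smooth = np.tile(np.arange(32), (32, 1))        # smooth gradient window
        noisy = rng.integers(0, 255, size=(32, 32))     # high-texture window
        print(glcm_contrast(smooth), glcm_contrast(noisy))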

  11. Research on rapid agile metrology for manufacturing based on real-time multitask operating system

    Science.gov (United States)

    Chen, Jihong; Song, Zhen; Yang, Daoshan; Zhou, Ji; Buckley, Shawn

    1996-10-01

    Rapid agile metrology for manufacturing (RAMM) using multiple non-contact sensors is likely to remain a growing trend in manufacturing. High-speed inspection systems for manufacturing are characterized by multiple tasks implemented in parallel and real-time events that occur simultaneously. In this paper, we introduce a real-time operating system into RAMM research. A general task model based on class-based object-oriented technology is proposed. A general multitask frame of a typical RAMM system using OPNet is discussed. Finally, an application example of a machine which inspects parts held on a carrier strip is described. With the RTOS and OPNet, this machine can measure two dimensions of the contacts at 300 parts/second.

  12. Development of a Rapid Insulin Assay by Homogenous Time-Resolved Fluorescence.

    Directory of Open Access Journals (Sweden)

    Zachary J Farino

    Full Text Available Direct measurement of insulin is critical for basic and clinical studies of insulin secretion. However, current methods are expensive and time-consuming. We developed an insulin assay based on homogenous time-resolved fluorescence that is significantly more rapid and cost-effective than current commonly used approaches. This assay was applied effectively to an insulin secreting cell line, INS-1E cells, as well as pancreatic islets, allowing us to validate the assay by elucidating mechanisms by which dopamine regulates insulin release. We found that dopamine functioned as a significant negative modulator of glucose-stimulated insulin secretion. Further, we showed that bromocriptine, a known dopamine D2/D3 receptor agonist and newly approved drug used for treatment of type II diabetes mellitus, also decreased glucose-stimulated insulin secretion in islets to levels comparable to those caused by dopamine treatment.

  13. UMTS rapid response real-time seismic networks: implementation and strategies at INGV

    Science.gov (United States)

    Govoni, Aladino; Margheriti, Lucia; Moretti, Milena; Lauciani, Valentino; Sensale, Gianpaolo; Bucci, Augusto; Criscuoli, Fabio

    2015-04-01

    The benefits of portable real-time seismic networks are several and well known. During the management of a temporary experiment from the real-time data it is possible to detect and fix rapidly problems with power supply, time synchronization, disk failures and, most important, seismic signal quality degradation due to unexpected noise sources or sensor alignment/tampering. This usually minimizes field maintenance trips and maximizes both the quantity and the quality of the acquired data. When the area of the temporary experiment is not well monitored by the local permanent network, the real-time data from the temporary experiment can be fed to the permanent network monitoring system improving greatly both the real-time hypocentral locations and the final revised bulletin. All these benefits apply also in case of seismic crises when rapid deployment stations can significantly contribute to the aftershock analysis. Nowadays data transmission using meshed radio networks or satellite systems is not a big technological problem for a permanent seismic network where each site is optimized for the device power consumption and is usually installed by properly specialized technicians that can configure transmission devices and align antennas. This is not usually practical for temporary networks and especially for rapid response networks where the installation time is the main concern. These difficulties are substantially lowered using the now widespread UMTS technology for data transmission. A small (but sometimes power hungry) properly configured device with an omnidirectional antenna must be added to the station assembly. All setups are usually configured before deployment and this allows for an easy installation also by untrained personnel. We describe here the implementation of a UMTS based portable seismic network for both temporary experiments and rapid response applications developed at INGV. The first field experimentation of this approach dates back to the 2009 L

  14. Improving multi-GNSS ultra-rapid orbit determination for real-time precise point positioning

    Science.gov (United States)

    Li, Xingxing; Chen, Xinghan; Ge, Maorong; Schuh, Harald

    2018-03-01

    Currently, with the rapid development of multi-constellation Global Navigation Satellite Systems (GNSS), real-time positioning and navigation are undergoing dramatic changes, with potential for a better performance. Providing more precise and reliable ultra-rapid orbits is critical for multi-GNSS real-time positioning, especially for the three emerging constellations Beidou, Galileo and QZSS, which are still under construction. In this contribution, we present a five-system precise orbit determination (POD) strategy to fully exploit the GPS + GLONASS + BDS + Galileo + QZSS observations from CDDIS + IGN + BKG archives for the realization of an hourly five-constellation ultra-rapid orbit update. After adopting the optimized 2-day POD solution (updated every hour), the predicted orbit accuracy can be obviously improved for all five satellite systems in comparison to the conventional 1-day POD solution (updated every 3 h). The orbit accuracy for the BDS IGSO satellites can be improved by about 80, 45 and 50% in the radial, cross and along directions, respectively, while the corresponding accuracy improvement for the BDS MEO satellites reaches about 50, 20 and 50% in the three directions, respectively. Furthermore, multi-GNSS real-time precise point positioning (PPP) ambiguity resolution has been performed using the improved precise satellite orbits. Numerous results indicate that combined GPS + BDS + GLONASS + Galileo (GCRE) kinematic PPP ambiguity resolution (AR) solutions can achieve the shortest time to first fix (TTFF) and highest positioning accuracy in all coordinate components. With the addition of the BDS, GLONASS and Galileo observations to the GPS-only processing, the GCRE PPP AR solution achieves the shortest average TTFF of 11 min with a 7° cutoff elevation, while the TTFF of the GPS-only, GR, GE and GC PPP AR solutions is 28, 15, 20 and 17 min, respectively. As the cutoff elevation increases, the reliability and accuracy of GPS-only PPP AR solutions

  15. Rapid, Time-Division Multiplexed, Direct Absorption- and Wavelength Modulation-Spectroscopy

    Directory of Open Access Journals (Sweden)

    Alexander Klein

    2014-11-01

    Full Text Available We present a tunable diode laser spectrometer with a novel, rapid time-multiplexed direct absorption- and wavelength modulation-spectroscopy operation mode. The new technique enhances the precision and dynamic range of a tunable diode laser absorption spectrometer without sacrificing accuracy. The spectroscopic technique combines the benefits of absolute concentration measurements using calibration-free direct tunable diode laser absorption spectroscopy (dTDLAS) with the enhanced noise rejection of wavelength modulation spectroscopy (WMS). In this work we demonstrate for the first time a 125 Hz time-division multiplexed (TDM) dTDLAS-WMS spectroscopic scheme by alternating the modulation of a DFB laser between a triangle ramp (dTDLAS) and an additional 20 kHz sinusoidal modulation (WMS). The absolute concentration measurement via the dTDLAS technique allows one to simultaneously calibrate the normalized 2f/1f signal of the WMS technique. A dTDLAS/WMS spectrometer at 1.37 µm for H2O detection was built for experimental validation of the multiplexing scheme over a concentration range from 50 to 3000 ppmV (0.1 MPa, 293 K). A precision of 190 ppbV was achieved with an absorption length of 12.7 cm and an averaging time of two seconds. Our results show a five-fold improvement in precision over the entire concentration range and a significantly decreased averaging time of the spectrometer.
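
    To visualize the time-division multiplexing, the sketch below alternates within each 8 ms cycle between a plain current ramp segment (standing in for the dTDLAS triangle scan) and the same ramp with a 20 kHz sine superimposed (the WMS phase). The sample rate, amplitudes and the 50/50 split of the cycle are illustrative assumptions, not the instrument's actual settings.

        import numpy as np

        FS = 1_000_000          # sample rate (Hz), illustrative
        CYCLE_HZ = 125          # one dTDLAS phase + one WMS phase per cycle
        WMS_HZ = 20_000         # sinusoidal modulation frequency

        def tdm_cycle():
            n = FS // CYCLE_HZ                      # samples per full cycle (8 ms)
            half = n // 2
            t = np.arange(half) / FS
            ramp = np.linspace(-1.0, 1.0, half)     # ramp segment (dTDLAS phase)
            wms = ramp + 0.05 * np.sin(2 * np.pi * WMS_HZ * t)   # ramp + 20 kHz sine (WMS phase)
            return np.concatenate([ramp, wms])

        drive = np.tile(tdm_cycle(), 3)             # three cycles of laser-current drive
        print(drive.shape, drive.min(), drive.max())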

  16. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    Science.gov (United States)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from

  17. Neural Computations in a Dynamical System with Multiple Time Scales

    Directory of Open Access Journals (Sweden)

    Yuanyuan Mi

    2016-09-01

    Full Text Available Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at single neurons, and short-term facilitation (STF) and depression (STD) at neuronal synapses. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what the computational benefit is for the brain to have such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in their dynamics. Three computational tasks are considered: persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  18. Chemistry, physics and time: the computer modelling of glassmaking.

    Science.gov (United States)

    Martlew, David

    2003-01-01

    A decade or so ago the remains of an early flat glass furnace were discovered in St Helens. Continuous glass production only became feasible after the Siemens Brothers demonstrated their continuous tank furnace at Dresden in 1870. One manufacturer of flat glass enthusiastically adopted the new technology and secretly explored many variations on this theme during the next fifteen years. Study of the surviving furnace remains using today's computer simulation techniques showed how, in 1887, that technology was adapted to the special demands of window glass making. Heterogeneous chemical reactions at high temperatures are required to convert the mixture of granular raw materials into the homogeneous glass needed for windows. Kinetics (and therefore the economics) of glassmaking is dominated by heat transfer and chemical diffusion as refractory grains are converted to highly viscous molten glass. Removal of gas bubbles in a sufficiently short period of time is vital for profitability, but the glassmaker must achieve this in a reaction vessel which is itself being dissolved by the molten glass. Design and operational studies of today's continuous tank furnaces need to take account of these factors, and good use is made of computer simulation techniques to shed light on the way furnaces behave and how improvements may be made. This paper seeks to show how those same techniques can be used to understand how the early Siemens continuous tank furnaces were designed and operated, and how the Victorian entrepreneurs succeeded in managing the thorny problems of what was, in effect, a vulnerable high temperature continuous chemical reactor.

  19. Rapid identification of ST131 Escherichia coli by a novel multiplex real-time allelic discrimination assay.

    Science.gov (United States)

    François, Patrice; Bonetti, Eve-Julie; Fankhauser, Carolina; Baud, Damien; Cherkaoui, Abdessalam; Schrenzel, Jacques; Harbarth, Stephan

    2017-09-01

    Escherichia coli sequence type 131 is increasingly described in severe hospital infections. We developed a real-time allelic discrimination assay for the rapid identification of E. coli ST131 isolates. This rapid assay represents an affordable alternative to sequence-based strategies prior to complete characterization of potentially highly virulent E. coli isolates. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. A rapid infusion protocol is safe for total dose iron polymaltose: time for change.

    Science.gov (United States)

    Garg, M; Morrison, G; Friedman, A; Lau, A; Lau, D; Gibson, P R

    2011-07-01

    Intravenous correction of iron deficiency by total dose iron polymaltose is inexpensive and safe, but current protocols entail prolonged administration over more than 4 h. This results in reduced patient acceptance, and hospital resource strain. We aimed to assess prospectively the safety of a rapid intravenous protocol and compare this with historical controls. Consecutive patients in whom intravenous iron replacement was indicated were invited to have up to 1.5 g iron polymaltose by a 58-min infusion protocol after an initial 15-min test dose without pre-medication. Infusion-related adverse events (AE) and delayed AE over the ensuing 5 days were also prospectively documented and graded as mild, moderate or severe. One hundred patients, 63 female, mean age 54 (range 18-85) years were studied. Thirty-four infusion-related AE to iron polymaltose occurred in a total of 24 patients--25 mild, 8 moderate and 1 severe; higher than previously reported for a slow protocol iron infusion. Thirty-one delayed AE occurred in 26 patients--26 mild, 3 moderate and 2 severe; similar to previously reported. All but five patients reported they would prefer iron replacement through the rapid protocol again. The presence of inflammatory bowel disease (IBD) predicted infusion-related reactions (54% vs 14% without IBD, P cost, resource utilization and time benefits for the patient and hospital system. © 2011 The Authors. Internal Medicine Journal © 2011 Royal Australasian College of Physicians.

  1. Rapid and Sensitive Lateral Flow Immunoassay Method for Procalcitonin (PCT) Based on Time-Resolved Immunochromatography

    Directory of Open Access Journals (Sweden)

    Xiang-Yang Shao

    2017-02-01

    Full Text Available Procalcitonin (PCT) is a current, frequently used marker for severe bacterial infection. The aim of this study was to develop a cost-effective detection kit for rapid quantitative and on-site detection of PCT. To develop the new PCT quantitative detection kit, a double-antibody sandwich immunofluorescent assay was employed based on time-resolved immunofluorescent assay (TRFIA) combined with lateral flow immunoassay (LFIA). The performance of the newly developed kit was evaluated in terms of linearity, precision, accuracy, and specificity. Two hundred thirty-four serum samples were enrolled to carry out the comparison test. The new PCT quantitative detection kit exhibited a higher sensitivity (0.08 ng/mL). The inter-assay coefficient of variation (CV) and the intra-assay CV were 5.4%–7.7% and 5.7%–13.4%, respectively. The recovery rates ranged from 93% to 105%. Furthermore, a high correlation (n = 234, r = 0.977, p < 0.0001) and consistency (Kappa = 0.875) were obtained when compared with the PCT kit from Roche Elecsys BRAHMS. Thus, the new quantitative method for detecting PCT has been successfully established. The results indicated that the newly developed system based on TRFIA combined with LFIA was suitable for rapid and on-site detection of PCT, and might be a useful platform for other biomarkers in point-of-care tests.

  2. Adjustment to subtle time constraints and power law learning in rapid serial visual presentation

    Directory of Open Access Journals (Sweden)

    Jacqueline Chakyung Shin

    2015-11-01

    Full Text Available We investigated whether attention could be modulated through the implicit learning of temporal information in a rapid serial visual presentation (RSVP) task. Participants identified two target letters among numeral distractors. The stimulus-onset asynchrony immediately following the first target (SOA1) varied at three levels (70, 98, and 126 ms), either randomly between trials or fixed within blocks of trials. Practice over three consecutive days resulted in a continuous improvement in the identification rate for both targets and attenuation of the attentional blink (AB), a decrement in target (T2) identification when presented 200-400 ms after another target (T1). Blocked SOA1s led to a faster rate of improvement in RSVP performance and more target order reversals relative to random SOA1s, suggesting that the implicit learning of SOA1 positively affected performance. The results also reveal power law learning curves for individual target identification as well as for the reduction in the AB decrement. These learning curves reflect the spontaneous emergence of skill through subtle attentional modulations rather than general attentional distribution. Together, the results indicate that implicit temporal learning can improve high-level and rapid cognitive processing, and they highlight the sensitivity and adaptability of the attentional system to subtle constraints in stimulus timing.

  3. A Swellable Microneedle Patch to Rapidly Extract Skin Interstitial Fluid for Timely Metabolic Analysis.

    Science.gov (United States)

    Chang, Hao; Zheng, Mengjia; Yu, Xiaojun; Than, Aung; Seeni, Razina Z; Kang, Rongjie; Tian, Jingqi; Khanh, Duong Phan; Liu, Linbo; Chen, Peng; Xu, Chenjie

    2017-10-01

    Skin interstitial fluid (ISF) is an emerging source of biomarkers for disease diagnosis and prognosis. Microneedle (MN) patch has been identified as an ideal platform to extract ISF from the skin due to its pain-free and easy-to-administrated properties. However, long sampling time is still a serious problem which impedes timely metabolic analysis. In this study, a swellable MN patch that can rapidly extract ISF is developed. The MN patch is made of methacrylated hyaluronic acid (MeHA) and further crosslinked through UV irradiation. Owing to the supreme water affinity of MeHA, this MN patch can extract sufficient ISF in a short time without the assistance of extra devices, which remarkably facilitates timely metabolic analysis. Due to covalent crosslinked network, the MN patch maintains the structure integrity in the swelling hydrated state without leaving residues in skin after usage. More importantly, the extracted ISF metabolites can be efficiently recovered from MN patch by centrifugation for the subsequent offline analysis of metabolites such as glucose and cholesterol. Given the recent trend of easy-to-use point-of-care devices for personal healthcare monitoring, this study opens a new avenue for the development of MN-based microdevices for sampling ISF and minimally invasive metabolic detection. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Rapid Large Earthquake and Run-up Characterization in Quasi Real Time

    Science.gov (United States)

    Bravo, F. J.; Riquelme, S.; Koch, P.; Cararo, S.

    2017-12-01

    Several tests in quasi real time have been conducted by the rapid response group at CSN (National Seismological Center) to characterize earthquakes in real time. These methods are known for their robustness and reliability in creating finite fault models (FFMs). The W-phase FFM inversion, the wavelet-domain FFM, and the body-wave FFM have been implemented in real time at CSN; all these algorithms run automatically and are triggered by the W-phase point source inversion. Dimensions (length and width) are predefined by adopting scaling laws for earthquakes in subduction zones. We tested this scheme on the last four major earthquakes that occurred in Chile: the 2010 Mw 8.8 Maule earthquake, the 2014 Mw 8.2 Iquique earthquake, the 2015 Mw 8.3 Illapel earthquake and the 7.6 Melinka earthquake. We obtain many solutions as time elapses, and for each one we calculate the run-up using an analytical formula. Our results are in agreement with FFMs already accepted by the scientific community as well as with run-up observations in the field.

  5. Control of force during rapid visuomotor force-matching tasks can be described by discrete time PID control algorithms.

    Science.gov (United States)

    Dideriksen, Jakob Lund; Feeney, Daniel F; Almuklass, Awad M; Enoka, Roger M

    2017-08-01

    Force trajectories during rapid visuomotor force-matching tasks involving isometric contractions vary substantially across individuals. In this study, we investigated whether this variability can be explained by discrete-time proportional-integral-derivative (PID) control algorithms with varying model parameters. To this end, we analyzed the pinch force trajectories of 24 subjects performing two rapid force-matching tasks with visual feedback. Both tasks involved isometric contractions to a target force of 10% maximal voluntary contraction. One task involved a single action (pinch) and the other required a double action (concurrent pinch and wrist extension). 50,000 force trajectories were simulated with a computational neuromuscular model whose input was determined by a PID controller with different PID gains and frequencies at which the controller adjusted muscle commands. The goal was to find the best match between each experimental force trajectory and all simulated trajectories. It was possible to identify one realization of the PID controller that matched the experimental force produced during each task for most subjects (average index of similarity: 0.87 ± 0.12; 1 = perfect similarity). The similarities for both tasks were significantly greater than would be expected by chance (single action: p = 0.01; double action: p = 0.04). Furthermore, the identified control frequencies of the simulated PID controllers with the greatest similarities decreased as task difficulty increased (single action: 4.0 ± 1.8 Hz; double action: 3.1 ± 1.3 Hz). Overall, the results indicate that discrete-time PID controllers are realistic models for the neural control of force in rapid force-matching tasks involving isometric contractions.
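
    For reference, a discrete-time PID force controller of the general kind fitted in the study can be written in a few lines; the gains, the control update rate and the first-order "muscle" plant below are placeholders chosen for illustration, not parameters identified from the subjects.

        # Minimal discrete-time PID force controller driving a toy first-order "muscle"
        # plant toward a 10% MVC target. Gains, control frequency, and plant dynamics
        # are illustrative placeholders, not values identified in the study.
        KP, KI, KD = 0.8, 0.8, 0.05
        CONTROL_HZ = 4.0                 # rate at which the controller adjusts the command
        SIM_DT = 0.001                   # simulation step (s)
        TARGET = 0.10                    # target force (fraction of MVC)

        force, command = 0.0, 0.0
        integral, prev_error = 0.0, 0.0
        steps_per_update = int(1.0 / (CONTROL_HZ * SIM_DT))

        for step in range(int(5.0 / SIM_DT)):            # simulate 5 s
            if step % steps_per_update == 0:             # discrete controller update
                error = TARGET - force
                integral += error / CONTROL_HZ
                derivative = (error - prev_error) * CONTROL_HZ
                prev_error = error
                command = KP * error + KI * integral + KD * derivative
            # first-order plant: force lags the command with a 150 ms time constant
            force += SIM_DT * (command - force) / 0.15

        print(round(force, 4))           # approaches the 0.10 target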

  6. The application of digital computers to near-real-time processing of flutter test data

    Science.gov (United States)

    Hurley, S. R.

    1976-01-01

    Procedures used in monitoring, analyzing, and displaying flight and ground flutter test data are presented. These procedures include three digital computer programs developed to process structural response data in near real time. Qualitative and quantitative modal stability data are derived from time history response data resulting from rapid sinusoidal frequency sweep forcing functions, tuned-mode quick stops, and pilot induced control pulses. The techniques have been applied to both fixed and rotary wing aircraft, during flight, whirl tower rotor systems tests, and wind tunnel flutter model tests. An hydraulically driven oscillatory aerodynamic vane excitation system utilized during the flight flutter test programs accomplished during Lockheed L-1011 and S-3A development is described.

  7. Rapid estimation of split renal function in kidney donors using software developed for computed tomographic renal volumetry

    International Nuclear Information System (INIS)

    Kato, Fumi; Kamishima, Tamotsu; Morita, Ken; Muto, Natalia S.; Okamoto, Syozou; Omatsu, Tokuhiko; Oyama, Noriko; Terae, Satoshi; Kanegae, Kakuko; Nonomura, Katsuya; Shirato, Hiroki

    2011-01-01

    Purpose: To evaluate the speed and precision of split renal volume (SRV) measurement, which is the ratio of unilateral renal volume to bilateral renal volume, using a newly developed software for computed tomographic (CT) volumetry and to investigate the usefulness of SRV for the estimation of split renal function (SRF) in kidney donors. Method: Both dynamic CT and renal scintigraphy in 28 adult potential living renal donors were the subjects of this study. We calculated SRV using the newly developed volumetric software built into a PACS viewer (n-SRV), and compared it with SRV calculated using a conventional workstation, ZIOSOFT (z-SRV). The correlation with split renal function (SRF) using 99mTc-DMSA scintigraphy was also investigated. Results: The time required for volumetry of bilateral kidneys with the newly developed software (16.7 ± 3.9 s) was significantly shorter than that of the workstation (102.6 ± 38.9 s, p < 0.0001). The results of n-SRV (49.7 ± 4.0%) were highly consistent with those of z-SRV (49.9 ± 3.6%), with a mean discrepancy of 0.12 ± 0.84%. The SRF also agreed well with the n-SRV, with a mean discrepancy of 0.25 ± 1.65%. The dominant side determined by SRF and n-SRV showed agreement in 26 of 28 cases (92.9%). Conclusion: The newly developed software for CT volumetry was more rapid than the conventional workstation volumetry and just as accurate, and was suggested to be useful for the estimation of SRF and thus the dominant side in kidney donors.
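
    For context, the quantity being compared is simple once the kidneys are segmented: split renal volume is the ratio of one kidney's volume to the total. A minimal sketch of that computation from a labeled CT mask (the label values and voxel size are illustrative assumptions, not the software's interface) is:

        import numpy as np

        # Illustrative labels: 1 = left kidney, 2 = right kidney in a segmented CT volume.
        def split_renal_volume(label_volume, voxel_volume_ml, left_label=1, right_label=2):
            left = np.count_nonzero(label_volume == left_label) * voxel_volume_ml
            right = np.count_nonzero(label_volume == right_label) * voxel_volume_ml
            return left / (left + right)        # SRV of the left kidney (fraction of total)

        mask = np.zeros((40, 40, 40), dtype=np.uint8)
        mask[5:25, 5:20, 10:30] = 1             # toy "left kidney"
        mask[5:26, 22:37, 10:30] = 2            # toy "right kidney"
        print(round(100 * split_renal_volume(mask, voxel_volume_ml=0.002), 1), "%")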

  8. Rapid estimation of split renal function in kidney donors using software developed for computed tomographic renal volumetry

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Fumi, E-mail: fumikato@med.hokudai.ac.jp [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan); Kamishima, Tamotsu, E-mail: ktamotamo2@yahoo.co.jp [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan); Morita, Ken, E-mail: kenordic@carrot.ocn.ne.jp [Department of Urology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, 060-8638 (Japan); Muto, Natalia S., E-mail: nataliamuto@gmail.com [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan); Okamoto, Syozou, E-mail: shozo@med.hokudai.ac.jp [Department of Nuclear Medicine, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, 060-8638 (Japan); Omatsu, Tokuhiko, E-mail: omatoku@nirs.go.jp [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan); Oyama, Noriko, E-mail: ZAT04404@nifty.ne.jp [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan); Terae, Satoshi, E-mail: saterae@med.hokudai.ac.jp [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan); Kanegae, Kakuko, E-mail: IZW00143@nifty.ne.jp [Department of Nuclear Medicine, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, 060-8638 (Japan); Nonomura, Katsuya, E-mail: k-nonno@med.hokudai.ac.jp [Department of Urology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, 060-8638 (Japan); Shirato, Hiroki, E-mail: shirato@med.hokudai.ac.jp [Department of Radiology, Hokkaido University Graduate School of Medicine, N15, W7, Kita-ku, Sapporo, Hokkaido 060-8638 (Japan)

    2011-07-15

    Purpose: To evaluate the speed and precision of split renal volume (SRV) measurement, which is the ratio of unilateral renal volume to bilateral renal volume, using a newly developed software for computed tomographic (CT) volumetry and to investigate the usefulness of SRV for the estimation of split renal function (SRF) in kidney donors. Method: Both dynamic CT and renal scintigraphy in 28 adult potential living renal donors were the subjects of this study. We calculated SRV using the newly developed volumetric software built into a PACS viewer (n-SRV), and compared it with SRV calculated using a conventional workstation, ZIOSOFT (z-SRV). The correlation with split renal function (SRF) using 99mTc-DMSA scintigraphy was also investigated. Results: The time required for volumetry of bilateral kidneys with the newly developed software (16.7 ± 3.9 s) was significantly shorter than that of the workstation (102.6 ± 38.9 s, p < 0.0001). The results of n-SRV (49.7 ± 4.0%) were highly consistent with those of z-SRV (49.9 ± 3.6%), with a mean discrepancy of 0.12 ± 0.84%. The SRF also agreed well with the n-SRV, with a mean discrepancy of 0.25 ± 1.65%. The dominant side determined by SRF and n-SRV showed agreement in 26 of 28 cases (92.9%). Conclusion: The newly developed software for CT volumetry was more rapid than the conventional workstation volumetry and just as accurate, and was suggested to be useful for the estimation of SRF and thus the dominant side in kidney donors.
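
    The core computation the abstract describes is simple enough to sketch in a few lines. The following Python fragment is an illustrative sketch only (not the software evaluated above): it assumes per-kidney volumes have already been obtained from CT volumetry and merely forms the SRV ratio and the dominant side.

        # Minimal sketch (not the paper's software): given per-kidney volumes from CT
        # volumetry, compute split renal volume (SRV) and the dominant side.

        def split_renal_volume(left_cm3: float, right_cm3: float):
            """Return (left SRV %, right SRV %, dominant side)."""
            total = left_cm3 + right_cm3
            left_srv = 100.0 * left_cm3 / total
            right_srv = 100.0 * right_cm3 / total
            dominant = "left" if left_srv > right_srv else "right"
            return left_srv, right_srv, dominant

        # Example: 142 cm^3 left, 151 cm^3 right -> SRV ~48.5% / ~51.5%, right dominant
        print(split_renal_volume(142.0, 151.0))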

  9. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Hongbin; Wang Chunyan; Qi Yao [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); University of Chinese Academy of Sciences, Beijing 100039 (China); Song Fengrui, E-mail: songfr@ciac.jl.cn [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); Liu Zhiqiang; Liu Shuying [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China)

    2012-11-08

    Highlights: ► DART MS combined with PCA and HCA was used to rapidly identify markers of Radix Aconiti. ► The DART MS behavior of six aconitine-type alkaloids was investigated. ► Chemical markers were recognized between the qualified and unqualified samples. ► DART MS was shown to be an effective tool for quality control of Radix Aconiti Preparata. - Abstract: This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a world widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for quality assessment of toxic herbal medicine.

  10. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry

    International Nuclear Information System (INIS)

    Zhu Hongbin; Wang Chunyan; Qi Yao; Song Fengrui; Liu Zhiqiang; Liu Shuying

    2012-01-01

    Highlights: ► DART MS combined with PCA and HCA was used to rapidly identify markers of Radix Aconiti. ► The DART MS behavior of six aconitine-type alkaloids was investigated. ► Chemical markers were recognized between the qualified and unqualified samples. ► DART MS was shown to be an effective tool for quality control of Radix Aconiti Preparata. - Abstract: This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a world widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for quality assessment of toxic herbal medicine.

  11. Real-Time Digital Bright Field Technology for Rapid Antibiotic Susceptibility Testing.

    Science.gov (United States)

    Canali, Chiara; Spillum, Erik; Valvik, Martin; Agersnap, Niels; Olesen, Tom

    2018-01-01

    Optical scanning through bacterial samples and image-based analysis may provide a robust method for bacterial identification, fast estimation of growth rates and their modulation due to the presence of antimicrobial agents. Here, we describe an automated digital, time-lapse, bright field imaging system (oCelloScope, BioSense Solutions ApS, Farum, Denmark) for rapid and higher throughput antibiotic susceptibility testing (AST) of up to 96 bacteria-antibiotic combinations at a time. The imaging system consists of a digital camera, an illumination unit and a lens where the optical axis is tilted 6.25° relative to the horizontal plane of the stage. Such tilting grants more freedom of operation at both high and low concentrations of microorganisms. When considering a bacterial suspension in a microwell, the oCelloScope acquires a sequence of 6.25°-tilted images to form an image Z-stack. The stack contains the best-focus image, as well as the adjacent out-of-focus images (which contain progressively more out-of-focus bacteria, the further the distance from the best-focus position). The acquisition process is repeated over time, so that the time-lapse sequence of best-focus images is used to generate a video. The setting of the experiment, image analysis and generation of time-lapse videos can be performed through dedicated software (UniExplorer, BioSense Solutions ApS). The acquired images can be processed for online and offline quantification of several morphological parameters, microbial growth, and inhibition over time.

  12. Time sequential single photon emission computed tomography studies in brain tumour using thallium-201

    International Nuclear Information System (INIS)

    Ueda, Takashi; Kaji, Yasuhiro; Wakisaka, Shinichiro; Watanabe, Katsushi; Hoshi, Hiroaki; Jinnouchi, Seishi; Futami, Shigemi

    1993-01-01

    Time sequential single photon emission computed tomography (SPECT) studies using thallium-201 were performed in 25 patients with brain tumours to evaluate the kinetics of thallium in the tumour and the biological malignancy grade preoperatively. After acquisition and reconstruction of SPECT data from 1 min post injection to 48 h (1, 2, 3, 4, 5, 6, 7, 8, 9, 10 and 15-20 min, followed by 4-6, 24 and 48 h), the thallium uptake ratio in the tumour versus the homologous contralateral area of the brain was calculated and compared with findings of X-ray CT, magnetic resonance imaging, cerebral angiography and histological investigations. Early uptake of thallium in tumours was related to tumour vascularity and the disruption of the blood-brain barrier. High and rapid uptake and slow reduction of thallium indicated a hypervascular malignant tumour; however, high and rapid uptake but rapid reduction of thallium indicated a hypervascular benign tumour, such as meningioma. Hypovascular and benign tumours tended to show low uptake and slow reduction of thallium. Long-lasting retention or uptake of thallium indicates tumour malignancy. (orig.)

  13. The Onset and Time Course of Semantic Priming during Rapid Recognition of Visual Words

    Science.gov (United States)

    Hoedemaker, Renske S.; Gordon, Peter C.

    2016-01-01

    In two experiments, we assessed the effects of response latency and task-induced goals on the onset and time course of semantic priming during rapid processing of visual words as revealed by ocular response tasks. In Experiment 1 (Ocular Lexical Decision Task), participants performed a lexical decision task using eye-movement responses on a sequence of four words. In Experiment 2, the same words were encoded for an episodic recognition memory task that did not require a meta-linguistic judgment. For both tasks, survival analyses showed that the earliest-observable effect (Divergence Point or DP) of semantic priming on target-word reading times occurred at approximately 260 ms, and ex-Gaussian distribution fits revealed that the magnitude of the priming effect increased as a function of response time. Together, these distributional effects of semantic priming suggest that the influence of the prime increases when target processing is more effortful. This effect does not require that the task include a metalinguistic judgment; manipulation of the task goals across experiments affected the overall response speed but not the location of the DP or the overall distributional pattern of the priming effect. These results are more readily explained as the result of a retrospective rather than a prospective priming mechanism and are consistent with compound-cue models of semantic priming. PMID:28230394

  14. Rapid quantitative detection of Lactobacillus sakei in meat and fermented sausages by real-time PCR.

    Science.gov (United States)

    Martín, Belén; Jofré, Anna; Garriga, Margarita; Pla, Maria; Aymerich, Teresa

    2006-09-01

    A quick and simple method for quantitative detection of Lactobacillus sakei in fermented sausages was successfully developed. It is based on Chelex-100-based DNA purification and real-time PCR enumeration using a TaqMan fluorescence probe. Primers and probes were designed in the L. sakei 16S-23S rRNA intergenic transcribed spacer region, and the assay was evaluated using L. sakei genomic DNA and an artificially inoculated sausage model. The detection limit of this technique was approximately 3 cells per reaction mixture using both purified DNA and the inoculated sausage model. The quantification limit was established at 30 cells per reaction mixture in both models. The assay was then applied to enumerate L. sakei in real samples, and the results were compared to the MRS agar count method followed by confirmation of the percentage of L. sakei colonies. The results obtained by real-time PCR were not statistically significantly different than those obtained by plate count on MRS agar (P > 0.05), showing a satisfactory agreement between both methods. Therefore, the real-time PCR assay developed can be considered a promising rapid alternative method for the quantification of L. sakei and evaluation of the implantation of starter strains of L. sakei in fermented sausages.
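
    The abstract reports enumeration from a TaqMan real-time PCR assay. As a hedged illustration of how such quantification is commonly done (the study's exact calibration procedure is not given here), the sketch below fits a standard curve of Ct against known cell numbers and converts a sample Ct into an estimated cell count; all numbers are hypothetical.

        # Illustrative sketch of standard-curve quantification for real-time PCR
        # (generic approach; the study's exact calibration is not specified here).
        import numpy as np

        # Hypothetical calibration: Ct values measured for known cell numbers
        cells_std = np.array([3e1, 3e2, 3e3, 3e4, 3e5])      # cells per reaction
        ct_std    = np.array([33.1, 29.8, 26.4, 23.0, 19.7])  # measured Ct

        # Fit Ct = slope * log10(cells) + intercept
        slope, intercept = np.polyfit(np.log10(cells_std), ct_std, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0               # amplification efficiency

        def quantify(ct_sample: float) -> float:
            """Estimate cells per reaction from a sample Ct via the standard curve."""
            return 10 ** ((ct_sample - intercept) / slope)

        print(f"PCR efficiency ~ {efficiency:.2%}, Ct 27.5 -> {quantify(27.5):.0f} cells")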

  15. Rapid Detection of Biological and Chemical Threat Agents Using Physical Chemistry, Active Detection, and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung; Dong, Li; Fu, Rong; Liotta, Lance; Narayanan, Aarthi; Petricoin, Emanuel; Ross, Mark; Russo, Paul; Zhou, Weidong; Luchini, Alessandra; Manes, Nathan; Chertow, Jessica; Han, Suhua; Kidd, Jessica; Senina, Svetlana; Groves, Stephanie

    2007-01-01

    Basic technologies have been successfully developed within this project: rapid collection of aerosols and a rapid ultra-sensitive immunoassay technique. Water-soluble, humidity-resistant polyacrylamide nano-filters were shown to (1) capture aerosol particles as small as 20 nm, (2) work in humid air and (3) completely liberate their captured particles in an aqueous solution compatible with the immunoassay technique. The immunoassay technology developed within this project combines electrophoretic capture with magnetic bead detection. It allows detection of as few as 150-600 analyte molecules or viruses in only three minutes, something no other known method can duplicate. The technology can be used in a variety of applications where speed of analysis and/or extremely low detection limits are of great importance: in rapid analysis of donor blood for hepatitis, HIV and other blood-borne infections in emergency blood transfusions, in trace analysis of pollutants, or in search of biomarkers in biological fluids. Combined in a single device, the water-soluble filter and ultra-sensitive immunoassay technique may solve the problem of early warning type detection of aerosolized pathogens. These two technologies are protected with five patent applications and are ready for commercialization.

  16. Addressing unmet need for HIV testing in emergency care settings: a role for computer-facilitated rapid HIV testing?

    Science.gov (United States)

    Kurth, Ann E; Severynen, Anneleen; Spielberg, Freya

    2013-08-01

    HIV testing in emergency departments (EDs) remains underutilized. The authors evaluated a computer tool to facilitate rapid HIV testing in an urban ED. Nonacute adult ED patients were randomly assigned to a computer tool (CARE) and rapid HIV testing before a standard visit (n = 258) or to a standard visit (n = 259) with chart access. The authors assessed intervention acceptability and compared noted HIV risks. Participants were 56% non-White and 58% male; median age was 37 years. In the CARE arm, nearly all (251/258) of the patients completed the session and received HIV results; four declined to consent to the test. HIV risks were reported by 54% of users; one participant was confirmed HIV-positive, and two were confirmed false-positive (seroprevalence 0.4%, 95% CI [0.01, 2.2]). Half (55%) of the patients preferred computerized rather than face-to-face counseling for future HIV testing. In the standard arm, one HIV test and two referrals for testing occurred. Computer-facilitated HIV testing appears acceptable to ED patients. Future research should assess cost-effectiveness compared with staff-delivered approaches.

  17. Time-Domain Terahertz Computed Axial Tomography NDE System

    Science.gov (United States)

    Zimdars, David

    2012-01-01

    NASA has identified the need for advanced non-destructive evaluation (NDE) methods to characterize aging and durability in aircraft materials to improve the safety of the nation's airline fleet. 3D THz tomography can play a major role in detection and characterization of flaws and degradation in aircraft materials, including Kevlar-based composites and Kevlar and Zylon fabric covers for soft-shell fan containment where aging and durability issues are critical. A prototype computed tomography (CT) time-domain (TD) THz imaging system has been used to generate 3D images of several test objects including a TUFI tile (a thermal protection system tile used on the Space Shuttle and possibly the Orion or similar capsules). This TUFI tile had simulated impact damage that was located and the depth of damage determined. The CT motion control gantry was designed and constructed, and then integrated with a T-Ray 4000 control unit and motion controller to create a complete CT TD-THz imaging system prototype. A data collection software script was developed that takes multiple z-axis slices in sequence and saves the data for batch processing. The data collection software was integrated with the ability to batch process the slice data with the CT TD-THz image reconstruction software. The time required to take a single CT slice was decreased from six minutes to approximately one minute by replacing the 320 ps, 100-Hz waveform acquisition system with an 80 ps, 1,000-Hz waveform acquisition system. The TD-THz computed tomography system was built from pre-existing commercial off-the-shelf subsystems. A CT motion control gantry was constructed from COTS components that can handle larger samples. The motion control gantry allows inspection of sample sizes of up to approximately one cubic foot (0.03 cubic meters). The system reduced to practice a CT-TDTHz system incorporating a COTS 80-ps/1-kHz waveform scanner. The incorporation of this scanner in the system allows acquisition of 3D

  18. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    Science.gov (United States)

    Naden, Levi N; Shirts, Michael R

    2016-04-12

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost to estimate thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. The existence of regions of poor configuration space overlap is detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states and use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free
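
    The speed-up described above rests on writing the potential energy as a linear combination of basis functions, so that energies at unsampled parameter combinations can be recombined from stored per-configuration basis energies without new simulations. The sketch below illustrates only that recombination step with synthetic arrays (the MBAR reweighting itself, and the actual basis functions, are omitted); all names and sizes are assumptions.

        # Sketch of the linear-basis-function idea only: if
        # U(x; params) = sum_i h_i(params) * u_i(x), then storing per-configuration
        # basis energies u_i(x) lets us evaluate U at arbitrary parameter
        # combinations without re-running any simulation.
        import numpy as np

        rng = np.random.default_rng(0)
        n_frames, n_basis = 1000, 3
        # hypothetical stored basis energies u_i(x) for each sampled configuration x
        u_basis = rng.normal(size=(n_basis, n_frames))

        def potential_at(coeffs):
            """Per-frame potential energy U(x; params) = sum_i h_i(params) * u_i(x)."""
            return np.asarray(coeffs) @ u_basis               # shape (n_frames,)

        # energies for a grid of trial parameter combinations, with no new sampling
        grid = rng.uniform(0.0, 2.0, size=(5000, n_basis))    # h_i values per combination
        energies = grid @ u_basis                             # (5000, n_frames)
        print(energies.shape)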

  19. Effect of a Real-Time Electronic Dashboard on a Rapid Response System.

    Science.gov (United States)

    Fletcher, Grant S; Aaronson, Barry A; White, Andrew A; Julka, Reena

    2017-11-20

    A rapid response system (RRS) may have limited effectiveness when inpatient providers fail to recognize signs of early patient decompensation. We evaluated the impact of an electronic medical record (EMR)-based alerting dashboard on outcomes associated with RRS activation. We used a repeated treatment study in which the dashboard display was successively turned on and off each week for ten 2-week cycles over a 20-week period on the inpatient acute care wards of an academic medical center. The Rapid Response Team (RRT) dashboard displayed all hospital patients in a single view ranked by severity score, updated in real time. The dashboard could be seen within the EMR by any provider, including RRT members. The primary outcomes were the incidence rate ratio (IRR) of all RRT activations, unexpected ICU transfers, cardiopulmonary arrests and deaths on general medical-surgical wards (wards). We conducted an exploratory analysis of first RRT activations. There were 6736 eligible admissions during the 20-week study period. There was no change in overall RRT activations (IRR = 1.14, p = 0.07), but a significant increase in first RRT activations (IRR = 1.20, p = 0.04). There were no significant differences in unexpected ICU transfers (IRR = 1.15, p = 0.25), cardiopulmonary arrests on general wards (IRR = 1.46, p = 0.43), or deaths on general wards (IRR = 0.96, p = 0.89). The introduction of the RRT dashboard was associated with increased initial RRT activations but not overall activations, unexpected ICU transfers, cardiopulmonary arrests, or death. The RRT dashboard is a novel tool to help providers recognize patient decompensation and may improve initial RRT notification.
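
    The primary outcomes above are incidence rate ratios. As a minimal illustration of the quantity being compared (not the study's actual model or data), the sketch below computes an IRR with a Wald 95% confidence interval from hypothetical event counts and exposures.

        # Sketch: incidence rate ratio (IRR) with a Wald 95% CI from event counts and
        # exposure (hypothetical numbers, not the study's data).
        import math

        def irr_with_ci(events_on, exposure_on, events_off, exposure_off, z=1.96):
            rate_on = events_on / exposure_on
            rate_off = events_off / exposure_off
            irr = rate_on / rate_off
            # standard error of log(IRR) for Poisson counts
            se_log = math.sqrt(1.0 / events_on + 1.0 / events_off)
            return irr, (irr * math.exp(-z * se_log), irr * math.exp(z * se_log))

        # e.g. 120 RRT activations over 3400 admissions (dashboard on)
        #  vs. 100 activations over 3336 admissions (dashboard off)
        print(irr_with_ci(120, 3400, 100, 3336))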

  20. The use of bivalves as rapid, real-time indicators of aquatic pollution

    International Nuclear Information System (INIS)

    Markich, S.J.

    1995-01-01

    The ability of bivalves to filter large volumes of water on a daily basis, combined with the relatively high permeability of their cell membranes, makes them valuable organisms to use in the contemporary detection of pollution. Bivalves are well known to respond to chemical contaminants by isolating their soft tissues from the aquatic medium by valve closure. The sensory acuity (via specialized sensory regions including the osphradium) and associated repertoire of this behavioral response can be employed to assess subtle effects exerted by chemical contaminants, such as complex effluents, that may ultimately influence the survival of these organisms. As hazard assessment tools, behavioral studies reflect sublethal toxicity and often yield a highly sensitive estimate of the lowest observable effect concentration (LOEC). Moreover, valve movement behavior has been identified as one of the more sensitive biological early warning measures to a variety of aquatic contaminants, in comparison with those used in other aquatic animal phyla. Therefore, the valve movement behavior of both freshwater (Hyridella depressa, Velesunio angasi and V. ambiguus) and marine (Mytilus edulis) bivalves was continuously monitored, using an on-line computer-based data acquisition system, during exposure to either trace metals (e.g. Cu, Cd, Mn and U) or complex effluents (i.e. treated sewage effluent and acid leachate derived from contaminated Sydney Harbour sediments), in the context of using the valve movement behavior of the bivalve species to indicate the biological significance of exposure to the above-mentioned pollutants. The results indicate that several components of the valve movement behavior of each bivalve provide quantifiable and ecologically interpretable sub-lethal endpoints for the rapid and sensitive evaluation of waters containing either complex effluents or elevated levels of trace metals

  1. Rapid and acute effects of estrogen on time perception in male and female rats

    Directory of Open Access Journals (Sweden)

    Kristen E. Pleil

    2011-10-01

    Full Text Available Sex differences in the rapid and acute effects of estradiol on time perception were investigated in adult male and female Sprague-Dawley rats. Because estradiol has been shown to increase striatal dopamine release, it may be able to modify time perception and timed performance by increasing the speed of an internal clock in a manner similar to indirect dopamine agonists such as amphetamine and cocaine. Two groups of females (neonatally estradiol-treated/adult ovariectomized and neonatally oil-treated/adult ovariectomized) and 2 groups of males (neonatally castrated and adult castrated) were trained in a 2 s vs. 8 s duration bisection procedure and tested using intermediate signal durations. After obtaining oil-injected baseline psychometric functions over several days, rats were administered 5 μg of estradiol for 4 days and behaviorally evaluated 30 min following each injection. This oil-estradiol administration cycle was subsequently repeated 3 times following the re-establishment of baseline training. Results revealed significant sex differences in the initial baseline functions that were not modifiable by organizational hormones, with males’ duration bisection functions shifted horizontally to the left of females’. Upon the first administration of estradiol, females, but not males, showed a significant, transient leftward shift in their bisection functions, indicative of an increase in clock speed. After extensive retraining in the duration bisection procedure, rats that were exposed to gonadal hormones during the first week of life showed a significant rightward shift in their bisection functions on the fourth day of estradiol administration during each cycle, suggesting a decrease in clock speed. Taken together, our results support the view that there are multiple mechanisms of estrogens’ action in the striatum that modulate dopaminergic activity and are differentially organized by gonadal steroids during early brain development.

  2. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2012-11-08

    This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a world widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for quality assessment of toxic herbal medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
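
    For readers unfamiliar with the multivariate step, the sketch below shows PCA and hierarchical cluster analysis applied to a matrix of binned spectra, which is the general pattern the abstract describes; the synthetic data and cluster count stand in for real DART MS intensities.

        # Sketch of the multivariate step only: PCA and hierarchical clustering of
        # binned mass spectra (synthetic matrix standing in for DART MS intensities).
        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(1)
        # rows = samples (raw / qualified / over-processed), columns = m/z bins
        spectra = rng.random((18, 300))

        pca = PCA(n_components=2)
        scores = pca.fit_transform(spectra)        # coordinates for a PCA score plot
        # pca.components_ (the loadings) would point to the discriminating m/z bins
        tree = linkage(spectra, method="ward")     # hierarchical cluster analysis
        groups = fcluster(tree, t=3, criterion="maxclust")
        print(scores.shape, groups)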

  3. Accurate Rapid Lifetime Determination on Time-Gated FLIM Microscopy with Optical Sectioning.

    Science.gov (United States)

    Silva, Susana F; Domingues, José Paulo; Morgado, António Miguel

    2018-01-01

    Time-gated fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to assess the biochemistry of cells and tissues. When applied to living thick samples, it is hampered by the lack of optical sectioning and the need of acquiring many images for an accurate measurement of fluorescence lifetimes. Here, we report on the use of processing techniques to overcome these limitations, minimizing the acquisition time, while providing optical sectioning. We evaluated the application of the HiLo and the rapid lifetime determination (RLD) techniques for accurate measurement of fluorescence lifetimes with optical sectioning. HiLo provides optical sectioning by combining the high-frequency content from a standard image, obtained with uniform illumination, with the low-frequency content of a second image, acquired using structured illumination. Our results show that HiLo produces optical sectioning on thick samples without degrading the accuracy of the measured lifetimes. We also show that instrument response function (IRF) deconvolution can be applied with the RLD technique on HiLo images, improving greatly the accuracy of the measured lifetimes. These results open the possibility of using the RLD technique with pulsed diode laser sources to determine accurately fluorescence lifetimes in the subnanosecond range on thick multilayer samples, providing that offline processing is allowed.

  4. Computer aided detection of suspicious regions on digital mammograms : rapid segmentation and feature extraction

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, C; Giacomini, M; Sacile, R [DIST - Department of Communication Computer and System Sciences, University of Genova, Via Opera Pia 13, 16145 Genova (Italy); Rosselli Del Turco, M [Centro per lo studio e la prevenzione oncologica, Firenze (Italy)

    1999-12-31

    A method is presented for rapid detection of suspicious regions which consists of two steps. The first step is segmentation based on texture analysis consisting of: histogram equalization, Laws filtering for texture analysis, Gaussian blur and median filtering to enhance differences between tissues in different respects, histogram thresholding to obtain a binary image, logical masking in order to detect regions to be discarded from the analysis, edge detection. This method has been tested on 60 images, obtaining 93% successful detection of suspicious regions. (authors) 4 refs, 9 figs, 1 tab.
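
    A hedged sketch of a pipeline in the spirit of the steps listed above is given below, using OpenCV; the Laws kernel, blur sizes, and thresholds are illustrative choices, not the authors' parameters.

        # Illustrative texture-based segmentation pipeline (OpenCV); kernel choice
        # and thresholds are assumptions, not the parameters used in the paper.
        import cv2
        import numpy as np

        def suspicious_region_mask(gray):
            """gray: 8-bit single-channel mammogram; returns a binary mask and its edges."""
            eq = cv2.equalizeHist(gray)                          # histogram equalization
            L5 = np.array([1, 4, 6, 4, 1], dtype=np.float32)
            E5 = np.array([-1, -2, 0, 2, 1], dtype=np.float32)
            laws = cv2.filter2D(eq.astype(np.float32), -1, np.outer(L5, E5))  # Laws texture filter
            tex = cv2.medianBlur(cv2.GaussianBlur(np.abs(laws), (5, 5), 0), 5)
            tex8 = cv2.normalize(tex, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
            _, binary = cv2.threshold(tex8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            edges = cv2.Canny(binary, 50, 150)                   # edge detection on the mask
            return binary, edges

        # usage: mask, edges = suspicious_region_mask(cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE))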

  5. A computational tool for the rapid design and prototyping of propellers for underwater vehicles

    OpenAIRE

    D'Epagnier, Kathryn Port.

    2007-01-01

    An open source, MATLAB®-based propeller design code MPVL was improved to include rapid prototyping capabilities as well as other upgrades as part of this effort. The resulting code, OpenPVL, is described in this thesis. In addition, results from the development code BasicPVL are presented. An intermediate code, BasicPVL, was created by the author while OpenPVL was under development, and it provides guidance for initial propeller designs and propeller efficiency analysis. OpenPVL i...

  6. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

    In this paper, a numerical method, called complex-plane strategy, is implemented in the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model, in the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are slightly distorted. For an accurate simulation of differential rotation, a versatile method, called multiple partition technique is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  7. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    International Nuclear Information System (INIS)

    Clark, Aurora Sue; Wall, Nathalie; Benny, Paul

    2015-01-01

    design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  8. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    Energy Technology Data Exchange (ETDEWEB)

    Clark, Aurora Sue [Washington State Univ., Pullman, WA (United States); Wall, Nathalie [Washington State Univ., Pullman, WA (United States); Benny, Paul [Washington State Univ., Pullman, WA (United States)

    2015-11-16

    through the design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  9. Real-time control of focused ultrasound heating based on rapid MR thermometry.

    Science.gov (United States)

    Vimeux, F C; De Zwart, J A; Palussiére, J; Fawaz, R; Delalande, C; Canioni, P; Grenier, N; Moonen, C T

    1999-03-01

    Real-time control of the heating procedure is essential for hyperthermia applications of focused ultrasound (FUS). The objective of this study is to demonstrate the feasibility of MRI-controlled FUS. An automatic control system was developed using a dedicated interface between the MR system control computer and the FUS wave generator. Two algorithms were used to regulate FUS power to maintain the focal point temperature at a desired level. Automatic control of FUS power level was demonstrated ex vivo at three target temperature levels (increase of 5 degrees C, 10 degrees C, and 30 degrees C above room temperature) during 30-minute hyperthermic periods. Preliminary in vivo results on rat leg muscle confirm that the necrosis estimate, calculated on-line during FUS sonication, allows prediction of tissue damage. Conclusions: The feasibility of fully automatic FUS control based on MRI thermometry has been demonstrated.
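
    The abstract does not specify the two regulation algorithms, so the following is only a generic proportional-integral controller sketch of the kind of feedback loop involved: each MR thermometry update yields a temperature error that is converted into a new acoustic power, clamped to the generator range.

        # Generic proportional-integral temperature controller sketch (not the paper's
        # specific algorithm): adjust FUS power from MR thermometry feedback.
        class FUSPowerController:
            def __init__(self, target_temp, kp=0.8, ki=0.05, max_power=100.0):
                self.target = target_temp
                self.kp, self.ki = kp, ki
                self.max_power = max_power
                self._integral = 0.0

            def update(self, measured_temp, dt):
                """Return the new acoustic power for one MR thermometry update."""
                error = self.target - measured_temp
                self._integral += error * dt
                power = self.kp * error + self.ki * self._integral
                return min(max(power, 0.0), self.max_power)   # clamp to the generator range

        # e.g. one update per 2 s MR temperature map:
        ctrl = FUSPowerController(target_temp=43.0)
        print(ctrl.update(measured_temp=39.5, dt=2.0))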

  10. Time-of-Flight Sensors in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2009-01-01

    , including Computer Graphics, Computer Vision and Man Machine Interaction (MMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real...

  11. Phenomapping of rangelands in South Africa using time series of RapidEye data

    Science.gov (United States)

    Parplies, André; Dubovyk, Olena; Tewes, Andreas; Mund, Jan-Peter; Schellberg, Jürgen

    2016-12-01

    Phenomapping is an approach which allows the derivation of spatial patterns of vegetation phenology and rangeland productivity based on time series of vegetation indices. In our study, we propose a new spatial mapping approach which combines phenometrics derived from high resolution (HR) satellite time series with spatial logistic regression modeling to discriminate land management systems in rangelands. From the RapidEye time series for selected rangelands in South Africa, we calculated bi-weekly noise-reduced Normalized Difference Vegetation Index (NDVI) images. For the growing season of 2011–2012, we further derived principal phenology metrics such as start, end and length of growing season and related phenological variables such as amplitude, left derivative and small integral of the NDVI curve. We then mapped these phenometrics across two different tenure systems, communal and commercial, at the very detailed spatial resolution of 5 m. The result of a binary logistic regression (BLR) has shown that the amplitude and the left derivative of the NDVI curve were statistically significant. These indicators are useful to discriminate commercial from communal rangeland systems. We conclude that phenomapping combined with spatial modeling is a powerful tool that allows efficient aggregation of phenology and productivity metrics for spatially explicit analysis of the relationships of crop phenology with site conditions and management. This approach has particular potential for disaggregated and patchy environments such as in farming systems in semi-arid South Africa, where phenology varies considerably among and within years. Further, we see a strong perspective for phenomapping to support spatially explicit modelling of vegetation.
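
    The phenometrics named above (amplitude, left derivative, small integral, start/end of season) can be illustrated on a single NDVI time series, followed by a binary logistic regression over pixels. The sketch below uses synthetic arrays and an assumed 20%-of-amplitude threshold for the season limits; it is not the authors' processing chain.

        # Sketch: phenometrics from one pixel's NDVI series, then a binary logistic
        # regression over many pixels (synthetic data; threshold rule is assumed).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        t = np.arange(0, 365, 14.0)                               # bi-weekly time axis (days)
        ndvi = 0.25 + 0.45 * np.exp(-((t - 180.0) / 60.0) ** 2)   # synthetic seasonal curve

        def phenometrics(ndvi, t):
            base, peak = ndvi.min(), ndvi.max()
            amplitude = peak - base
            above = np.where(ndvi >= base + 0.2 * amplitude)[0]   # assumed 20% threshold
            sos, eos = t[above[0]], t[above[-1]]                  # start / end of season
            left_derivative = np.gradient(ndvi, t)[above[0]]      # green-up rate at SOS
            vals = ndvi[above] - base                             # "small integral" above baseline
            small_integral = float(np.sum(np.diff(t[above]) * (vals[:-1] + vals[1:]) / 2.0))
            return amplitude, sos, eos, eos - sos, left_derivative, small_integral

        print(phenometrics(ndvi, t))

        # binary logistic regression: communal (1) vs commercial (0) tenure from phenometrics
        rng = np.random.default_rng(2)
        X = rng.random((200, 6))                                  # stand-in phenometric matrix (per pixel)
        y = rng.integers(0, 2, 200)                               # stand-in tenure labels
        print(LogisticRegression().fit(X, y).coef_)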

  12. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long period events which are critical for getting information on the evolution of volcanic systems. For this reason, arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple Signal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data over the Internet and graphical applications for continuous display of the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of this volcano. A real time continuous acquisition system has been simulated by
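
    The analysis package described above is built around the MUSIC algorithm. The following numpy sketch shows a narrowband MUSIC estimator of the slowness vector from array data; the geometry, frequency, and slowness grid are placeholders rather than the INGV implementation.

        # Narrowband MUSIC sketch for array slowness estimation (numpy only; geometry,
        # frequency and grid are illustrative, not the implementation described above).
        import numpy as np

        def music_slowness(X, coords, freq, s_grid, n_sources=1):
            """X: (n_stations, n_snapshots) complex narrowband data; coords: (n_stations, 2) in km;
            freq in Hz; s_grid: (n_grid, 2) trial slowness vectors in s/km."""
            R = X @ X.conj().T / X.shape[1]                       # spatial covariance matrix
            w, V = np.linalg.eigh(R)                              # eigenvalues in ascending order
            En = V[:, : X.shape[0] - n_sources]                   # noise subspace
            A = np.exp(-2j * np.pi * freq * (coords @ s_grid.T))  # steering vectors (n_stations, n_grid)
            pseudo = 1.0 / (np.linalg.norm(En.conj().T @ A, axis=0) ** 2)   # MUSIC pseudo-spectrum
            return s_grid[np.argmax(pseudo)], pseudo

        # toy usage: 6 stations, one plane wave with slowness (0.3, -0.1) s/km at 2 Hz
        rng = np.random.default_rng(3)
        coords = rng.uniform(-1.0, 1.0, (6, 2))
        s_true = np.array([0.3, -0.1])
        X = np.exp(-2j * np.pi * 2.0 * (coords @ s_true))[:, None] * np.exp(
            2j * np.pi * rng.random((1, 64))) + 0.05 * rng.normal(size=(6, 64))
        sx, sy = np.meshgrid(np.linspace(-0.5, 0.5, 41), np.linspace(-0.5, 0.5, 41))
        best, _ = music_slowness(X, coords, 2.0, np.column_stack([sx.ravel(), sy.ravel()]))
        print(best)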

  13. [COMPUTER ASSISTED DESIGN AND ELECTRON BEAM MELTING RAPID PROTOTYPING METAL THREE-DIMENSIONAL PRINTING TECHNOLOGY FOR PREPARATION OF INDIVIDUALIZED FEMORAL PROSTHESIS].

    Science.gov (United States)

    Liu, Hongwei; Weng, Yiping; Zhang, Yunkun; Xu, Nanwei; Tong, Jing; Wang, Caimei

    2015-09-01

    To study the feasibility of preparation of the individualized femoral prosthesis through computer assisted design and electron beam melting rapid prototyping (EBM-RP) metal three-dimensional (3D) printing technology. One adult male left femur specimen was used for scanning with 64-slice spiral CT; tomographic image data were imported into Mimics 15.0 software to reconstruct a femoral 3D model, then the 3D model of the individualized femoral prosthesis was designed through UG 8.0 software. Finally, the 3D model data were imported into the EBM-RP metal 3D printer to print the individualized sleeve. According to the 3D model of the individualized prosthesis, a customized sleeve was successfully prepared through the EBM-RP metal 3D printing technology, assembled with the standard handle component of the SR modular femoral prosthesis to make the individualized femoral prosthesis. Customized femoral prosthesis accurately matching with metaphyseal cavity can be designed through the thin slice CT scanning and computer assisted design technology. Titanium alloy personalized prosthesis with complex 3D shape, pore surface, and good matching with metaphyseal cavity can be manufactured by the technology of EBM-RP metal 3D printing, and the technology has convenient, rapid, and accurate advantages.

  14. Accuracy of using computer-aided rapid prototyping templates for mandible reconstruction with an iliac crest graft

    Science.gov (United States)

    2014-01-01

    Background This study aimed to evaluate the accuracy of surgical outcomes in free iliac crest mandibular reconstructions that were carried out with virtual surgical plans and rapid prototyping templates. Methods This study evaluated eight patients who underwent mandibular osteotomy and reconstruction with free iliac crest grafts using virtual surgical planning and designed guiding templates. Operations were performed using the prefabricated guiding templates. Postoperative three-dimensional computer models were overlaid and compared with the preoperatively designed models in the same coordinate system. Results Compared to the virtual osteotomy, the mean error of distance of the actual mandibular osteotomy was 2.06 ± 0.86 mm. When compared to the virtual harvested grafts, the mean error volume of the actual harvested grafts was 1412.22 ± 439.24 mm3 (9.12% ± 2.84%). The mean error between the volume of the actual harvested grafts and the shaped grafts was 2094.35 ± 929.12 mm3 (12.40% ± 5.50%). Conclusions The use of computer-aided rapid prototyping templates for virtual surgical planning appears to positively influence the accuracy of mandibular reconstruction. PMID:24957053

  15. Spectroscopic sensitivity of real-time, rapidly induced phytochemical change in response to damage.

    Science.gov (United States)

    Couture, John J; Serbin, Shawn P; Townsend, Philip A

    2013-04-01

    An ecological consequence of plant-herbivore interactions is the phytochemical induction of defenses in response to insect damage. Here, we used reflectance spectroscopy to characterize the foliar induction profile of cardenolides in Asclepias syriaca in response to damage, tracked in vivo changes and examined the influence of multiple plant traits on cardenolide concentrations. Foliar cardenolide concentrations were measured at specific time points following damage to capture their induction profile. Partial least-squares regression (PLSR) modeling was employed to calibrate cardenolide concentrations to reflectance spectroscopy. In addition, subsets of plants were either repeatedly sampled to track in vivo changes or modified to reduce latex flow to damaged areas. Cardenolide concentrations and the induction profile of A. syriaca were well predicted using models derived from reflectance spectroscopy, and this held true for repeatedly sampled plants. Correlations between cardenolides and other foliar-related variables were weak or not significant. Plant modification for latex reduction inhibited an induced cardenolide response. Our findings show that reflectance spectroscopy can characterize rapid phytochemical changes in vivo. We used reflectance spectroscopy to identify the mechanisms behind the production of plant secondary metabolites, simultaneously characterizing multiple foliar constituents. In this case, cardenolide induction appears to be largely driven by enhanced latex delivery to leaves following damage. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.
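
    The calibration step is a PLSR model linking reflectance spectra to measured cardenolide concentrations. A minimal sketch of that step with scikit-learn is shown below; the arrays are synthetic placeholders and the number of components is an assumption.

        # Sketch of the calibration step only: PLSR relating reflectance spectra to
        # measured cardenolide concentrations (synthetic placeholder arrays).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(4)
        reflectance = rng.random((60, 500))            # 60 leaf spectra x 500 bands (placeholder)
        cardenolides = rng.random(60)                  # measured concentrations (placeholder)

        pls = PLSRegression(n_components=8)
        predicted = cross_val_predict(pls, reflectance, cardenolides, cv=10).ravel()
        ss_res = np.sum((cardenolides - predicted) ** 2)
        ss_tot = np.sum((cardenolides - cardenolides.mean()) ** 2)
        print(f"cross-validated R^2 = {1 - ss_res / ss_tot:.2f}")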

  16. Rapid testing and identification of actuator using dSPACE real-time emulator

    Science.gov (United States)

    Xie, Daocheng; Wang, Zhongwei; Zeng, Qinghua

    2011-10-01

    To solve the problem of actuator model identification in aerocraft control system design, a testing system based on a dSPACE emulator is established: sending the test signal and receiving the feedback voltage are realized using dSPACE interactive cards, and communication between the signal-generating equipment and the feedback-voltage acquisition equipment is synchronized. This paper introduces the hardware architecture and key technologies of the simulation system. Construction, downloading, and execution of the test model are performed on the dSPACE emulator; D/A conversion of the test signal is realized using the DS2103 card, and the DS2002 card converts the feedback voltage to digital values. A filtering module is added to the signal acquisition chain to reduce noise interference in the A/D channel. Precision of time and voltage is improved by setting the acquisition period to 1 ms. The gathered data are recorded and displayed with Controldesk tools. The responses of four actuators at different frequencies are tested, frequency-domain analysis is performed using the least-squares method, the actuator model is identified, and the simulation data fit well with the real response of the actuator. The testing system created with the dSPACE emulator satisfies the requirements for rapid testing and identification of the actuator.
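
    As an illustration of the frequency-domain least-squares identification mentioned above (not the authors' model or data), the sketch below fits a second-order actuator gain curve to magnitudes measured at four test frequencies.

        # Sketch: least-squares fit of a second-order actuator gain model to measured
        # frequency-response magnitudes (four test frequencies, hypothetical data).
        import numpy as np
        from scipy.optimize import least_squares

        freqs_hz = np.array([1.0, 3.0, 6.0, 10.0])
        gain_meas = np.array([0.99, 0.95, 0.80, 0.55])        # |output/input|, hypothetical

        def model_gain(params, f):
            wn, zeta = params
            w = 2 * np.pi * f
            return wn**2 / np.sqrt((wn**2 - w**2) ** 2 + (2 * zeta * wn * w) ** 2)

        fit = least_squares(lambda p: model_gain(p, freqs_hz) - gain_meas,
                            x0=[2 * np.pi * 8.0, 0.7], bounds=([0, 0], [np.inf, 5]))
        wn, zeta = fit.x
        print(f"natural frequency ~ {wn / (2 * np.pi):.1f} Hz, damping ratio ~ {zeta:.2f}")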

  17. Renal parenchyma thickness: a rapid estimation of renal function on computed tomography

    International Nuclear Information System (INIS)

    Kaplon, Daniel M.; Lasser, Michael S.; Sigman, Mark; Haleblian, George E.; Pareek, Gyan

    2009-01-01

    Purpose: To define the relationship between renal parenchyma thickness (RPT) on computed tomography and renal function on nuclear renography in chronically obstructed renal units (ORUs) and to define a minimal thickness ratio associated with adequate function. Materials and Methods: Twenty-eight consecutive patients undergoing both nuclear renography and CT during a six-month period between 2004 and 2006 were included. All patients that had a diagnosis of unilateral obstruction were included for analysis. RPT was measured in the following manner: The parenchyma thickness at three discrete levels of each kidney was measured using calipers on a CT workstation. The mean of these three measurements was defined as RPT. The renal parenchyma thickness ratio of the ORUs and non-obstructed renal units (NORUs) was calculated and this was compared to the observed function on Mag-3 lasix Renogram. Results: A total of 28 patients were evaluated. Mean parenchyma thickness was 1.82 cm and 2.25 cm in the ORUs and NORUs, respectively. The mean relative renal function of ORUs was 39%. Linear regression analysis comparing renogram function to RPT ratio revealed a correlation coefficient of 0.48. A thickness ratio of 0.68 correlated with 20% renal function. Conclusion: RPT on computed tomography appears to be a powerful predictor of relative renal function in ORUs. Assessment of RPT is a useful and readily available clinical tool for surgical decision making (renal salvage therapy versus nephrectomy) in patients with ORUs. (author)
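
    The RPT and RPT-ratio computations are straightforward; the sketch below illustrates them together with a simple linear fit against relative renal function, using hypothetical numbers rather than the study's data.

        # Sketch: mean RPT from three caliper measurements per kidney, the RPT ratio,
        # and a simple linear fit against renogram split function (illustrative numbers).
        import numpy as np
        from scipy import stats

        def mean_rpt(measurements_cm):
            """Mean renal parenchyma thickness from three CT levels."""
            return float(np.mean(measurements_cm))

        # hypothetical cohort: RPT ratio (obstructed / non-obstructed) vs relative function (%)
        rpt_ratio = np.array([0.55, 0.68, 0.75, 0.82, 0.90, 0.97, 1.02])
        rel_function = np.array([15.0, 21.0, 30.0, 35.0, 41.0, 45.0, 48.0])

        fit = stats.linregress(rpt_ratio, rel_function)
        print(f"function ~ {fit.slope:.1f} * ratio + {fit.intercept:.1f}, r = {fit.rvalue:.2f}")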

  18. Technical Note: Deep learning based MRAC using rapid ultra-short echo time imaging.

    Science.gov (United States)

    Jang, Hyungseok; Liu, Fang; Zhao, Gengyan; Bradshaw, Tyler; McMillan, Alan B

    2018-05-15

    In this study, we explore the feasibility of a novel framework for MR-based attenuation correction for PET/MR imaging based on deep learning via convolutional neural networks, which enables fully automated and robust estimation of a pseudo CT image based on ultrashort echo time (UTE), fat, and water images obtained by a rapid MR acquisition. MR images for MRAC are acquired using dual echo ramped hybrid encoding (dRHE), where both UTE and out-of-phase echo images are obtained within a short single acquisition (35 sec). Tissue labeling of air, soft tissue, and bone in the UTE image is accomplished via a deep learning network that was pre-trained with T1-weighted MR images. UTE images are used as input to the network, which was trained using labels derived from co-registered CT images. The tissue labels estimated by deep learning are refined by a conditional random field based correction. The soft tissue labels are further separated into fat and water components using the two-point Dixon method. The estimated bone, air, fat, and water images are then assigned appropriate Hounsfield units, resulting in a pseudo CT image for PET attenuation correction. To evaluate the proposed MRAC method, PET/MR imaging of the head was performed on 8 human subjects, where Dice similarity coefficients of the estimated tissue labels and relative PET errors were evaluated through comparison to a registered CT image. Dice coefficients for air (within the head), soft tissue, and bone labels were 0.76±0.03, 0.96±0.006, and 0.88±0.01. In PET quantification, the proposed MRAC method produced relative PET errors less than 1% within most brain regions. The proposed MRAC method utilizing deep learning with transfer learning and an efficient dRHE acquisition enables reliable PET quantification with accurate and rapid pseudo CT generation. This article is protected by copyright. All rights reserved.
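
    Only the final step of the pipeline, assigning Hounsfield units to the estimated tissue labels, is simple enough to sketch here; the HU values and label coding below are typical assumptions, not the values used in the paper.

        # Sketch of the final pseudo-CT step only: assign Hounsfield units to the
        # air / bone / fat / water label maps (HU values are typical, not the paper's).
        import numpy as np

        HU = {"air": -1000, "bone": 800, "fat": -90, "water": 30}    # assumed typical values

        def pseudo_ct(labels):
            """labels: integer volume with 0=air, 1=bone, 2=fat, 3=water (assumed coding)."""
            lut = np.array([HU["air"], HU["bone"], HU["fat"], HU["water"]], dtype=np.int16)
            return lut[labels]

        labels = np.random.default_rng(5).integers(0, 4, size=(4, 4, 4))
        print(pseudo_ct(labels))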

  19. UMTS rapid response real-time seismic networks: implementation and strategies at INGV

    Science.gov (United States)

    Govoni, A.; Margheriti, L.; Moretti, M.; Lauciani, V.; Sensale, G.; Bucci, A.; Criscuoli, F.

    2015-12-01

    Universal Mobile Telecommunications System (UMTS) and its evolutions are nowadays the most affordable and widespread data communication infrastructure available almost worldwide. Moreover, the ever-growing cellular phone market is pushing the development of new devices with higher performance and lower power consumption. All these characteristics make UMTS really useful for the implementation of an "easy to deploy" temporary real-time seismic station. Despite these remarkable features, there are many drawbacks that must be properly taken into account to effectively transmit the seismic data: Internet security, signal and service availability, power consumption. - Internet security: exposing seismological data services and seismic stations to the Internet is dangerous, attack prone and can lead to downtimes in the services, so we set up a dedicated Virtual Private Network (VPN) service to protect all the connected devices. - Signal and service availability: while for temporary experiments careful planning and accurate site selection can minimize the problem, this is not always the case with rapid response networks. Moreover, as with any other leased line, the availability of the UMTS service during a seismic crisis is basically unpredictable. Nowadays in Italy during a major national emergency a Committee of the Italian Civil Defense ensures unified management and coordination of emergency activities. Inside it the telecom companies are committed to giving support to the crisis management by improving the standards in their communication networks. - Power consumption: it is at least of the order of that of the seismic station and, being related to data flow and signal quality, is largely unpredictable. While the most secure option consists in adding a second independent solar power supply to the seismic station, this is not always a very convenient solution since it doubles the cost and doubles the equipment on site. We found that an acceptable trade-off is to add an

  20. Rapidly-steered single-element ultrasound for real-time volumetric imaging and guidance

    Science.gov (United States)

    Stauber, Mark; Western, Craig; Solek, Roman; Salisbury, Kenneth; Hristov, Dmitre; Schlosser, Jeffrey

    2016-03-01

    Volumetric ultrasound (US) imaging has the potential to provide real-time anatomical imaging with high soft-tissue contrast in a variety of diagnostic and therapeutic guidance applications. However, existing volumetric US machines utilize "wobbling" linear phased array or matrix phased array transducers which are costly to manufacture and necessitate bulky external processing units. To drastically reduce cost, improve portability, and reduce footprint, we propose a rapidly-steered single-element volumetric US imaging system. In this paper we explore the feasibility of this system with a proof-of-concept single-element volumetric US imaging device. The device uses a multi-directional raster-scan technique to generate a series of two-dimensional (2D) slices that were reconstructed into three-dimensional (3D) volumes. At 15 cm depth, 90° lateral field of view (FOV), and 20° elevation FOV, the device produced 20-slice volumes at a rate of 0.8 Hz. Imaging performance was evaluated using an US phantom. Spatial resolution was 2.0 mm, 4.7 mm, and 5.0 mm in the axial, lateral, and elevational directions at 7.5 cm. Relative motion of phantom targets was automatically tracked within US volumes with a mean error of -0.3 ± 0.3 mm, -0.3 ± 0.3 mm, and -0.1 ± 0.5 mm in the axial, lateral, and elevational directions, respectively. The device exhibited a mean spatial distortion error of 0.3 ± 0.9 mm, 0.4 ± 0.7 mm, and -0.3 ± 1.9 mm in the axial, lateral, and elevational directions. With a production cost near $1000, the performance characteristics of the proposed system make it an ideal candidate for diagnostic and image-guided therapy applications where form factor and low cost are paramount.

  1. 12 CFR 516.10 - How does OTS compute time periods under this part?

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false How does OTS compute time periods under this part? 516.10 Section 516.10 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY APPLICATION PROCESSING PROCEDURES § 516.10 How does OTS compute time periods under this part? In computing...
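
    As an illustration of the generic counting convention such rules describe (exclude the day of the triggering act, include the last day unless it falls on a weekend), a small sketch is given below; the actual regulation also covers holidays and other details omitted here.

        # Illustration of the generic counting convention only (holidays omitted):
        # exclude the day the period starts, include the last day, and roll forward
        # if the last day falls on a weekend.
        from datetime import date, timedelta

        def period_end(trigger: date, days: int) -> date:
            end = trigger + timedelta(days=days)     # day 1 is the day after the trigger
            while end.weekday() >= 5:                # 5 = Saturday, 6 = Sunday
                end += timedelta(days=1)
            return end

        print(period_end(date(2010, 1, 1), 30))      # e.g. a 30-day filing period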

  2. Rapid and collaborative development of socially relevant computing solutions for developing communities

    CSIR Research Space (South Africa)

    Mtsweni, J

    2014-11-01

    Full Text Available stakeholders, including communities to work on common social challenges over a short period of time are emerging to address the technological gaps in ICT4D projects. This research expands on the extensive work that has been done over the years in ICT4D projects...

  3. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical analysis...
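
    One common statistical answer to this question, not necessarily the chapter's exact procedure, is to choose the number of runs so that the mean output is estimated within a target margin of error; the sketch below implements that rule from a pilot estimate of the standard deviation.

        # One common way to answer the question statistically (not necessarily the
        # chapter's procedure): runs needed to estimate a mean output within a margin
        # of error E at confidence (1 - alpha), given a pilot estimate of the SD.
        from math import ceil
        from scipy import stats

        def runs_needed(pilot_sd: float, margin: float, alpha: float = 0.05) -> int:
            z = stats.norm.ppf(1 - alpha / 2)
            return ceil((z * pilot_sd / margin) ** 2)

        # e.g. pilot SD of 12.0, want the mean within +/-2.0 at 95% confidence
        print(runs_needed(12.0, 2.0))    # -> 139 runs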

  4. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    Science.gov (United States)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in the development of computer architecture now provide innovative technology for the recasting of traditional sequential solutions into high-performance, low-cost parallel systems to increase system performance. Research conducted in the development of specialized computer architecture for the algorithmic execution of an avionics system, guidance and control problem in real time is described. A comprehensive treatment of both the hardware and software structures of a customized computer which performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal, real-time allocation algorithm was developed which maps the algorithmic tasks onto the processing elements. This allocation is based on critical path analysis. The final stage is the design and development of the hardware structures suitable for the efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified and the experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
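
    The allocation described above is driven by critical path analysis of the task graph. The toy sketch below computes the critical path length of a small task DAG with per-task execution times; the graph and times are illustrative only, not the avionics task set.

        # Toy sketch of critical path analysis on a task DAG (illustrative graph).
        from functools import lru_cache

        tasks = {"read": 2, "predict": 5, "update": 4, "command": 3}     # execution times
        deps = {"read": [], "predict": ["read"], "update": ["read"],
                "command": ["predict", "update"]}

        @lru_cache(maxsize=None)
        def earliest_finish(task: str) -> int:
            start = max((earliest_finish(d) for d in deps[task]), default=0)
            return start + tasks[task]

        critical_length = max(earliest_finish(t) for t in tasks)
        print(critical_length)     # 2 + 5 + 3 = 10 time units on the critical path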

  5. Development of a high-speed real-time PCR system for rapid and precise nucleotide recognition

    Science.gov (United States)

    Terazono, Hideyuki; Takei, Hiroyuki; Hattori, Akihiro; Yasuda, Kenji

    2010-04-01

    Polymerase chain reaction (PCR) is a common method used to create copies of a specific target region of a DNA sequence and to produce large quantities of DNA. A few DNA molecules, which act as templates, are rapidly amplified by PCR into many billions of copies. PCR is a key technology in genome-based biological analysis, revolutionizing many life science fields such as medical diagnostics, food safety monitoring, and countermeasures against bioterrorism. Thus, many applications have been developed around thermal cycling. For these PCR applications, one of the most important factors is reducing the data acquisition time. To reduce the acquisition time, it is necessary to decrease the temperature transition time between the high and low ends as much as possible. We have developed a novel rapid real-time PCR system based on rapid exchange of media maintained at different temperatures. This system consists of two thermal reservoirs and a reaction chamber for PCR observation. The temperature transition was achieved within 0.3 sec, and good thermal stability was achieved during thermal cycling with rapid exchange of circulating media. This system allows rigorous optimization of the temperatures required for each stage of the PCR processes. Resulting amplicons were confirmed by electrophoresis. Using the system, rapid DNA amplification was accomplished within 3.5 min, including initial heating and complete 50 PCR cycles. This clearly shows that the device allows faster temperature switching than conventional conduction-based heating systems that rely on Peltier heating/cooling.

  6. A general algorithm for computing distance transforms in linear time

    NARCIS (Netherlands)

    Meijster, A.; Roerdink, J.B.T.M.; Hesselink, W.H.; Goutsias, J; Vincent, L; Bloomberg, DS

    2000-01-01

    A new general algorithm for computing distance transforms of digital images is presented. The algorithm consists of two phases. Both phases consist of two scans, a forward and a backward scan. The first phase scans the image column-wise, while the second phase scans the image row-wise. Since the
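
    The two-phase scan structure described here can be illustrated with a simplified variant: the sketch below computes a city-block (L1) distance transform with a column-wise pass followed by a row-wise pass (two scans each). It is only a rough analogue of the cited algorithm, which handles more general distance metrics, and all names are illustrative.

        import numpy as np

        def l1_distance_transform(feature):
            """City-block (L1) distance of every pixel to the nearest True pixel in 'feature'."""
            inf = sum(feature.shape)  # larger than any possible L1 distance in the image
            g = np.where(feature, 0, inf).astype(np.int64)
            rows, cols = g.shape
            # Phase 1: column-wise forward and backward scans (vertical distances).
            for y in range(1, rows):
                g[y] = np.minimum(g[y], g[y - 1] + 1)
            for y in range(rows - 2, -1, -1):
                g[y] = np.minimum(g[y], g[y + 1] + 1)
            # Phase 2: row-wise forward and backward scans (add horizontal offsets).
            for x in range(1, cols):
                g[:, x] = np.minimum(g[:, x], g[:, x - 1] + 1)
            for x in range(cols - 2, -1, -1):
                g[:, x] = np.minimum(g[:, x], g[:, x + 1] + 1)
            return g

        img = np.zeros((5, 7), dtype=bool)
        img[2, 3] = True
        print(l1_distance_transform(img))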

  7. SENSITIVITY OF HELIOSEISMIC TRAVEL TIMES TO THE IMPOSITION OF A LORENTZ FORCE LIMITER IN COMPUTATIONAL HELIOSEISMOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Moradi, Hamed; Cally, Paul S., E-mail: hamed.moradi@monash.edu [Monash Centre for Astrophysics, School of Mathematical Sciences, Monash University, Clayton, Victoria 3800 (Australia)

    2014-02-20

    The rapid exponential increase in the Alfvén wave speed with height above the solar surface presents a serious challenge to physical modeling of the effects of magnetic fields on solar oscillations, as it introduces a significant Courant-Friedrichs-Lewy time-step constraint for explicit numerical codes. A common approach adopted in computational helioseismology, where long simulations in excess of 10 hr (hundreds of wave periods) are often required, is to cap the Alfvén wave speed by artificially modifying the momentum equation when the ratio between the Lorentz and hydrodynamic forces becomes too large. However, recent studies have demonstrated that the Alfvén wave speed plays a critical role in the MHD mode conversion process, particularly in determining the reflection height of the upwardly propagating helioseismic fast wave. Using numerical simulations of helioseismic wave propagation in constant inclined (relative to the vertical) magnetic fields we demonstrate that the imposition of such artificial limiters significantly affects time-distance travel times unless the Alfvén wave-speed cap is chosen comfortably in excess of the horizontal phase speeds under investigation.

  8. Comparative use of the computer-aided angiography and rapid prototyping technology versus conventional imaging in the management of the Tile C pelvic fractures.

    Science.gov (United States)

    Li, Baofeng; Chen, Bei; Zhang, Ying; Wang, Xinyu; Wang, Fei; Xia, Hong; Yin, Qingshui

    2016-01-01

    Computed tomography (CT) scan with three-dimensional (3D) reconstruction has been used to evaluate complex fractures in pre-operative planning. In this study, rapid prototyping of a life-size model based on 3D reconstructions including bone and vessel was applied to evaluate the feasibility and prospect of these new technologies in surgical therapy of Tile C pelvic fractures by observing intra- and perioperative outcomes. The authors conducted a retrospective study on a group of 157 consecutive patients with Tile C pelvic fractures. Seventy-six patients were treated with conventional pre-operative preparation (A group) and 81 patients were treated with the help of computer-aided angiography and rapid prototyping technology (B group). Assessment of the two groups considered the following perioperative parameters: length of surgical procedure, intra-operative complications, intra- and postoperative blood loss, postoperative pain, postoperative nausea and vomiting (PONV), length of stay, and type of discharge. The two groups were homogeneous when compared in relation to mean age, sex, body weight, injury severity score, associated injuries and pelvic fracture severity score. Surgery in group B was performed in less time (105 ± 19 minutes vs. 122 ± 23 minutes) and with less blood loss (31.0 ± 8.2 g/L vs. 36.2 ± 7.4 g/L) compared with group A. Patients in group B experienced less pain (2.5 ± 2.3 NRS score vs. 2.8 ± 2.0 NRS score), and PONV affected only 8 % versus 10 % of cases. Times to discharge were shorter (7.8 ± 2.0 days vs. 10.2 ± 3.1 days) in group B, and most patients were discharged home. In our study, patients with Tile C pelvic fractures treated with computer-aided angiography and rapid prototyping technology had a better perioperative outcome than patients treated with conventional pre-operative preparation. Further studies are necessary to investigate the advantages in terms of clinical results in the short and long run.

  9. Quo vadis? : persuasive computing using real time queue information

    NARCIS (Netherlands)

    Meys, Wouter; Groen, Maarten

    2014-01-01

    By presenting tourists with real-time information an increase in efficiency and satisfaction of their day planning can be achieved. At the same time, real-time information services can offer the municipality the opportunity to spread the tourists throughout the city centre. An important factor for

  10. Computational modeling and experimental characterization of bacterial microcolonies for rapid detection using light scattering

    Science.gov (United States)

    Bai, Nan

    A label-free and nondestructive optical elastic forward light scattering method has been extended for the analysis of microcolonies for food-borne bacteria detection and identification. To understand the forward light scattering phenomenon, a model based on the scalar diffraction theory has been employed: a bacterial colony is considered as a biological spatial light modulator with amplitude and phase modulation to the incoming light, which continues to propagate to the far-field to form a distinct scattering 'fingerprint'. Numerical implementation via angular spectrum method (ASM) and Fresnel approximation have been carried out through Fast Fourier Transform (FFT) to simulate this optical model. Sampling criteria to achieve unbiased and un-aliased simulation results have been derived and the effects of violating these conditions have been studied. Diffraction patterns predicted by these two methods (ASM and Fresnel) have been compared to show their applicability to different simulation settings. Through the simulation work, the correlation between the colony morphology and its forward scattering pattern has been established to link the number of diffraction rings and the half cone angle with the diameter and the central height of the Gaussian-shaped colonies. In order to experimentally prove the correlation, a colony morphology analyzer has been built and used to characterize the morphology of different bacteria genera and investigate their growth dynamics. The experimental measurements have demonstrated the possibility of differentiating bacteria Salmonella, Listeria, Escherichia in their early growth stage (100˜500 µm) based on their phenotypic characteristics. This conclusion has important implications in microcolony detection, as most bacteria of our interest need much less incubation time (8˜12 hours) to grow into this size range. The original forward light scatterometer has been updated to capture scattering patterns from microcolonies. Experiments have
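
    For readers unfamiliar with the angular spectrum method (ASM) mentioned above, the following is a minimal FFT-based sketch of free-space propagation of a sampled complex field; the wavelength, grid spacing, aperture size, and propagation distance are arbitrary illustrative values and the code is not taken from the cited work.

        import numpy as np

        def angular_spectrum_propagate(field, wavelength, dx, z):
            """Propagate a 2D complex field a distance z via the angular spectrum method.
            field is sampled on a square grid with spacing dx (all lengths in metres)."""
            ny, nx = field.shape
            k = 2.0 * np.pi / wavelength
            fx = np.fft.fftfreq(nx, d=dx)                 # spatial frequencies [1/m]
            fy = np.fft.fftfreq(ny, d=dx)
            FX, FY = np.meshgrid(fx, fy)
            kz_sq = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
            kz = np.sqrt(np.maximum(kz_sq, 0.0))
            transfer = np.exp(1j * kz * z) * (kz_sq > 0)  # drop evanescent components
            return np.fft.ifft2(np.fft.fft2(field) * transfer)

        # Toy example: a 200 micron circular "colony" aperture propagated 10 mm.
        n, dx = 512, 2e-6
        y, x = np.indices((n, n)) * dx
        r = np.hypot(x - x.mean(), y - y.mean())
        aperture = (r < 100e-6).astype(complex)
        pattern = angular_spectrum_propagate(aperture, wavelength=635e-9, dx=dx, z=10e-3)
        print(np.abs(pattern).max())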

  11. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and computing resource requirements needed to provide an uninterrupted flow of processed data for real-time visualization purposes. The results obtained from the simulation show concurrence with the expected performance under the L-BSP model.

  12. Rapid Forgetting Results from Competition over Time between Items in Visual Working Memory

    Science.gov (United States)

    Pertzov, Yoni; Manohar, Sanjay; Husain, Masud

    2017-01-01

    Working memory is now established as a fundamental cognitive process across a range of species. Loss of information held in working memory has the potential to disrupt many aspects of cognitive function. However, despite its significance, the mechanisms underlying rapid forgetting remain unclear, with intense recent debate as to whether it is…

  13. Learning Over Time: Using Rapid Prototyping Generative Analysis Experts and Reduction of Scope to Operationalize Design

    Science.gov (United States)

    2010-05-04

    ... learning over analysis. A broad review of design theory suggests that four techniques - rapid prototyping, generative analysis, use of experts, and

  14. Rapid detection of human parechoviruses in clinical samples by real-time PCR

    NARCIS (Netherlands)

    Benschop, Kimberley; Molenkamp, Richard; van der Ham, Alwin; Wolthers, Katja; Beld, Marcel

    2008-01-01

    BACKGROUND: Human parechoviruses (HPeVs) have been associated with severe conditions such as neonatal sepsis and meningitis in young children. Rapid identification of an infectious agent in such serious conditions in these patients is essential for adequate decision making regarding treatment and

  15. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    Science.gov (United States)

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  16. It is Time to Ban Rapid Weight Loss from Combat Sports.

    Science.gov (United States)

    Artioli, Guilherme G; Saunders, Bryan; Iglesias, Rodrigo T; Franchini, Emerson

    2016-11-01

    Most competitions in combat sports are divided into weight classes, theoretically allowing for fairer and more evenly contested disputes between athletes of similar body size, strength and agility. It has been well documented that most athletes, regardless of the combat sports discipline, reduce significant amounts of body weight in the days prior to competition to qualify for lighter weight classes. Rapid weight loss is characterised by the reduction of a significant amount of body weight (typically 2-10 %, although larger reductions are often seen) in a few days prior to weigh-in (mostly in the last 2-3 days) achieved by a combination of methods that include starvation, severe restriction of fluid intake and intentional sweating. In doing so, athletes try to gain a competitive advantage against lighter, smaller and weaker opponents. Such a drastic and rapid weight reduction is only achievable via a combination of aggressive strategies that lead to hypohydration and starvation. The negative impact of these procedures on health is well described in the literature. Although the impact of rapid weight loss on performance is debated, there remains robust evidence showing that rapid weight loss may not impair performance, and translates into an actual competitive advantage. In addition to the health and performance implications, rapid weight loss clearly breaches fair play and stands against the spirit of the sport because an athlete unwilling to compete having rapidly reduced weight would face unfair contests against opponents who are 'artificially' bigger and stronger. The World Anti-Doping Agency Code states that a prohibited method must meet at least two of the following criteria: (1) enhances performance; (2) endangers an athlete's health; and (3) violates the spirit of the sport. We herein argue that rapid weight loss clearly meets all three criteria and, therefore, should be banned from the sport. To quote the World Anti-Doping Agency Code, this would "protect

  17. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    Science.gov (United States)

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  18. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    Science.gov (United States)

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null-hypothesis, which avoids the costly computation of inversions. Additional tricks are a rearrangement of the order, when iterating through the different "omics" layers, and implementing this algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
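
    The correlation-based shortcut described in this record can be sketched as follows. This is a simplified single-pair illustration in Python, not the pulver implementation itself (which is written in R and C++ and reorders the loops over the omics layers): the outcome and the interaction regressor are residualized against the main effects, and the partial correlation between the residuals is converted into the usual t-statistic and p-value for the interaction coefficient.

        import numpy as np
        from scipy import stats

        def interaction_pvalue(y, x, z):
            """p-value of the interaction term in y ~ x + z + x*z, obtained from the
            partial correlation between y and x*z given the main effects."""
            n = len(y)
            w = x * z
            base = np.column_stack([np.ones(n), x, z])
            beta_y, *_ = np.linalg.lstsq(base, y, rcond=None)
            beta_w, *_ = np.linalg.lstsq(base, w, rcond=None)
            ry, rw = y - base @ beta_y, w - base @ beta_w
            r = np.corrcoef(ry, rw)[0, 1]
            df = n - 4                                   # intercept, x, z and interaction
            t = r * np.sqrt(df / (1.0 - r**2))
            return 2.0 * stats.t.sf(abs(t), df)

        rng = np.random.default_rng(0)
        x, z = rng.normal(size=500), rng.normal(size=500)
        y = 0.2 * x * z + rng.normal(size=500)
        print(interaction_pvalue(y, x, z))               # small p-value expected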

  19. A computer vision system for rapid search inspired by surface-based attention mechanisms from human perception.

    Science.gov (United States)

    Mohr, Johannes; Park, Jong-Han; Obermayer, Klaus

    2014-12-01

    Humans are highly efficient at visual search tasks by focusing selective attention on a small but relevant region of a visual scene. Recent results from biological vision suggest that surfaces of distinct physical objects form the basic units of this attentional process. The aim of this paper is to demonstrate how such surface-based attention mechanisms can speed up a computer vision system for visual search. The system uses fast perceptual grouping of depth cues to represent the visual world at the level of surfaces. This representation is stored in short-term memory and updated over time. A top-down guided attention mechanism sequentially selects one of the surfaces for detailed inspection by a recognition module. We show that the proposed attention framework requires little computational overhead (about 11 ms), but enables the system to operate in real-time and leads to a substantial increase in search efficiency. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. 5 CFR 831.703 - Computation of annuities for part-time service.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Computation of annuities for part-time... part-time service. (a) Purpose. The computational method in this section shall be used to determine the annuity for an employee who has part-time service on or after April 7, 1986. (b) Definitions. In this...

  1. Storm blueprints patterns for distributed real-time computation

    CERN Document Server

    Goetz, P Taylor

    2014-01-01

    A blueprints book with 10 different projects built in 10 different chapters which demonstrate the various use cases of storm for both beginner and intermediate users, grounded in real-world example applications.Although the book focuses primarily on Java development with Storm, the patterns are more broadly applicable and the tips, techniques, and approaches described in the book apply to architects, developers, and operations.Additionally, the book should provoke and inspire applications of distributed computing to other industries and domains. Hadoop enthusiasts will also find this book a go

  2. Macroprocessing is the computing design principle for the times

    CERN Multimedia

    2001-01-01

    In a keynote speech, Intel Corporation CEO Craig Barrett emphasized that "macroprocessing" provides innovative and cost effective solutions to companies that they can customize and scale to match their own data needs. Barrett showcased examples of macroprocessing implementations from business, government and the scientific community, which use the power of Intel Architecture and Oracle9i Real Application Clusters to build large complex and scalable database solutions. A testimonial from CERN explained how the need for high performance computing to perform scientific research on sub-atomic particles was accomplished by using clusters of Xeon processor-based servers.

  3. Real-time exposure fusion on a mobile computer

    CSIR Research Space (South Africa)

    Bachoo, AK

    2009-12-01

    Full Text Available information in these scenarios. An image captured using a short exposure time will not saturate bright image regions while an image captured with a long exposure time will show more detail in the dark regions. The pixel depth provided by most camera.... The auto exposure also creates strong blown-out highlights in the foreground (the grass patch). The short shutter time (Exposure 1) correctly exposes the grass while the long shutter time (Exposure 3) is able to correctly expose the camouflaged dummy...
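
    As a rough sketch of the exposure-fusion idea discussed in this record (a simplified single-scale weighting scheme, not the authors' real-time implementation), one can blend the exposure stack with per-pixel well-exposedness weights:

        import numpy as np

        def fuse_exposures(images, sigma=0.2):
            """Single-scale exposure fusion of grayscale images in [0, 1]: each pixel is
            weighted by how close its value is to mid-gray and the stack is blended."""
            stack = np.stack(images, axis=0)
            weights = np.exp(-0.5 * ((stack - 0.5) / sigma) ** 2) + 1e-12
            weights /= weights.sum(axis=0, keepdims=True)
            return (weights * stack).sum(axis=0)

        # Hypothetical short / medium / long exposures of the same scene.
        rng = np.random.default_rng(1)
        scene = rng.uniform(size=(4, 4))
        exposures = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
        print(fuse_exposures(exposures))

    A production implementation would typically work multi-scale (for example with Laplacian pyramids) and add contrast and saturation weights.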

  4. Time-resolved temperature measurements in a rapid compression machine using quantum cascade laser absorption in the intrapulse mode

    KAUST Repository

    Nasir, Ehson Fawad; Farooq, Aamir

    2016-01-01

    A temperature sensor based on the intrapulse absorption spectroscopy technique has been developed to measure in situ temperature time-histories in a rapid compression machine (RCM). Two quantum-cascade lasers (QCLs) emitting near 4.55μm and 4.89μm

  5. Rapid freeze-drying cycle optimization using computer programs developed based on heat and mass transfer models and facilitated by tunable diode laser absorption spectroscopy (TDLAS).

    Science.gov (United States)

    Kuu, Wei Y; Nail, Steven L

    2009-09-01

    Computer programs in FORTRAN were developed to rapidly determine the optimal shelf temperature, T(f), and chamber pressure, P(c), to achieve the shortest primary drying time. The constraint for the optimization is to ensure that the product temperature profile, T(b), is below the target temperature, T(target). Five percent mannitol was chosen as the model formulation. After obtaining the optimal sets of T(f) and P(c), each cycle was assigned with a cycle rank number in terms of the length of drying time. Further optimization was achieved by dividing the drying time into a series of ramping steps for T(f), in a cascading manner (termed the cascading T(f) cycle), to further shorten the cycle time. For the purpose of demonstrating the validity of the optimized T(f) and P(c), four cycles with different predicted lengths of drying time, along with the cascading T(f) cycle, were chosen for experimental cycle runs. Tunable diode laser absorption spectroscopy (TDLAS) was used to continuously measure the sublimation rate. As predicted, maximum product temperatures were controlled slightly below the target temperature of -25 degrees C, and the cascading T(f)-ramping cycle is the most efficient cycle design. In addition, the experimental cycle rank order closely matches with that determined by modeling.

  6. Computer-determined assay time based on preset precision

    International Nuclear Information System (INIS)

    Foster, L.A.; Hagan, R.; Martin, E.R.; Wachter, J.R.; Bonner, C.A.; Malcom, J.E.

    1994-01-01

    Most current assay systems for special nuclear materials (SNM) operate on the principle of a fixed assay time which provides acceptable measurement precision without sacrificing the required throughput of the instrument. Waste items to be assayed for SNM content can contain a wide range of nuclear material. Counting all items for the same preset assay time results in a wide range of measurement precision and wastes time at the upper end of the calibration range. A short sample count taken at the beginning of the assay could optimize the analysis time on the basis of the required measurement precision. To illustrate the technique of automatically determining the assay time, measurements were made with a segmented gamma scanner at the Plutonium Facility of Los Alamos National Laboratory with the assay time for each segment determined by counting statistics in that segment. Segments with very little SNM were quickly determined to be below the lower limit of the measurement range and the measurement was stopped. Segments with significant SNM were optimally assayed to the preset precision. With this method the total assay time for each item is determined by the desired preset precision. This report describes the precision-based algorithm and presents the results of measurements made to test its validity.
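
    The precision-based idea can be sketched with Poisson counting statistics: a short pre-count gives an estimate of the count rate R, and the time needed to reach a target relative standard deviation follows from sigma_rel = 1/sqrt(R*t). The numbers, names, and the time cap below are illustrative, not taken from the report.

        def required_assay_time(precount_counts, precount_time, target_rel_sd,
                                max_time=3600.0):
            """Counting time needed to reach a target relative standard deviation,
            assuming Poisson statistics (sigma_rel = 1 / sqrt(rate * time))."""
            rate = precount_counts / precount_time        # counts per second
            if rate <= 0.0:
                return max_time                           # nothing seen: cap the assay
            needed = 1.0 / (rate * target_rel_sd ** 2)
            return min(needed, max_time)

        # Example: 400 counts in a 10 s pre-count, 1 % target precision -> 250 s.
        print(required_assay_time(400, 10.0, 0.01))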

  7. Computer-controlled neutron time-of-flight spectrometer. Part II

    International Nuclear Information System (INIS)

    Merriman, S.H.

    1979-12-01

    A time-of-flight spectrometer for neutron inelastic scattering research has been interfaced to a PDP-15/30 computer. The computer is used for experimental data acquisition and analysis and for apparatus control. This report was prepared to summarize the functions of the computer and to act as a users' guide to the software system

  8. Rapid supersensitive laser-semiconductor monitoring system. Time period covered: Dec. 15, 1993 - Dec. 15, 1994

    International Nuclear Information System (INIS)

    Pugatch, V.M.

    2001-01-01

    The creation of a rapid and sensitive system for the determination of alpha-radioactivity in environmental samples was the main goal of Research Contract No. 7200RO/RB. As a result of the first stage of the research, accomplished in 1993, a prototype system based on the combination of laser photoionization mass spectrometry and a many-channel alpha-spectrometer was built and tested. To improve the sensitivity, it was proposed to add one more stage to the laser photoionization mass-spectrometer. To achieve high position sensitivity, it was proposed to include in the alpha-radiometer a Si strip-detector with submicron position sensitivity. Hardware and software for the laser-semiconductor monitoring system of alpha-radionuclides in the environment have been further developed and tested within the framework of IAEA Research Contract No. 7200/R1/RB. Optimization of the sample evaporation with one more stage of photoionization has been successfully performed in the laser photoionization mass-spectrometer. Automation of the measurement procedure is under way by means of an IBM PC-386 and specially designed electronic units. The evaluated sensitivity of the new set-up is in the range of 1.0 Bq/kg. A bulk measurement of the alpha-radioactivity concentration in soil samples from the Chernobyl region (100 km) has been performed by means of the thick-sample method and the alpha-radiometer built under this contract with large-area Si semiconductor detectors. The lowest detectable level was in the range of 100 Bq/kg without any radiochemical separation. Comparison with the data obtained for the same samples by means of the thin-sample method (with radiochemical separation) showed higher Pu-concentration values for the thick samples. For the first time, a Si strip-detector with 128 channels has been applied for alpha-radiometry purposes. Different read-out electronics (including the most

  9. Time domain phenomena of wave propagation in rapidly created plasma of periodic distribution

    International Nuclear Information System (INIS)

    Kuo, S P

    2007-01-01

    Theories, experiments and numerical simulations on the interaction of electromagnetic waves with rapidly created unmagnetized plasmas are presented. In the case that plasma is created uniformly, the frequency of a propagating electromagnetic wave is upshifted. An opposite propagation wave of the same frequency is also generated. In addition, a static current supporting a wiggler magnetic field is also produced in the plasma. When a spatially periodic structure is introduced to the rapidly created plasma, the theory and numerical simulation results show that both frequency-upshifted and downshifted waves are generated. If the plasma has a large but finite dimension in the incident wave propagation direction and is created rapidly rather than instantaneously, the frequency downshifted waves are found to be trapped by the plasma when the plasma frequency is larger than the wave frequency. The wave trapping results in accumulating the frequency-downshifted waves during the finite transient period of plasma creation. Indeed, in the experimental observations the frequency downshifted signals were detected repetitively with considerably enhanced spectral intensities, confirming the results of the numerical simulations. The missing of frequency upshifted signals in the experimental observations is explained by the modal field distributions in the periodic structure, indicating that the frequency upshifted modes experience heavier collisional damping of the plasma than the frequency downshifted modes

  10. Computation and evaluation of scheduled waiting time for railway networks

    DEFF Research Database (Denmark)

    Landex, Alex

    2010-01-01

    Timetables are affected by scheduled waiting time (SWT) that prolongs the travel times for trains and thereby passengers. SWT occurs when a train hinders another train from running at its desired speed. The SWT affects both the trains and the passengers in the trains. The passengers may be further affected due to longer transfer times to other trains. SWT can be estimated analytically for a given timetable or by simulation of timetables and/or plans of operation. The simulation of SWT has the benefit that it is possible to examine the entire network. This makes it possible to improve the future...

  11. A computer program for the estimation of time of death

    DEFF Research Database (Denmark)

    Lynnerup, N

    1993-01-01

    In the 1960s Marshall and Hoare presented a "Standard Cooling Curve" based on their mathematical analyses on the postmortem cooling of bodies. Although fairly accurate under standard conditions, the "curve" or formula is based on the assumption that the ambience temperature is constant... cooling of bodies is presented. It is proposed that by having a computer program that solves the equation, giving the length of the cooling period in response to a certain rectal temperature, and which allows easy comparison of multiple solutions, the uncertainties related to ambience temperature...
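
    For orientation only, the double-exponential cooling model alluded to above can be inverted numerically for the cooling period. The sketch below uses a Marshall-Hoare/Henssge-type formula with commonly quoted constants (assumed here for ambient temperatures up to roughly 23 °C and a constant ambience); it illustrates the computational idea and is not a forensically validated implementation.

        import math

        def cooling_q(t_hours, body_mass_kg):
            """Normalized temperature drop Q(t) of a double-exponential cooling model.
            The constants below are commonly quoted Henssge-type values (assumed)."""
            b = -1.2815 * body_mass_kg ** -0.625 + 0.0284
            return 1.25 * math.exp(b * t_hours) - 0.25 * math.exp(5.0 * b * t_hours)

        def estimate_interval(rectal_c, ambient_c, body_mass_kg, t_max=72.0):
            """Bisection for the cooling period (hours) matching the measured rectal
            temperature, assuming a constant ambient temperature."""
            q_target = (rectal_c - ambient_c) / (37.2 - ambient_c)
            lo, hi = 0.0, t_max
            for _ in range(100):
                mid = 0.5 * (lo + hi)
                if cooling_q(mid, body_mass_kg) > q_target:
                    lo = mid        # model still warmer than observed: interval is longer
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        print(round(estimate_interval(rectal_c=30.0, ambient_c=18.0, body_mass_kg=70.0), 1))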

  12. 42 CFR 93.509 - Computation of time.

    Science.gov (United States)

    2010-10-01

    ... holiday observed by the Federal government, in which case it includes the next business day. (b) When the... required or authorized under the rules in this part to be filed for good cause shown. When time permits...

  13. On some methods for improving time of reachability sets computation for the dynamic system control problem

    Science.gov (United States)

    Zimovets, Artem; Matviychuk, Alexander; Ushakov, Vladimir

    2016-12-01

    The paper presents two different approaches to reducing the computation time of reachability sets. The first approach uses different data structures for storing the reachability sets in computer memory for calculation in single-threaded mode. The second approach is based on parallel algorithms built on the data structures from the first approach. Within the framework of this paper, a parallel algorithm for approximate reachability set calculation on a computer with SMP architecture is proposed. The results of numerical modelling are presented in the form of tables, which demonstrate the high efficiency of parallel computing technology and also show how computing time depends on the data structure used.

  14. Effects of oncoming target velocities on rapid force production and accuracy of force production intensity and timing.

    Science.gov (United States)

    Ohta, Yoichi

    2017-12-01

    The present study aimed to clarify the effects of oncoming target velocities on the ability of rapid force production and accuracy and variability of simultaneous control of both force production intensity and timing. Twenty male participants (age: 21.0 ± 1.4 years) performed rapid gripping with a handgrip dynamometer to coincide with the arrival of an oncoming target by using a horizontal electronic trackway. The oncoming target velocities were 4, 8, and 12 m·s^-1, which were randomly produced. The grip force required was 30% of the maximal voluntary contraction. Although the peak force (Pf) and rate of force development (RFD) increased with increasing target velocity, the value of the RFD to Pf ratio was constant across the 3 target velocities. The accuracy of both force production intensity and timing decreased at higher target velocities. Moreover, the intrapersonal variability in temporal parameters was lower in the fast target velocity condition, but constant variability in 3 target velocities was observed in force intensity parameters. These results suggest that oncoming target velocity does not intrinsically affect the ability for rapid force production. However, the oncoming target velocity affects accuracy and variability of force production intensity and timing during rapid force production.

  15. Use of computer tomography and 3DP Rapid Prototyping technique in cranioplasty planning - analysis of accuracy of bone defect modeling

    International Nuclear Information System (INIS)

    Markowska, O.; Gardzinska, A.; Miechowicz, S.; Chrzan, R.; Urbanik, A.; Miechowicz, S.

    2009-01-01

    Background: Accuracy considerations for the alignment of a skull bone defect and an artificial implant model are presented. In standard surgical treatment, the application of prefabricated alloplastic implants requires complicated procedures during surgery, especially additional geometry processing to provide a better fit of the implant. Rapid Prototyping can be used as an effective tool to generate complex 3D medical models and to improve and simplify surgical treatment planning. The operation time can also be significantly reduced. The aim of the study is to analyse adjustment accuracy by measuring the fissure between the bone defect and the implant. Material/Methods: The 3D numerical model was obtained from CT imaging with a Siemens Sensation 10 CT scanner. The physical models were fabricated with 3DP Rapid Prototyping technology. The measurements were performed at predetermined points along the bone defect and implant borders. Results: The maximal width of the fissure between the bone defect and the implant was 1.8 mm and the minimal width was 0 mm. The average width was 0.714 mm, with a standard deviation of 0.663 mm. Conclusions: The accuracy of the 3DP technique is sufficient to create medical models in the selected field of medicine. Models created using RP methods may then be used to produce implants of biocompatible material, for example by vacuum casting. Use of the suggested method may shorten pre-surgery planning and surgery time. (authors)

  16. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.

  17. Rapid detection of undesired cosmetic ingredients by matrix-assisted laser desorption ionization time-of-flight mass spectrometry.

    Science.gov (United States)

    Ouyang, Jie; An, Dongli; Chen, Tengteng; Lin, Zhiwei

    2017-10-01

    In recent years, cosmetic industry profits soared due to the widespread use of cosmetics, which resulted in illicit manufacturers and products of poor quality. Therefore, the rapid and accurate detection of the composition of cosmetics has become crucial. At present, numerous methods, such as gas chromatography and liquid chromatography-mass spectrometry, were available for the analysis of cosmetic ingredients. However, these methods present several limitations, such as failure to perform comprehensive and rapid analysis of the samples. Compared with other techniques, matrix-assisted laser desorption ionization time-of-flight mass spectrometry offered the advantages of wide detection range, fast speed and high accuracy. In this article, we briefly summarized how to select a suitable matrix and adjust the appropriate laser energy. We also discussed the rapid identification of undesired ingredients, focusing on antibiotics and hormones in cosmetics.

  18. Newmark local time stepping on high-performance computing architectures

    KAUST Repository

    Rietmann, Max

    2016-11-25

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.
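
    For reference, the CFL restriction that motivates local time stepping ties the global explicit time step to the smallest element. Schematically, with C a scheme-dependent constant, h_min the smallest element size and c_max the fastest wave speed,

        \[ \Delta t \;\le\; C\,\frac{h_{\min}}{c_{\max}}, \]

    so a handful of locally refined elements can force the entire mesh onto a tiny time step unless the step is decoupled from the element size, which is what LTS does.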

  19. Newmark local time stepping on high-performance computing architectures

    Energy Technology Data Exchange (ETDEWEB)

    Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland); Institute of Geophysics, ETH Zurich (Switzerland); Grote, Marcus, E-mail: marcus.grote@unibas.ch [Department of Mathematics and Computer Science, University of Basel (Switzerland); Peter, Daniel, E-mail: daniel.peter@kaust.edu.sa [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland); Institute of Geophysics, ETH Zurich (Switzerland); Schenk, Olaf, E-mail: olaf.schenk@usi.ch [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland)

    2017-04-01

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  20. Newmark local time stepping on high-performance computing architectures

    KAUST Repository

    Rietmann, Max; Grote, Marcus; Peter, Daniel; Schenk, Olaf

    2016-01-01

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  1. Rapid antimicrobial susceptibility testing of clinical isolates by digital time-lapse microscopy

    DEFF Research Database (Denmark)

    Fredborg, M; Rosenvinge, F S; Spillum, E

    2015-01-01

    (168 antimicrobial agent-organism combinations) demonstrated 3.6 % minor, no major and 1.2 % very major errors of the oCelloScope system compared to conventional susceptibility testing, as well as a rapid and correct phenotypic detection of strains with methicillin-resistant Staphylococcus aureus (MRSA)......-to-result, enabling same-day targeted antimicrobial therapy, facilitating antibiotic stewardship and better patient management. A full-scale validation of the oCelloScope system including more isolates is necessary to assess the impact of using it for AST.

  2. Invariant set computation for constrained uncertain discrete-time systems

    NARCIS (Netherlands)

    Athanasopoulos, N.; Bitsoris, G.

    2010-01-01

    In this article a novel approach to the determination of polytopic invariant sets for constrained discrete-time linear uncertain systems is presented. First, the problem of stabilizing a prespecified initial condition set in the presence of input and state constraints is addressed. Second, the

  3. 10 CFR 13.27 - Computation of time.

    Science.gov (United States)

    2010-01-01

    ...; and (2) By 11:59 p.m. Eastern Time for a document served by the E-Filing system. [72 FR 49153, Aug. 28... the calculation of additional days when a participant is not entitled to receive an entire filing... same filing and service method, the number of days for service will be determined by the presiding...

  4. 10 CFR 2.306 - Computation of time.

    Science.gov (United States)

    2010-01-01

    ...:59 p.m. Eastern Time for a document served by the E-Filing system. [72 FR 49151, Aug. 28, 2007] ... the calculation of additional days when a participant is not entitled to receive an entire filing... filing and service method, the number of days for service will be determined by the presiding officer...

  5. Real time operating system for a nuclear power plant computer

    International Nuclear Information System (INIS)

    Alger, L.S.; Lala, J.H.

    1986-01-01

    A quadruply redundant synchronous fault tolerant processor (FTP) is now under fabrication at the C.S. Draper Laboratory to be used initially as a trip monitor for the Experimental Breeder Reactor EBR-II operated by the Argonne National Laboratory in Idaho Falls, Idaho. The real time operating system for this processor is described

  6. Conception and production of a time sharing system for a Mitra-15 CII mini-computer dedicated to APL

    International Nuclear Information System (INIS)

    Perrin, Rene

    1977-01-01

    The installation of a time-sharing system on a mini-computer poses several interesting problems. These technical problems are especially challenging when the goal is to equitably divide the physical resources of the machine amongst users of a high-level, conversational language like APL. Original solutions were necessary to retain the speed and performance of the original hardware and software. The system has been implemented in such a way that several users may simultaneously access logical resources, such as the library zones; their read/write requests are managed by semaphores, which may also be directly controlled by the APL programmer. (author) [fr

  7. The reliable solution and computation time of variable parameters Logistic model

    OpenAIRE

    Pengfei, Wang; Xinnong, Pan

    2016-01-01

    The reliable computation time (RCT, denoted Tc) of a double-precision computation of a variable-parameters logistic map (VPLM) is studied. First, using the proposed method, reliable solutions for the logistic map are obtained. Second, for a VPLM with time-dependent, non-stationary parameters, 10000 samples of reliable experiments are constructed and the mean Tc is computed. The results indicate that for each different initial value, the Tcs of the VPLM are generally different...
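
    A minimal sketch of the kind of experiment described here: iterate the logistic map in double precision and in much higher precision, and record the first step at which the two trajectories disagree beyond a tolerance. The parameter value, tolerance, and reference precision below are illustrative choices, not the paper's.

        from mpmath import mp, mpf

        def reliable_computation_time(r=3.99, x0=0.1, tol=1e-6, max_iter=2000):
            """First iteration at which float64 and 100-digit iterates of the logistic
            map x_{n+1} = r * x_n * (1 - x_n) differ by more than tol."""
            mp.dps = 100                            # high-precision reference trajectory
            x_double = float(x0)
            x_ref, r_ref = mpf(str(x0)), mpf(str(r))
            for n in range(1, max_iter + 1):
                x_double = r * x_double * (1.0 - x_double)
                x_ref = r_ref * x_ref * (1 - x_ref)
                if abs(x_double - float(x_ref)) > tol:
                    return n
            return max_iter

        print(reliable_computation_time())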

  8. Novel method of fabricating individual trays for maxillectomy patients by computer-aided design and rapid prototyping.

    Science.gov (United States)

    Huang, Zhi; Wang, Xin-zhi; Hou, Yue-Zhong

    2015-02-01

    Making impressions for maxillectomy patients is an essential but difficult task. This study developed a novel method to fabricate individual trays by computer-aided design (CAD) and rapid prototyping (RP) to simplify the process and enhance patient safety. Five unilateral maxillectomy patients were recruited for this study. For each patient, a computed tomography (CT) scan was taken. Based on the 3D surface reconstruction of the target area, an individual tray was manufactured by CAD/RP. With a conventional custom tray as control, two final impressions were made using the different types of tray for each patient. The trays were sectioned, and in each section the thickness of the material was measured at six evenly distributed points. Descriptive statistics and paired t-test were used to examine the difference of the impression thickness. SAS 9.3 was applied in the statistical analysis. Afterwards, all casts were then optically 3D scanned and compared digitally to evaluate the feasibility of this method. Impressions of all five maxillectomy patients were successfully made with individual trays fabricated by CAD/RP and traditional trays. The descriptive statistics of impression thickness measurement showed slightly more uneven results in the traditional trays, but no statistical significance was shown. A 3D digital comparison showed acceptable discrepancies within 1 mm in the majority of cast areas. The largest difference of 3 mm was observed in the buccal wall of the defective areas. Moderate deviations of 1 to 2 mm were detected in the buccal and labial vestibular groove areas. This study confirmed the feasibility of a novel method of fabricating individual trays by CAD/RP. Impressions made by individual trays manufactured using CAD/RP had a uniform thickness, with an acceptable level of accuracy compared to those made through conventional processes. © 2014 by the American College of Prosthodontists.

  9. Motor vehicle injuries in Qatar: time trends in a rapidly developing Middle Eastern nation

    Science.gov (United States)

    Al-Thani, Mohammed H; Al-Thani, Al-Anoud Mohammed; Sheikh, Javaid I; Lowenfels, Albert B

    2011-01-01

    Despite their wealth and modern road systems, traffic injury rates in Middle Eastern countries are generally higher than those in Western countries. The authors examined traffic injuries in Qatar during 2000–2010, a period of rapid population growth, focusing on the impact of speed control cameras installed in 2007 on overall injury rates and mortality. During the period 2000–2006, prior to camera installation, the mean (SD) vehicular injury death rate per 100 000 was 19.9±4.1. From 2007 to 2010, the mean (SD) vehicular death rates were significantly lower: 14.7±1.5 (p=0.028). Non-fatal severe injury rates also declined, but mild injury rates increased, perhaps because of increased traffic congestion and improved notification. It is possible that speed cameras decreased speeding enough to affect the death rate, without affecting overall injury rates. These data suggest that in a rapidly growing Middle Eastern country, photo enforcement (speed) cameras can be an important component of traffic control, but other measures will be required for maximum impact. PMID:21994881

  10. Motor vehicle injuries in Qatar: time trends in a rapidly developing Middle Eastern nation.

    Science.gov (United States)

    Mamtani, Ravinder; Al-Thani, Mohammed H; Al-Thani, Al-Anoud Mohammed; Sheikh, Javaid I; Lowenfels, Albert B

    2012-04-01

    Despite their wealth and modern road systems, traffic injury rates in Middle Eastern countries are generally higher than those in Western countries. The authors examined traffic injuries in Qatar during 2000-2010, a period of rapid population growth, focusing on the impact of speed control cameras installed in 2007 on overall injury rates and mortality. During the period 2000-2006, prior to camera installation, the mean (SD) vehicular injury death rate per 100,000 was 19.9±4.1. From 2007 to 2010, the mean (SD) vehicular death rates were significantly lower: 14.7±1.5 (p=0.028). Non-fatal severe injury rates also declined, but mild injury rates increased, perhaps because of increased traffic congestion and improved notification. It is possible that speed cameras decreased speeding enough to affect the death rate, without affecting overall injury rates. These data suggest that in a rapidly growing Middle Eastern country, photo enforcement (speed) cameras can be an important component of traffic control, but other measures will be required for maximum impact.

  11. A real-time loop-mediated isothermal amplification assay for rapid detection of Shigella species.

    Science.gov (United States)

    Liew, P S; Teh, C S J; Lau, Y L; Thong, K L

    2014-12-01

    Shigellosis is a foodborne illness caused by the genus Shigella and is an important global health issue. The development of effective techniques for rapid detection of this pathogen is essential for breaking the chain of transmission. Therefore, we have developed a novel loop-mediated isothermal amplification (LAMP) assay targeting the invasion plasmid antigen H (ipaH) gene to rapidly detect Shigella species. This assay could be performed in 90 min at an optimal temperature of 64°C, with endpoint results visualized directly. Notably, the method was found to be more sensitive than conventional PCR. Indeed, the detection limit for the LAMP assay on pure bacterial cultures was 5.9 × 10^5 CFU/ml, while PCR displayed a limit of 5.9 × 10^7 CFU/ml. In spiked lettuce samples, the sensitivity of the LAMP assay was 3.6 × 10^4 CFU/g, whereas PCR was 3.6 × 10^5 CFU/g. Overall, the assay accurately identified 32 Shigella spp., with one enteroinvasive Escherichia coli displaying a positive reaction while the remaining 32 non-Shigella strains tested were negative.

  12. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  13. Time-ordered product expansions for computational stochastic system biology

    International Nuclear Information System (INIS)

    Mjolsness, Eric

    2013-01-01

    The time-ordered product framework of quantum field theory can also be used to understand salient phenomena in stochastic biochemical networks. It is used here to derive Gillespie’s stochastic simulation algorithm (SSA) for chemical reaction networks; consequently, the SSA can be interpreted in terms of Feynman diagrams. It is also used here to derive other, more general simulation and parameter-learning algorithms including simulation algorithms for networks of stochastic reaction-like processes operating on parameterized objects, and also hybrid stochastic reaction/differential equation models in which systems of ordinary differential equations evolve the parameters of objects that can also undergo stochastic reactions. Thus, the time-ordered product expansion can be used systematically to derive simulation and parameter-fitting algorithms for stochastic systems. (paper)
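
    As a concrete anchor for the SSA mentioned in this record, a minimal direct-method implementation for a toy reaction network follows; the example network (A -> B, B -> A with mass-action propensities) and all names are purely illustrative.

        import random

        def gillespie_ssa(state, reactions, t_end, seed=0):
            """Direct-method SSA. state: dict of species counts; reactions: list of
            (propensity_fn(state) -> float, stoichiometry dict species -> change)."""
            rng = random.Random(seed)
            t, trajectory = 0.0, [(0.0, dict(state))]
            while t < t_end:
                props = [a(state) for a, _ in reactions]
                total = sum(props)
                if total <= 0.0:
                    break                               # no reaction can fire
                t += rng.expovariate(total)             # exponential waiting time
                pick, acc = rng.random() * total, 0.0
                for p, (_, stoich) in zip(props, reactions):
                    acc += p
                    if pick <= acc:
                        for species, change in stoich.items():
                            state[species] += change
                        break
                trajectory.append((t, dict(state)))
            return trajectory

        # Toy network: A -> B at rate 1.0*A, B -> A at rate 0.5*B.
        reactions = [(lambda s: 1.0 * s["A"], {"A": -1, "B": +1}),
                     (lambda s: 0.5 * s["B"], {"A": +1, "B": -1})]
        print(gillespie_ssa({"A": 100, "B": 0}, reactions, t_end=1.0)[-1])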

  14. Wake force computation in the time domain for long structures

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-07-01

    One is often interested in calculating the wake potentials for short bunches in long structures using TBCI. For ultra-relativistic particles it is sufficient to solve for the fields only over a window containing the bunch and moving along with it. This technique reduces both the memory and the running time required by a factor that equals the ratio of the structure length to the window length. For example, for a bunch with sigma/sub z/ of one picosecond traversing a single SLAC cell this improvement factor is 15. It is thus possible to solve for the wakefields in very long structures: for a given problem, increasing the structure length will not change the memory required while only adding linearly to the CPU time needed

  15. Real-time FPGA architectures for computer vision

    Science.gov (United States)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar

    2000-03-01

    This paper presents an architecture for real-time generic convolution of a mask and an image. The architecture is intended for fast low-level image processing. The FPGA-based architecture takes advantage of the availability of registers in FPGAs to implement an efficient and compact module to process the convolutions. The architecture is designed to minimize the number of accesses to the image memory and is based on parallel modules with internal pipeline operation in order to improve its performance. The architecture is prototyped in an FPGA, but it can be implemented on a dedicated VLSI to reach higher clock frequencies. Complexity issues, FPGA resource utilization, FPGA limitations, and real-time performance are discussed. Some results are presented and discussed.

  16. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
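
    The decorator-based workflow would look roughly like the sketch below. The decorator name hope.jit follows the package's documented usage, but the kernel shown is an arbitrary example, so treat the snippet as an assumption-laden illustration rather than verified HOPE code.

```python
import numpy as np
import hope  # HOPE package, assumed importable as `hope`

@hope.jit  # assumed decorator; triggers translation to C++ on the first call
def softened_inverse_square(m, x, soft):
    # Simple element-wise numerical kernel of the kind HOPE targets
    return m / (x * x + soft * soft)

a = softened_inverse_square(1.0, np.linspace(0.1, 10.0, 1000), 0.01)
```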

  17. Computational micromagnetics: prediction of time dependent and thermal properties

    International Nuclear Information System (INIS)

    Schrefl, T.; Scholz, W.; Suess, Dieter; Fidler, J.

    2001-01-01

    Finite element modeling treats magnetization processes on a length scale of several nanometers and thus gives a quantitative correlation between the microstructure and the magnetic properties of ferromagnetic materials. This work presents a novel finite element/boundary element micromagnetics solver that combines a wavelet-based matrix compression technique for magnetostatic field calculations with a BDF/GMRES method for the time integration of the Gilbert equation of motion. The simulations show that metastable energy minima and nonuniform magnetic states within the grains are important factors in the reversal dynamics at finite temperature. The numerical solution of the Gilbert equation shows how reversed domains nucleate and expand. The switching time of submicron magnetic elements depends on the shape of the elements. Elements with slanted ends decrease the overall reversal time, as a transverse demagnetizing field suppresses oscillations of the magnetization. Thermally activated processes can be included by adding a random thermal field to the effective magnetic field. Thermally assisted reversal was studied for CoCrPtTa thin-film media.
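
    As a concrete illustration of adding a random thermal field to the effective field, the sketch below takes explicit Euler steps of the Landau-Lifshitz-Gilbert equation with a Gaussian field added to the applied field. The parameter values and the simple single-spin update are placeholders; they do not reproduce the finite element/BDF/GMRES scheme described in the record.

```python
import numpy as np

def llg_step(m, h_eff, dt, gamma=2.21e5, alpha=0.1):
    """One explicit Euler step of the Landau-Lifshitz-Gilbert equation.
    m: unit magnetization vector, h_eff: effective field in A/m."""
    prefac = -gamma / (1.0 + alpha ** 2)
    mxh = np.cross(m, h_eff)
    dmdt = prefac * (mxh + alpha * np.cross(m, mxh))
    m_new = m + dt * dmdt
    return m_new / np.linalg.norm(m_new)        # keep |m| = 1

def thermal_field(std, rng):
    """Gaussian random field added to the effective field to model thermal activation."""
    return rng.normal(0.0, std, size=3)

rng = np.random.default_rng(0)
m = np.array([0.0, 0.0, 1.0])
h_applied = np.array([0.0, 0.0, -5e4])          # reversing field (A/m), placeholder value
for _ in range(1000):
    m = llg_step(m, h_applied + thermal_field(1e3, rng), dt=1e-13)
```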

  18. Rapid screening of β-Globin gene mutations by Real-Time PCR in ...

    African Journals Online (AJOL)

    The introduction of real-time PCR has revolutionized the time taken for PCR reactions. We present a method for the diagnosis of the common mutations of β-thalassemia in Egyptian children and their families. The procedure depends on real-time PCR using specific fluorescently labeled hybridization probes.

  19. Impacts of Earth rotation parameters on GNSS ultra-rapid orbit prediction: Derivation and real-time correction

    Science.gov (United States)

    Wang, Qianxin; Hu, Chao; Xu, Tianhe; Chang, Guobin; Hernández Moraleda, Alberto

    2017-12-01

    Analysis centers (ACs) for global navigation satellite systems (GNSSs) cannot accurately obtain real-time Earth rotation parameters (ERPs). Thus, the prediction of ultra-rapid orbits in the international terrestrial reference system (ITRS) has to utilize the predicted ERPs issued by the International Earth Rotation and Reference Systems Service (IERS) or the International GNSS Service (IGS). In this study, the accuracy of ERPs predicted by IERS and IGS is analyzed. The error of the ERPs predicted for one day can reach 0.15 mas in polar motion and 0.053 ms in the UT1-UTC direction. Then, the impact of ERP errors on ultra-rapid orbit prediction by GNSS is studied. The way ERP errors enter the orbit integration and frame transformation steps dominates the accuracy of the predicted orbit. Experimental results show that the transformation from the geocentric celestial reference system (GCRS) to ITRS exerts the strongest effect on the accuracy of the predicted ultra-rapid orbit. To obtain the most accurate predicted ultra-rapid orbit, a corresponding real-time orbit correction method is developed. First, orbits without ERP-related errors are predicted on the basis of the ITRS observed part of the ultra-rapid orbit for use as reference. Then, the corresponding predicted orbit is transformed from GCRS to ITRS to adjust for the predicted ERPs. Finally, the corrected ERPs with error slopes are re-introduced to correct the predicted orbit in ITRS. To validate the proposed method, three experimental schemes are designed: function extrapolation, simulation experiments, and experiments with predicted ultra-rapid orbits and international GNSS Monitoring and Assessment System (iGMAS) products. Experimental results show that using the proposed correction method with IERS products considerably improved the accuracy of ultra-rapid orbit prediction (except the geosynchronous BeiDou orbits). The accuracy of orbit prediction is enhanced by at least 50

  20. Time-Resolved Fluorescent Immunochromatography of Aflatoxin B1 in Soybean Sauce: A Rapid and Sensitive Quantitative Analysis.

    Science.gov (United States)

    Wang, Du; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Zhang, Wen

    2016-07-14

    Rapid and quantitative sensing of aflatoxin B1 with high sensitivity and specificity has drawn increasing attention in studies of soybean sauce. A sensitive and rapid quantitative immunochromatographic sensing method was developed for the detection of aflatoxin B1 based on time-resolved fluorescence. It combines the advantages of time-resolved fluorescent sensing and immunochromatography. The dynamic range of the competitive, portable immunoassay was 0.3-10.0 µg/kg, with a limit of detection (LOD) of 0.1 µg/kg and recoveries of 87.2%-114.3%, within 10 min. The results showed good correlation (R² > 0.99) between the time-resolved fluorescent immunochromatographic strip test and high performance liquid chromatography (HPLC). Soybean sauce samples analyzed using the time-resolved fluorescent immunochromatographic strip test revealed that 64.2% of samples contained aflatoxin B1 at levels ranging from 0.31 to 12.5 µg/kg. The strip test is a rapid, sensitive, quantitative, and cost-effective on-site screening technique in food safety analysis.
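
    The quantitative read-out of a competitive strip test of this kind is normally obtained from a calibration curve. The sketch below fits a four-parameter logistic (4PL) model to synthetic fluorescence ratios and back-calculates a concentration; the functional form and the made-up calibrator values are illustrative assumptions, not the paper's actual calibration routine.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4-parameter logistic: in a competitive assay the signal falls as analyte rises."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic calibrators spanning roughly the reported 0.3-10 ug/kg dynamic range
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])       # ug/kg (made up)
signal = np.array([0.98, 0.90, 0.70, 0.42, 0.20, 0.10])  # test/control ratios (made up)

popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 1.0, 2.0, 0.05])

def back_calculate(y, a, b, c, d):
    """Invert the 4PL curve to estimate concentration from a measured signal."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(back_calculate(0.5, *popt))   # estimated ug/kg for a sample giving signal 0.5
```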

  1. Influence of slice thickness of computed tomography and type of rapid prototyping on the accuracy of 3-dimensional medical model

    International Nuclear Information System (INIS)

    Um, Ki Doo; Lee, Byung Do

    2004-01-01

    This study evaluated the influence of the slice thickness of computed tomography (CT) and the type of rapid prototyping (RP) on the accuracy of 3-dimensional medical models. Transaxial CT data of a human dry skull were taken from a multi-detector spiral CT. Slice thicknesses were 1, 2, 3 and 4 mm, respectively. Three-dimensional image model reconstruction using 3-D visualization medical software (V-works 3.0) and RP model fabrication were followed. The 2 RP models were a 3D printing model (Z402, Z Corp., Burlington, USA) and a Stereolithographic Apparatus (SLA) model. Linear measurements of anatomical landmarks on the dry skull, the 3-D image model, and the 2 RP models were made and compared according to slice thickness and RP model type. The absolute relative error percentages between linear measurements of the dry skull and the image models of 1, 2, and 3 mm slice thickness were 0.97, 1.98, and 3.83, respectively. The absolute relative error percentage between linear measurements of the dry skull and the SLA model was 0.79. The absolute relative error difference between linear measurements of the dry skull and the 3D printing model was 2.52. These results indicate that a 3-dimensional image model with thin slice thickness and a stereolithographic RP model show relatively high accuracy.

  2. Influence of slice thickness of computed tomography and type of rapid prototyping on the accuracy of 3-dimensional medical model

    Energy Technology Data Exchange (ETDEWEB)

    Um, Ki Doo; Lee, Byung Do [Wonkwang University College of Medicine, Iksan (Korea, Republic of)

    2004-03-15

    This study evaluated the influence of the slice thickness of computed tomography (CT) and the type of rapid prototyping (RP) on the accuracy of 3-dimensional medical models. Transaxial CT data of a human dry skull were taken from a multi-detector spiral CT. Slice thicknesses were 1, 2, 3 and 4 mm, respectively. Three-dimensional image model reconstruction using 3-D visualization medical software (V-works 3.0) and RP model fabrication were followed. The 2 RP models were a 3D printing model (Z402, Z Corp., Burlington, USA) and a Stereolithographic Apparatus (SLA) model. Linear measurements of anatomical landmarks on the dry skull, the 3-D image model, and the 2 RP models were made and compared according to slice thickness and RP model type. The absolute relative error percentages between linear measurements of the dry skull and the image models of 1, 2, and 3 mm slice thickness were 0.97, 1.98, and 3.83, respectively. The absolute relative error percentage between linear measurements of the dry skull and the SLA model was 0.79. The absolute relative error difference between linear measurements of the dry skull and the 3D printing model was 2.52. These results indicate that a 3-dimensional image model with thin slice thickness and a stereolithographic RP model show relatively high accuracy.

  3. Innovative procedure for computer-assisted genioplasty: three-dimensional cephalometry, rapid-prototyping model and surgical splint.

    Science.gov (United States)

    Olszewski, R; Tranduy, K; Reychler, H

    2010-07-01

    The authors present a new procedure of computer-assisted genioplasty. They determined the anterior, posterior and inferior limits of the chin in relation to the skull and face with the newly developed and validated three-dimensional cephalometric planar analysis (ACRO 3D). Virtual planning of the osteotomy lines was carried out with Mimics (Materialize) software. The authors built a three-dimensional rapid-prototyping multi-position model of the chin area from a medical low-dose CT scan. The transfer of virtual information to the operating room consisted of two elements. First, the titanium plates on the 3D RP model were pre-bent. Second, a surgical guide for the transfer of the osteotomy lines and the positions of the screws to the operating room was manufactured. The authors present the first case of the use of this model on a patient. The postoperative results are promising, and the technique is fast and easy-to-use. More patients are needed for a definitive clinical validation of this procedure. Copyright 2010 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  4. Cone-beam computed tomography evaluation of dental, skeletal, and alveolar bone changes associated with bonded rapid maxillary expansion

    Directory of Open Access Journals (Sweden)

    Namrata Dogra

    2016-01-01

    Full Text Available Aims and Objectives: To evaluate skeletal changes in the maxilla and its surrounding structures, changes in the maxillary dentition, and maxillary alveolar bone changes produced by bonded rapid maxillary expansion (RME) using cone-beam computed tomography (CBCT). Materials and Methods: The sample consisted of 10 patients (6 males and 4 females) with an age range of 12 to 15 years treated with bonded RME. CBCT scans were performed at T1 (pretreatment) and at T2 (immediately after expansion) to evaluate the dental, skeletal, and alveolar bone changes. Results: RME treatment increased the overall skeletal parameters such as interorbital, zygomatic, nasal, and maxillary widths. Significant increases in buccal maxillary width were observed at the first premolar, second premolar, and first molar levels. There was a significant increase in arch width both on the palatal side and on the buccal side. Significant tipping of the right and left maxillary first molars was seen. There were significant reductions in buccal bone plate thickness and increases in palatal bone plate thickness. Conclusions: The total expansion achieved with RME was a combination of dental, skeletal, and alveolar bone changes. At the first molar level, 28.45% orthopedic, 16.03% alveolar bone bending, and 55.5% orthodontic changes were observed.

  5. A rapid, computational approach for assessing interfraction esophageal motion for use in stereotactic body radiation therapy planning

    Directory of Open Access Journals (Sweden)

    Michael L. Cardenas, MD

    2018-04-01

    Full Text Available Purpose: We present a rapid computational method for quantifying interfraction motion of the esophagus in patients undergoing stereotactic body radiation therapy on a magnetic resonance (MR)-guided radiation therapy system. Methods and materials: Patients who underwent stereotactic body radiation therapy had simulation computed tomography (CT) and on-treatment MR scans performed. The esophagus was contoured on each scan. CT contours were transferred to MR volumes via rigid registration. Digital Imaging and Communications in Medicine files containing contour points were exported to MATLAB. In-plane CT and MR contour points were spline interpolated, yielding boundaries with centroid positions C_CT and C_MR. MR contour points lying outside of the CT contour were extracted. For each such point, B_MR(j), a segment from C_CT intersecting B_MR(j) was produced; its intersection with the CT contour, B_CT(i), was calculated. The length of the segment S_ij, between B_CT(i) and B_MR(j), was found. The orientation θ was calculated from the S_ij vector components: θ = arctan(S_ij,y / S_ij,x). A set of segments {S_ij} was produced for each slice and binned by quadrant with 0° < θ ≤ 90°, 90° < θ ≤ 180°, 180° < θ ≤ 270°, and 270° < θ ≤ 360° for the left anterior, right anterior, right posterior, and left posterior quadrants, respectively. Slices were binned into upper, middle, and lower esophageal (LE) segments. Results: Seven patients, each having 3 MR scans, were evaluated, yielding 1629 axial slices and 84,716 measurements. The LE segment exhibited the greatest magnitude of motion. The mean LE measurements in the left anterior, left posterior, right anterior, and right posterior were 5.2 ± 0.07 mm, 6.0 ± 0.09 mm, 4.8 ± 0.08 mm, and 5.1 ± 0.08 mm, respectively. There was considerable interpatient variability. Conclusions: The LE segment exhibited the greatest magnitude of mobility compared with the
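
    The orientation and quadrant binning step described above reduces to taking the planar angle of each segment with atan2 and mapping it onto the four anatomical quadrants. The sketch below shows that step in isolation, with synthetic segment components standing in for the contour-derived S_ij vectors.

```python
import numpy as np

QUADRANTS = ["left anterior", "right anterior", "right posterior", "left posterior"]

def quadrant(sx, sy):
    """Map a segment's planar components onto the convention used above:
    0-90 deg LA, 90-180 RA, 180-270 RP, 270-360 LP."""
    theta = np.degrees(np.arctan2(sy, sx)) % 360.0
    return QUADRANTS[int(theta // 90.0) % 4]

# Synthetic segment components (mm), not taken from the study
segments = [(3.0, 4.0), (-2.0, 5.0), (-4.0, -1.0), (2.0, -6.0)]
for sx, sy in segments:
    length = np.hypot(sx, sy)           # segment length |S_ij|
    print(f"{length:4.1f} mm -> {quadrant(sx, sy)}")
```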

  6. Applications of parallel computer architectures to the real-time simulation of nuclear power systems

    International Nuclear Information System (INIS)

    Doster, J.M.; Sills, E.D.

    1988-01-01

    In this paper the authors report on efforts to utilize parallel computer architectures for the thermal-hydraulic simulation of nuclear power systems and on current research toward the development of advanced reactor operator aids and control systems based on this new technology. Many aspects of reactor thermal-hydraulic calculations are inherently parallel, and the computationally intensive portions of these calculations can be effectively implemented on modern computers. Timing studies indicate faster-than-real-time, high-fidelity physics models can be developed when the computational algorithms are designed to take advantage of the computer's architecture. These capabilities allow for the development of novel control systems and advanced reactor operator aids. Coupled with an integral real-time data acquisition system, evolving parallel computer architectures can provide operators and control room designers improved control and protection capabilities. Research efforts are currently under way in this area.

  7. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Pitsianis, N; Yin, FF; Ren, L

    2015-01-01

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub
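
    The precompute-and-reuse pattern described in the Methods, separating the static projection geometry from the dynamic volume data, can be illustrated with an ordinary sparse system matrix: build the matrix once from the geometry, then reuse it for forward and back-projection in every iteration. The toy axis-aligned parallel-beam geometry below is far simpler than the ray-casting and FDK modules in the abstract, so read it only as a sketch of the pattern.

```python
import numpy as np
from scipy import sparse

def build_projector(n):
    """Static part: sparse matrix of line integrals along rows and columns
    of an n x n image (axis-aligned parallel beams). Built once from geometry."""
    rows = []
    for i in range(n):                          # horizontal rays: sum of row i
        a = np.zeros((n, n)); a[i, :] = 1.0
        rows.append(a.ravel())
    for j in range(n):                          # vertical rays: sum of column j
        a = np.zeros((n, n)); a[:, j] = 1.0
        rows.append(a.ravel())
    return sparse.csr_matrix(np.array(rows))

n = 32
A = build_projector(n)                          # pre-computed once, reused below

truth = np.zeros((n, n)); truth[8:24, 8:24] = 1.0   # stand-in for the imaged object
measured = A @ truth.ravel()                        # simulated on-board projections

volume = np.zeros(n * n)                            # dynamic part, updated each iteration
step = 1.0 / (2.0 * n)                              # small, stable gradient step
for _ in range(50):                                 # mock iterative reconstruction loop
    residual = A @ volume - measured                # forward projection reuses A
    volume -= step * (A.T @ residual)               # back-projection reuses A.T
```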

  8. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States); Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub

  9. Comprehensive and Rapid Real-Time PCR Analysis of 21 Foodborne Outbreaks

    Directory of Open Access Journals (Sweden)

    Hiroshi Fukushima

    2009-01-01

    Full Text Available A set of four duplex SYBR Green I PCR (SG-PCR) assays combined with DNA extraction using the QIAamp DNA Stool Mini kit was evaluated for the detection of foodborne bacteria from 21 foodborne outbreaks. The causative pathogens were detected in almost all cases in 2 hours or less. The first run was for the detection of 8 main foodborne pathogens in 5 stool specimens within 2 hours, and the second run was for the detection of other unusual suspect pathogens within a further 45 minutes. After 2 to 4 days, the causative agents were isolated and identified. The results proved that, for comprehensive and rapid molecular diagnosis in foodborne outbreaks, the duplex SG-PCR assay is not only very useful but also economically viable for one-step differentiation of causative pathogens in fecal specimens obtained from symptomatic patients. This then allows for effective diagnosis and management of foodborne outbreaks.

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. Development and Validation of a Real-Time PCR Assay for Rapid Detection of Candida auris from Surveillance Samples.

    Science.gov (United States)

    Leach, L; Zhu, Y; Chaturvedi, S

    2018-02-01

    Candida auris is an emerging multidrug-resistant yeast causing invasive health care-associated infection with high mortality worldwide. Rapid identification of C. auris is of primary importance for the implementation of public health measures to control the spread of infection. To achieve these goals, we developed and validated a TaqMan-based real-time PCR assay targeting the internal transcribed spacer 2 (ITS2) region of the ribosomal gene. The assay was highly specific, reproducible, and sensitive, with a detection limit of 1 C. auris CFU per PCR reaction. The performance of the C. auris real-time PCR assay was evaluated by using 623 surveillance samples, including 365 patient swabs and 258 environmental sponges. Real-time PCR yielded positive results from 49 swab and 58 sponge samples, with 89% and 100% clinical sensitivity with regard to their respective culture-positive results. The real-time PCR also detected C. auris DNA from 1% and 12% of swab and sponge samples with culture-negative results, indicating the presence of dead or culture-impaired C. auris. The real-time PCR yielded results within 4 h of sample processing, compared to 4 to 14 days for culture, reducing turnaround time significantly. The new real-time PCR assay allows for accurate and rapid screening of C. auris and can increase effective control and prevention of this emerging multidrug-resistant fungal pathogen in health care facilities. Copyright © 2018 Leach et al.

  12. The relative timing between eye and hand rapid sequential pointing is affected by time pressure, but not by advance knowledge

    NARCIS (Netherlands)

    Deconinck, F.; van Polanen, V.; Savelsbergh, G.J.P.; Bennett, S.

    2011-01-01

    The present study examined the effect of timing constraints and advance knowledge on eye-hand coordination strategy in a sequential pointing task. Participants were required to point at two successively appearing targets on a screen while the inter-stimulus interval (ISI) and the trial order were

  13. Rapid Detection and Differentiation of Clonorchis sinensis and Opisthorchis viverrini Using Real-Time PCR and High Resolution Melting Analysis

    OpenAIRE

    Cai, Xian-Quan; Yu, Hai-Qiong; Li, Rong; Yue, Qiao-Yun; Liu, Guo-Hua; Bai, Jian-Shan; Deng, Yan; Qiu, De-Yi; Zhu, Xing-Quan

    2014-01-01

    Clonorchis sinensis and Opisthorchis viverrini are both important fish-borne pathogens, causing serious public health problem in Asia. The present study developed an assay integrating real-time PCR and high resolution melting (HRM) analysis for the specific detection and rapid identification of C. sinensis and O. viverrini. Primers targeting COX1 gene were highly specific for these liver flukes, as evidenced by the negative amplification of closely related trematodes. Assays using genomic DNA...

  14. The use of newly developed real-time PCR for the rapid identification of bacteria in culture-negative osteomyelitis.

    Science.gov (United States)

    Kobayashi, Naomi; Bauer, Thomas W; Sakai, Hiroshige; Togawa, Daisuke; Lieberman, Isador H; Fujishiro, Takaaki; Procop, Gary W

    2006-12-01

    We report a case of culture-negative osteomyelitis in which our newly developed real-time polymerase chain reaction (PCR) assay could differentiate Staphylococcus aureus from Staphylococcus epidermidis. This is the first report describing the application of this novel assay to an orthopedic clinical sample. This assay may be useful for other culture-negative clinical cases in combination with a broad-spectrum assay as a rapid microorganism identification method.

  15. On-site identification of meat species in processed foods by a rapid real-time polymerase chain reaction system.

    Science.gov (United States)

    Furutani, Shunsuke; Hagihara, Yoshihisa; Nagai, Hidenori

    2017-09-01

    Correct labeling of foods is critical for consumers who wish to avoid a specific meat species for religious or cultural reasons. Therefore, gene-based point-of-care food analysis by real-time Polymerase Chain Reaction (PCR) is expected to contribute to quality control in the food industry. In this study, we perform rapid identification of meat species with our portable rapid real-time PCR system, following a very simple DNA extraction method. Applying these techniques, we correctly identified beef, pork, chicken, rabbit, horse, and mutton in processed foods in 20 min. Our system was sensitive enough to detect the interfusion of about 0.1% chicken egg-derived DNA in a processed food sample. Our rapid real-time PCR system is expected to contribute to quality control in food industries because it can be applied to the identification of meat species, and future applications can expand its functionality to the detection of genetically modified organisms or mutations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. The Great Transformations of Tibet and Xinjiang: a comparative analysis of rapid labour transitions in times of rapid growth in two contested minority regions of China

    NARCIS (Netherlands)

    A.M. Fischer (Andrew Martín)

    2011-01-01

    Rapid growth since the mid-1990s in the Tibetan and Uyghur areas in Western China has been associated with the rapid transition of the local (mostly Tibetan and Uyghur) labour forces out of the primary sector (mostly farming and herding) and into the tertiary sector (services). The TAR,

  17. The rapid use of gender information: evidence of the time course of pronoun resolution from eyetracking.

    Science.gov (United States)

    Arnold, J E; Eisenband, J G; Brown-Schmidt, S; Trueswell, J C

    2000-07-14

    Eye movements of listeners were monitored to investigate how gender information and accessibility influence the initial processes of pronoun interpretation. Previous studies on this issue have produced mixed results, and several studies have concluded that gender cues are not automatically used during the early processes of pronoun interpretation (e.g. Garnham, A., Oakhill, J. & Cruttenden, H. (1992). The role of implicit causality and gender cue in the interpretation of pronouns. Language and Cognitive Processes, 73 (4), 231-255; Greene, S. B., McKoon, G. & Ratcliff, R. (1992). Pronoun resolution and discourse models. Journal of Experimental Psychology: Learning, Memory, and Cognition, 182, 266-283). In the two experiments presented here, participants viewed a picture with two familiar cartoon characters of either the same or different gender. They listened to a text describing the picture, in which a pronoun referred to either the first, more accessible, character, or the second. (For example, Donald is bringing some mail to [Mickey/Minnie] while a violent storm is beginning. He's carrying an umbrella…) The results of both experiments show rapid use of both gender and accessibility at approximately 200 ms after the pronoun offset.

  18. Micropower Impulse Radar: A Novel Technology for Rapid, Real-Time Detection of Pneumothorax

    Directory of Open Access Journals (Sweden)

    Phillip D. Levy

    2011-01-01

    Full Text Available Pneumothorax detection in emergency situations must be rapid and at the point of care. Current standards for detection of a pneumothorax are supine chest X-rays, ultrasound, and CT scans. Unfortunately these tools, and the personnel necessary for their facile utilization, may not be readily available in acute circumstances, particularly those which occur in the pre-hospital setting. The decision to treat, therefore, is often made without adequate information. In this report, we describe a novel hand-held device that utilizes Micropower Impulse Radar to reliably detect the presence of a pneumothorax. The technology employs ultra-wideband pulses over a frequency range of 500 MHz to 6 GHz, and a proprietary algorithm analyzes return echoes to determine if a pneumothorax is present, with no user interpretation required. The device has been evaluated in both trauma and surgical environments with a sensitivity of 93% and a specificity of 85%. It has the CE Mark and is available for sale in Europe. Post-market studies are planned starting in May of 2011. Clinical studies to support the FDA submission will be completed in the first quarter of 2012.

  19. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  20. Time-Frequency Analysis of Terahertz Radar Signals for Rapid Heart and Breath Rate Detection

    National Research Council Canada - National Science Library

    Massar, Melody L

    2008-01-01

    We develop new time-frequency analytic techniques which facilitate the detection of a person's heart and breath rates from the Doppler shift the movement of their body induces in a terahertz radar signal...

  1. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers.

    Science.gov (United States)

    Singh, Tarundeep; Roy, Pritam; Jamir, Limalemla; Gupta, Saurav; Kaur, Navpreet; Jain, D K; Kumar, Rajesh

    2016-01-01

    A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Using multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data was transmitted daily to a central server using wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as more than 10% of the total annual household expenditure on healthcare. Chi square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. Average expenditure in government health facilities was 16.6% less

  2. Assessment of Universal Healthcare Coverage in a District of North India: A Rapid Cross-Sectional Survey Using Tablet Computers.

    Directory of Open Access Journals (Sweden)

    Tarundeep Singh

    Full Text Available A rapid survey was carried out in Shaheed Bhagat Singh Nagar District of Punjab state in India to ascertain health seeking behavior and out-of-pocket health expenditures. Using multistage cluster sampling design, 1,008 households (28 clusters x 36 households in each cluster) were selected proportionately from urban and rural areas. Households were selected through a house-to-house survey during April and May 2014 whose members had (a) experienced illness in the past 30 days, (b) had illness lasting longer than 30 days, (c) were hospitalized in the past 365 days, or (d) had women who were currently pregnant or experienced childbirth in the past two years. In these selected households, trained investigators, using a tablet computer-based structured questionnaire, enquired about the socio-demographics, nature of illness, source of healthcare, and healthcare and household expenditure. The data was transmitted daily to a central server using wireless communication network. Mean healthcare expenditures were computed for various health conditions. Catastrophic healthcare expenditure was defined as more than 10% of the total annual household expenditure on healthcare. Chi square test for trend was used to compare catastrophic expenditures on hospitalization between households classified into expenditure quartiles. The mean monthly household expenditure was 15,029 Indian Rupees (USD 188.2). Nearly 14.2% of the household expenditure was on healthcare. Fever, respiratory tract diseases, gastrointestinal diseases were the common acute illnesses, while heart disease, diabetes mellitus, and respiratory diseases were the more common chronic diseases. Hospitalizations were mainly due to cardiovascular diseases, gastrointestinal problems, and accidents. Only 17%, 18%, 20% and 31% of the healthcare for acute illnesses, chronic illnesses, hospitalizations and childbirth was sought in the government health facilities. Average expenditure in government health facilities was

  3. Integration of Simulink, MARTe and MDSplus for rapid development of real-time applications

    Energy Technology Data Exchange (ETDEWEB)

    Manduchi, G., E-mail: gabriele.manduchi@igi.cnr.it [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Padova (Italy); Luchetta, A.; Taliercio, C. [Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA), Padova (Italy); Neto, A.; Sartori, F. [Fusion for Energy, Barcelona (Spain); De Tommasi, G. [Fusion for Energy, Barcelona (Spain); Consorzio CREATE/DIETI, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy)

    2015-10-15

    Highlights: • The integration of two frameworks for real-time control and data acquisition is described. • The integration may significantly speed up the development of system components. • The system also includes a code generator for the integration of code written in Simulink. • A real-time control system can be implemented without the need to write any line of code. - Abstract: Simulink is a graphical data flow programming tool for modeling and simulating dynamic systems. A component of Simulink, called Simulink Coder, generates C code from Simulink diagrams. MARTe is a framework for the implementation of real-time systems, currently in use in several fusion experiments. MDSplus is a framework widely used in the fusion community for the management of data. The three systems provide a solution to different facets of the same process, that is, real-time plasma control development. Simulink diagrams will describe the algorithms used in control, which will be implemented as MARTe GAMs and which will use parameters read from and produce results written to MDSplus pulse files. The three systems have been integrated in order to provide a tool suitable to speed up the development of real-time control applications. In particular, it will be shown how from a Simulink diagram describing a given algorithm to be used in a control system, it is possible to generate in an automated way the corresponding MARTe and MDSplus components that can be assembled to implement the target system.

  4. Integration of Simulink, MARTe and MDSplus for rapid development of real-time applications

    International Nuclear Information System (INIS)

    Manduchi, G.; Luchetta, A.; Taliercio, C.; Neto, A.; Sartori, F.; De Tommasi, G.

    2015-01-01

    Highlights: • The integration of two frameworks for real-time control and data acquisition is described. • The integration may significantly speed up the development of system components. • The system also includes a code generator for the integration of code written in Simulink. • A real-time control system can be implemented without the need to write any line of code. - Abstract: Simulink is a graphical data flow programming tool for modeling and simulating dynamic systems. A component of Simulink, called Simulink Coder, generates C code from Simulink diagrams. MARTe is a framework for the implementation of real-time systems, currently in use in several fusion experiments. MDSplus is a framework widely used in the fusion community for the management of data. The three systems provide a solution to different facets of the same process, that is, real-time plasma control development. Simulink diagrams will describe the algorithms used in control, which will be implemented as MARTe GAMs and which will use parameters read from and produce results written to MDSplus pulse files. The three systems have been integrated in order to provide a tool suitable to speed up the development of real-time control applications. In particular, it will be shown how from a Simulink diagram describing a given algorithm to be used in a control system, it is possible to generate in an automated way the corresponding MARTe and MDSplus components that can be assembled to implement the target system.

  5. Multiscale Space-Time Computational Methods for Fluid-Structure Interactions

    Science.gov (United States)

    2015-09-13

    Topics reported include space-time slip interface (ST-SI) computational analysis of a vertical-axis wind turbine, thermo-fluid analysis of a ground vehicle and its tires, multiscale compressible-flow computation with particle tracking, and space-time VMS computation of wind-turbine rotor and tower aerodynamics (Tezduyar, Spenser McIntyre, Nikolay Kostov, Ryan Kolesar, Casey Habluetzel).

  6. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impact of process parameters on the adsorption capacity was identified rapidly: the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Evaluation of the rapid and slow maxillary expansion using cone-beam computed tomography: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Juliana da S. Pereira

    Full Text Available ABSTRACT OBJECTIVE: The aim of this randomized clinical trial was to evaluate the dental, dentoalveolar, and skeletal changes occurring right after rapid maxillary expansion (RME) and slow maxillary expansion (SME) treatment using a Haas-type expander. METHODS: All subjects underwent cone-beam computed tomography (CBCT) before installation of the expanders (T1) and right after screw stabilization (T2). Patients who did not follow the research parameters were excluded. The final sample consisted of 21 patients in the RME group (mean age of 8.43 years) and 16 patients in the SME group (mean age of 8.70 years). Based on the skewness and kurtosis statistics, the variables were judged to be normally distributed, and the paired t-test and Student's t-test were performed at a significance level of 5%. RESULTS: The intermolar angle changed significantly due to treatment, and RME showed greater buccal tipping than SME. RME showed significant changes in four other measurements due to treatment: the maxilla moved forward and the mandible showed backward rotation, and at the transversal level both skeletal and dentoalveolar measurements showed significant changes due to maxillary expansion. SME showed significant dentoalveolar changes due to maxillary expansion. CONCLUSIONS: Only the intermolar angle showed a significant difference between the two modalities of maxillary expansion, with greater buccal tipping for RME. Also, RME produced skeletal maxillary expansion and SME did not. Both maxillary expansion modalities were efficient in promoting transversal gain at the dentoalveolar level. Sagittal and vertical measurements did not show differences between groups, but RME promoted a forward movement of the maxilla and backward rotation of the mandible.

  8. Rapid estimation of the vertebral body volume: a combination of the Cavalieri principle and computed tomography images

    International Nuclear Information System (INIS)

    Odaci, Ersan; Sahin, Buenyamin; Sonmez, Osman Fikret; Kaplan, Sueleyman; Bas, Orhan; Bilgic, Sait; Bek, Yueksel; Erguer, Hayati

    2003-01-01

    Objective: The exact volume of the vertebral body is necessary for the evaluation, treatment and surgical management of the related vertebral body. In this way, volume changes of the vertebral body can be monitored in conditions such as infectious diseases of the vertebra and traumatic or non-traumatic fractures and deformities of the spine. Several studies have assessed vertebral body size based on different criteria of the spine using different techniques. However, we have not found any detailed study in the literature describing the combination of the Cavalieri principle and vertebral body volume estimation. Materials and methods: In the present study we describe a rapid, simple, accurate and practical technique for estimating the volume of the vertebral body. Two specimens including ten lumbar vertebrae were taken from cadavers and were scanned in axial, sagittal and coronal section planes by a computed tomography (CT) machine. Consecutive sections of 5 and 3 mm thickness were used to estimate the total volume of the vertebral bodies by means of the Cavalieri principle. Furthermore, to evaluate inter-observer differences, the volume estimations were carried out by three performers. Results: There were no significant differences between the performers' estimates and the real volumes of the vertebral bodies (P>0.05), or between the performers' volume estimates (P>0.05). The section thickness and the section planes did not affect the accuracy of the estimates (P>0.05). A high correlation was seen between the performers' estimates and the real volumes of the vertebral bodies (r=0.881). Conclusion: We concluded that the combination of CT scanning with the Cavalieri principle is a direct and accurate technique that can be safely applied to estimate the volume of the vertebral body, with a mean workload of 5 min 11 s per vertebra.
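
    The Cavalieri estimator itself is simply the section spacing multiplied by the summed cross-sectional areas. A minimal sketch of that calculation is given below; the section areas are hypothetical values, not data from the study.

```python
# Cavalieri principle: V ~= t * sum(A_i), where t is the distance between
# consecutive sections and A_i is the cross-sectional area measured on section i.
# The areas below are hypothetical, not taken from the study.

section_thickness_cm = 0.5                  # 5 mm CT sections
areas_cm2 = [3.8, 4.6, 5.1, 5.0, 4.4, 3.6]  # measured (or point-counted) areas

volume_cm3 = section_thickness_cm * sum(areas_cm2)
print(f"Estimated vertebral body volume: {volume_cm3:.1f} cm^3")

# With point counting, each area is a_p * P_i (a_p = area associated with one
# grid point, P_i = points hitting the section), so V ~= t * a_p * sum(P_i).
```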

  9. Influence of time presetting procedure for rapid local heating on brazing temperature conditions

    International Nuclear Information System (INIS)

    Lezhnin, G.P.; Tul'skikh, V.E.

    1985-01-01

    A comparison of known and proposed procedures for presetting the heating time during induction brazing was carried out. It is shown that the brazing time must be established with allowance for heat propagation during heating in order to obtain the assigned joint temperature regardless of changes in the heating rate. Methods for calculating the temperature in assigned zones of the joint are suggested. The proposed procedure for presetting the heating time was applied to induction vacuum brazing of a tube of 12Kh18N10T steel to a pipe connection of VT20 alloy.

  10. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.

  11. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH, a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that

  12. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Full Text Available Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D tree models because the tree model itself has a very complicated structure and many sophisticated factors need to be considered during the simulation. Though there are some works on simulating 3D trees and their motion, few of them are used in computer games due to the high demand for real-time performance in computer games. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees under blowing wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the high requirement for real time in computer games.

  13. Ubiquitous computing technology for just-in-time motivation of behavior change.

    Science.gov (United States)

    Intille, Stephen S

    2004-01-01

    This paper describes a vision of health care where "just-in-time" user interfaces are used to transform people from passive to active consumers of health care. Systems that use computational pattern recognition to detect points of decision, behavior, or consequences automatically can present motivational messages to encourage healthy behavior at just the right time. Further, new ubiquitous computing and mobile computing devices permit information to be conveyed to users at just the right place. In combination, computer systems that present messages at the right time and place can be developed to motivate physical activity and healthy eating. Computational sensing technologies can also be used to measure the impact of the motivational technology on behavior.

  14. Rapid quantification of semen hepatitis B virus DNA by real-time polymerase chain reaction

    Science.gov (United States)

    Qian, Wei-Ping; Tan, Yue-Qiu; Chen, Ying; Peng, Ying; Li, Zhi; Lu, Guang-Xiu; Lin, Marie C.; Kung, Hsiang-Fu; He, Ming-Ling; Shing, Li-Ka

    2005-01-01

    AIM: To examine the sensitivity and accuracy of real-time polymerase chain reaction (PCR) for the quantification of hepatitis B virus (HBV) DNA in semen. METHODS: Hepatitis B viral DNA was isolated from HBV carriers' semen and sera using the phenol extraction method and the QIAamp DNA blood mini kit (Qiagen, Germany). HBV DNA was detected by conventional PCR and quantified by TaqMan technology-based real-time PCR (quantitative polymerase chain reaction, qPCR). The detection threshold per reaction was 200 copies of HBV DNA for conventional PCR and 10 copies of HBV DNA for real-time PCR. RESULTS: Both the phenol extraction method and the QIAamp DNA blood mini kit were suitable for isolating HBV DNA from semen. The detection threshold was 500 copies of HBV DNA per mL of semen. The viral loads were 7.5 × 10^7 and 1.67 × 10^7 copies of HBV DNA per mL in two HBV-infected patients' sera, and 2.14 × 10^5 and 3.02 × 10^5 copies of HBV DNA per mL in their semen. CONCLUSION: Real-time PCR is a more sensitive and accurate method to detect and quantify HBV DNA in semen. PMID:16149152

  15. Software for rapid time dependent ChIP-sequencing analysis (TDCA).

    Science.gov (United States)

    Myschyshyn, Mike; Farren-Dai, Marco; Chuang, Tien-Jui; Vocadlo, David

    2017-11-25

    Chromatin immunoprecipitation followed by DNA sequencing (ChIP-seq) and associated methods are widely used to define the genome wide distribution of chromatin associated proteins, post-translational epigenetic marks, and modifications found on DNA bases. An area of emerging interest is to study time dependent changes in the distribution of such proteins and marks by using serial ChIP-seq experiments performed in a time resolved manner. Despite such time resolved studies becoming increasingly common, software to facilitate analysis of such data in a robust automated manner is limited. We have designed software called Time-Dependent ChIP-Sequencing Analyser (TDCA), which is the first program to automate analysis of time-dependent ChIP-seq data by fitting to sigmoidal curves. We provide users with guidance for experimental design of TDCA for modeling of time course (TC) ChIP-seq data using two simulated data sets. Furthermore, we demonstrate that this fitting strategy is widely applicable by showing that automated analysis of three previously published TC data sets accurately recapitulates key findings reported in these studies. Using each of these data sets, we highlight how biologically relevant findings can be readily obtained by exploiting TDCA to yield intuitive parameters that describe behavior at either a single locus or sets of loci. TDCA enables customizable analysis of user input aligned DNA sequencing data, coupled with graphical outputs in the form of publication-ready figures that describe behavior at either individual loci or sets of loci sharing common traits defined by the user. TDCA accepts sequencing data as standard binary alignment map (BAM) files and loci of interest in browser extensible data (BED) file format. TDCA accurately models the number of sequencing reads, or coverage, at loci from TC ChIP-seq studies or conceptually related TC sequencing experiments. TC experiments are reduced to intuitive parametric values that facilitate biologically
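
    The core fitting step, reducing the read coverage at a locus over the time course to a handful of sigmoid parameters, can be sketched as follows. The logistic form, parameter names, and synthetic coverage values are illustrative choices and not necessarily the exact model TDCA uses.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(t, base, amplitude, t_half, rate):
    """Logistic time course: coverage rises from `base` by `amplitude`,
    reaching half of the total change at time `t_half`."""
    return base + amplitude / (1.0 + np.exp(-(t - t_half) / rate))

# Synthetic coverage at one locus across a time course (arbitrary units)
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
coverage = np.array([10.0, 11.0, 14.0, 25.0, 52.0, 68.0, 71.0])

params, _ = curve_fit(sigmoid, t, coverage, p0=[10.0, 60.0, 3.0, 1.0])
base, amplitude, t_half, rate = params
print(f"half-maximal change at t = {t_half:.2f}, rate constant = {rate:.2f}")
```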

  16. Online Operation Guidance of Computer System Used in Real-Time Distance Education Environment

    Science.gov (United States)

    He, Aiguo

    2011-01-01

    Computer systems are useful for improving real-time and interactive distance education activities, especially when a large number of students participate in one distance lecture together and every student uses their own computer to share teaching materials or control discussions in virtual classrooms. The problem is that within…

  17. 78 FR 38949 - Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response

    Science.gov (United States)

    2013-06-28

    ... exposed to various forms of cyber attack. In some cases, attacks can be thwarted through the use of...-3383-01] Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response... systems will be successfully attacked. When a successful attack occurs, the job of a Computer Security...

  18. Rapid on-site sensing aflatoxin B1 in food and feed via a chromatographic time-resolved fluoroimmunoassay.

    Directory of Open Access Journals (Sweden)

    Zhaowei Zhang

    Full Text Available Aflatoxin B1 poses grave threats to food and feed safety due to its strong carcinogenesis and toxicity, thus requiring ultrasensitive rapid on-site determination. Herein, a portable immunosensor based on chromatographic time-resolved fluoroimmunoassay was developed for sensitive and on-site determination of aflatoxin B1 in food and feed samples. Chromatographic time-resolved fluoroimmunoassay offered a magnified positive signal and low signal-to-noise ratio in time-resolved mode due to the absence of noise interference caused by excitation light sources. Compared with the immunosensing performance in previous studies, this platform demonstrated a wider dynamic range of 0.2-60 μg/kg, lower limit of detection from 0.06 to 0.12 µg/kg, and considerable recovery from 80.5% to 116.7% for different food and feed sample matrices. Little cross-reactivity was found with other aflatoxins (B2, G1, G2, and M1). In the case of determination of aflatoxin B1 in peanuts, corn, soy sauce, vegetable oil, and mouse feed, excellent agreement was found when compared with aflatoxin B1 determination via the conventional high-performance liquid chromatography method. The chromatographic time-resolved fluoroimmunoassay affords a powerful alternative for rapid on-site determination of aflatoxin B1 and holds promise for practical food safety and environmental monitoring.

  19. Rapid on-site sensing aflatoxin B1 in food and feed via a chromatographic time-resolved fluoroimmunoassay.

    Science.gov (United States)

    Zhang, Zhaowei; Tang, Xiaoqian; Wang, Du; Zhang, Qi; Li, Peiwu; Ding, Xiaoxia

    2015-01-01

    Aflatoxin B1 poses grave threats to food and feed safety due to its strong carcinogenesis and toxicity, thus requiring ultrasensitive rapid on-site determination. Herein, a portable immunosensor based on chromatographic time-resolved fluoroimmunoassay was developed for sensitive and on-site determination of aflatoxin B1 in food and feed samples. Chromatographic time-resolved fluoroimmunoassay offered a magnified positive signal and low signal-to-noise ratio in time-resolved mode due to the absence of noise interference caused by excitation light sources. Compared with the immunosensing performance in previous studies, this platform demonstrated a wider dynamic range of 0.2-60 μg/kg, lower limit of detection from 0.06 to 0.12 µg/kg, and considerable recovery from 80.5% to 116.7% for different food and feed sample matrices. Little cross-reactivity was found with other aflatoxins (B2, G1, G2, and M1). In the case of determination of aflatoxin B1 in peanuts, corn, soy sauce, vegetable oil, and mouse feed, excellent agreement was found when compared with aflatoxin B1 determination via the conventional high-performance liquid chromatography method. The chromatographic time-resolved fluoroimmunoassay affords a powerful alternative for rapid on-site determination of aflatoxin B1 and holds promise for practical food safety and environmental monitoring.

  20. Self-adaptive method to distinguish inner and outer contours of industrial computed tomography image for rapid prototype

    International Nuclear Information System (INIS)

    Duan Liming; Ye Yong; Zhang Xia; Zuo Jian

    2013-01-01

    A self-adaptive identification method is proposed for realizing more accurate and efficient judgment about the inner and outer contours of industrial computed tomography (CT) slice images. The convexity or concavity of each single-pixel-wide closed contour is first identified with the angle method. Contours with concave vertices are then classified as inner or outer contours with the ray method, while contours without concave vertices are classified with the extreme-coordinate-value method. Because the classification method is chosen automatically according to the convexity or concavity of each contour, the disadvantages of using any single method alone, such as the long run time of the ray method and the fallibility of the extreme-coordinate method, are avoided. The experiments prove the adaptability, efficiency, and accuracy of the self-adaptive method. (authors)
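
    The ray method referred to above can be sketched in a few lines: a contour is classified as an inner contour if a point on it lies inside another closed contour, which is decided by counting how many times a ray cast from the point crosses the other contour's edges. The implementation below is a generic even-odd crossing test, not the authors' code, and the example contours are made up.

```python
# Sketch of the ray method: count horizontal-ray crossings to decide whether a
# point on one contour lies inside another closed contour (even-odd rule).
def point_in_polygon(point, polygon):
    """Return True if `point` is inside the closed polygon (list of (x, y) vertices)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:                            # crossing lies to the right of the point
                inside = not inside
    return inside

outer = [(0, 0), (10, 0), (10, 10), (0, 10)]           # hypothetical outer contour
inner = [(3, 3), (7, 3), (7, 7), (3, 7)]               # hypothetical hole inside it
print(point_in_polygon(inner[0], outer))               # True  -> `inner` is an inner contour
print(point_in_polygon(outer[0], inner))               # False -> `outer` is an outer contour
```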

  1. Rapid Adjustment of Circadian Clocks to Simulated Travel to Time Zones across the Globe.

    Science.gov (United States)

    Harrison, Elizabeth M; Gorman, Michael R

    2015-12-01

    Daily rhythms in mammalian physiology and behavior are generated by a central pacemaker located in the hypothalamic suprachiasmatic nuclei (SCN), the timing of which is set by light from the environment. When the ambient light-dark cycle is shifted, as occurs with travel across time zones, the SCN and its output rhythms must reset or re-entrain their phases to match the new schedule, a sluggish process requiring about 1 day per hour of shift. Using a global assay of circadian resetting to 6 equidistant time-zone meridians, we document this characteristically slow and distance-dependent resetting of Syrian hamsters under typical laboratory lighting conditions, which mimic summer day lengths. The circadian pacemaker, however, is additionally entrainable with respect to its waveform (i.e., the shape of the 24-h oscillation) allowing for tracking of seasonally varying day lengths. We here demonstrate an unprecedented, light exposure-based acceleration in phase resetting following 2 manipulations of circadian waveform. Adaptation of circadian waveforms to long winter nights (8 h light, 16 h dark) doubled the shift response in the first 3 days after the shift. Moreover, a bifurcated waveform induced by exposure to a novel 24-h light-dark-light-dark cycle permitted nearly instant resetting to phase shifts from 4 to 12 h in magnitude, representing a 71% reduction in the mismatch between the activity rhythm and the new photocycle. Thus, a marked enhancement of phase shifting can be induced via nonpharmacological, noninvasive manipulation of the circadian pacemaker waveform in a model species for mammalian circadian rhythmicity. Given the evidence of conserved flexibility in the human pacemaker waveform, these findings raise the promise of flexible resetting applicable to circadian disruption in shift workers, frequent time-zone travelers, and any individual forced to adjust to challenging schedules. © 2015 The Author(s).

  2. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude

    Science.gov (United States)

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.

    2016-01-01

    We propose a simple approach to measure earthquake magnitude M using the time difference (Top) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2 log Top for earthquakes 5≤Mw≤7, which is the theoretical proportionality if Top is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and MTop (M estimated from Top) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from the lower-frequency data. Top depends only weakly on epicentral distance, and this dependence can be ignored for the distances of interest; applied to a great earthquake, the approach produces a final magnitude estimate of M 9.0 at 120 s after the origin time. We conclude that Top of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.
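
    The reported scaling can be turned into a back-of-the-envelope estimator, Mw ≈ 2 log10(Top) + C, where the calibration constant C must be fitted to data; the value used below is purely hypothetical and only meant to show the shape of the relation.

```python
# Worked illustration of the reported scaling Mw ~ 2*log10(Top) + C.
# The calibration constant C below is hypothetical; the study derives its own fit.
import math

def magnitude_from_top(top_seconds, c=5.5):
    """Estimate magnitude from the onset-to-peak time Top (seconds)."""
    return 2.0 * math.log10(top_seconds) + c

for top in (1.0, 3.0, 10.0, 30.0):
    print(f"Top = {top:5.1f} s  ->  M ~ {magnitude_from_top(top):.1f}")
```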

  3. A one-step, real-time PCR assay for rapid detection of rhinovirus.

    Science.gov (United States)

    Do, Duc H; Laus, Stella; Leber, Amy; Marcon, Mario J; Jordan, Jeanne A; Martin, Judith M; Wadowsky, Robert M

    2010-01-01

    One-step, real-time PCR assays for rhinovirus have been developed for a limited number of PCR amplification platforms and chemistries, and some exhibit cross-reactivity with genetically similar enteroviruses. We developed a one-step, real-time PCR assay for rhinovirus by using a sequence detection system (Applied Biosystems; Foster City, CA). The primers were designed to amplify a 120-base target in the noncoding region of picornavirus RNA, and a TaqMan (Applied Biosystems) degenerate probe was designed for the specific detection of rhinovirus amplicons. The PCR assay had no cross-reactivity with a panel of 76 nontarget nucleic acids, which included RNAs from 43 enterovirus strains. Excellent lower limits of detection relative to viral culture were observed for the PCR assay by using 38 of 40 rhinovirus reference strains representing different serotypes, which could reproducibly detect rhinovirus serotype 2 in viral transport medium containing 10 to 10,000 TCID(50) (50% tissue culture infectious dose endpoint) units/ml of the virus. However, for rhinovirus serotypes 59 and 69, the PCR assay was less sensitive than culture. Testing of 48 clinical specimens from children with cold-like illnesses for rhinovirus by the PCR and culture assays yielded detection rates of 16.7% and 6.3%, respectively. For a batch of 10 specimens, the entire assay was completed in 4.5 hours. This real-time PCR assay enables detection of many rhinovirus serotypes with the Applied Biosystems reagent-instrument platform.

  4. Standardization and application of real-time polymerase chain reaction for rapid detection of bluetongue virus

    Directory of Open Access Journals (Sweden)

    I. Karthika Lakshmi

    2018-04-01

    Full Text Available Aim: The present study was designed to standardize real-time polymerase chain reaction (PCR) for detecting the bluetongue virus from blood samples of sheep collected during outbreaks of bluetongue disease in the year 2014 in Andhra Pradesh and Telangana states of India. Materials and Methods: A 10-fold serial dilution of plasmid PUC59 with a bluetongue virus (BTV) NS3 insert was used to plot the standard curve. BHK-21 and KC cells were used for in vitro propagation of the BTV-9 virus to a titer of 10^5 TCID50/mL, and RNA was isolated by the Trizol method. Both reverse transcription-PCR (RT-PCR) and real-time PCR using a TaqMan probe were carried out with RNA extracted from virus-spiked culture medium and blood to compare their sensitivity by determining the limit of detection (LoD). The results were verified by inoculating the detected and undetected dilutions onto cell cultures with further cytological (cytopathic effect) and molecular (BTV NS1 group-specific PCR) confirmation. The standardized technique was then applied to field samples (blood) for detecting BTV. Results: The slope of the standard curve obtained was -3.23, and the efficiency was 103%. For the plasmid dilutions, the LoD was 8.269 × 10^3 copies with RT-PCR, whereas it was 13 copies with real-time PCR. Similarly, the LoD was determined for virus-spiked culture medium and blood with both types of PCR; the values were 10^3 and 10^4 TCID50/mL with RT-PCR and 10^0 and 10^2 TCID50/mL with real-time PCR, respectively. The standardized technique was applied to blood samples collected from BTV-suspected animals; 10 among 20 samples were found positive, with Cq values ranging from 27 to 39. The samples with detectable Cq values were further processed in cell cultures and were confirmed to be BTV positive. Likewise, samples with no detectable Cq turned out to be BTV negative when processed in cell cultures. Conclusion: Real-time PCR was found to be a very sensitive as well as reliable method for detecting BTV.
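
    The link between the standard-curve slope and the stated amplification efficiency follows from the usual relation E = 10^(-1/slope) - 1: a slope of about -3.32 corresponds to ~100% efficiency, and a slope near -3.23 to roughly 103-104%, consistent with the figure quoted above. The sketch below assumes NumPy, and the dilution series used to illustrate the slope fit is hypothetical.

```python
# Standard-curve efficiency check: E = 10**(-1/slope) - 1.
import numpy as np

def qpcr_efficiency(slope):
    return 10 ** (-1.0 / slope) - 1.0

print(f"{qpcr_efficiency(-3.23):.1%}")   # ~104%
print(f"{qpcr_efficiency(-3.32):.1%}")   # ~100%

# The slope itself comes from regressing Cq on log10(copy number), e.g.:
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])        # hypothetical dilution series
cq     = np.array([35.1, 31.8, 28.6, 25.3, 22.1, 18.8])  # hypothetical Cq values
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
print(f"slope = {slope:.2f}, efficiency = {qpcr_efficiency(slope):.1%}")
```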

  5. Fast Megavoltage Computed Tomography: A Rapid Imaging Method for Total Body or Marrow Irradiation in Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Magome, Taiki [Department of Radiological Sciences, Faculty of Health Sciences, Komazawa University, Tokyo (Japan); Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Haga, Akihiro [Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Takahashi, Yutaka [Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiation Oncology, Osaka University, Osaka (Japan); Nakagawa, Keiichi [Department of Radiology, The University of Tokyo Hospital, Tokyo (Japan); Dusenbery, Kathryn E. [Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Hui, Susanta K., E-mail: shui@coh.org [Masonic Cancer Center, University of Minnesota, Minneapolis, Minnesota (United States); Department of Therapeutic Radiology, University of Minnesota, Minneapolis, Minnesota (United States); Department of Radiation Oncology and Beckman Research Institute, City of Hope, Duarte, California (United States)

    2016-11-01

    Purpose: Megavoltage computed tomographic (MVCT) imaging has been widely used for the 3-dimensional (3-D) setup of patients treated with helical tomotherapy (HT). One drawback of MVCT is its very long imaging time, the result of slow couch speeds of approximately 1 mm/s, which can be difficult for the patient to tolerate. We sought to develop an MVCT imaging method allowing faster couch speeds and to assess its accuracy for image guidance for HT. Methods and Materials: Three cadavers were scanned 4 times with couch speeds of 1, 2, 3, and 4 mm/s. The resulting MVCT images were reconstructed using an iterative reconstruction (IR) algorithm with a penalty term of total variation and with a conventional filtered back projection (FBP) algorithm. The MVCT images were registered with kilovoltage CT images, and the registration errors from the 2 reconstruction algorithms were compared. This fast MVCT imaging was tested in 3 cases of total marrow irradiation as a clinical trial. Results: The 3-D registration errors of the MVCT images reconstructed with the IR algorithm were smaller than the errors of images reconstructed with the FBP algorithm at fast couch speeds (2, 3, 4 mm/s). The scan time and imaging dose at a speed of 4 mm/s were reduced to 30% of those from a conventional coarse mode scan. For the patient imaging, faster MVCT (3 mm/s couch speed) scanning reduced the imaging time and still generated images useful for anatomic registration. Conclusions: Fast MVCT with the IR algorithm is clinically feasible for large 3-D target localization, which may reduce the overall time for the treatment procedure. This technique may also be useful for calculating daily dose distributions or organ motion analyses in HT treatment over a wide area. Automated integration of this imaging is at least needed to further assess its clinical benefits.

  6. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time performance, reliability and safety for aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments was tested and verified. Based on the analysis of the test results, a preliminary conclusion was obtained: a cloud computing platform can be applied to compute-intensive aerospace experiment workloads, whereas for I/O-intensive workloads the use of traditional physical machines is recommended.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  8. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  9. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  10. Rapid detection of Van genes in rectal swabs by real time PCR in Southern Brazil

    Directory of Open Access Journals (Sweden)

    Vlademir Cantarelli

    2011-10-01

    Full Text Available INTRODUCTION: Laboratory-based surveillance is an important component in the control of vancomycin-resistant enterococci (VRE). METHODS: The study aimed to evaluate real-time polymerase chain reaction (RT-PCR; vanA and vanB genes) for VRE detection on 115 swabs from patients included in a surveillance program. RESULTS: The sensitivity of RT-PCR was similar to that of primary culture (75% and 79.5%, respectively) when compared to broth-enriched culture, whereas specificity was 83.1%. CONCLUSIONS: RT-PCR provides same-day results; however, it showed low sensitivity for VRE detection.

  11. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  13. Rapid data processing for ultrafast X-ray computed tomography using scalable and modular CUDA based pipelines

    Science.gov (United States)

    Frust, Tobias; Wagner, Michael; Stephan, Jan; Juckeland, Guido; Bieberle, André

    2017-10-01

    Ultrafast X-ray tomography is an advanced imaging technique for the study of dynamic processes based on the principles of electron beam scanning. A typical application case for this technique is e.g. the study of multiphase flows, that is, flows of mixtures of substances such as gas-liquid flows in pipelines or chemical reactors. At Helmholtz-Zentrum Dresden-Rossendorf (HZDR) a number of such tomography scanners are operated. Currently, there are two main points limiting their application in some fields. First, after each CT scan sequence the data of the radiation detector must be downloaded from the scanner to a data processing machine. Second, the current data processing is time-consuming compared to the CT scan sequence interval. To enable online observations or use this technique to control actuators in real time, a modular and scalable data processing tool has been developed, consisting of user-definable stages working independently together in a so-called data processing pipeline that keeps up with the CT scanner's maximal frame rate of up to 8 kHz. The newly developed data processing stages are freely programmable and combinable. In order to achieve the highest processing performance, all relevant data processing steps, which are required for a standard slice image reconstruction, were individually implemented in separate stages using Graphics Processing Units (GPUs) and NVIDIA's CUDA programming language. Data processing performance tests on different high-end GPUs (Tesla K20c, GeForce GTX 1080, Tesla P100) showed excellent performance. Program Files doi:http://dx.doi.org/10.17632/65sx747rvm.1 Licensing provisions: LGPLv3 Programming language: C++/CUDA Supplementary material: Test data set, used for the performance analysis. Nature of problem: Ultrafast computed tomography is performed with a scan rate of up to 8 kHz. To obtain cross-sectional images from projection data, computer-based image reconstruction algorithms must be applied. The
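
    The pipeline idea, independent user-definable stages connected by buffers, can be sketched in a few lines. The real tool is written in C++/CUDA; the Python sketch below only mimics the structure with threads and bounded queues, and the stage functions and frame contents are placeholders.

```python
# Conceptual sketch of a modular stage pipeline (the real tool is C++/CUDA).
# Stages run concurrently and pass frames downstream through bounded queues.
import threading
import queue

SENTINEL = None  # marks end of the frame stream

def stage(func, inbox, outbox):
    """Pull items from `inbox`, apply `func`, push results to `outbox`."""
    while True:
        item = inbox.get()
        if item is SENTINEL:
            outbox.put(SENTINEL)
            break
        outbox.put(func(item))

# Placeholder per-frame steps standing in for the real reconstruction chain.
def normalize(frame):
    return [x / 255.0 for x in frame]

def reconstruct(frame):
    return sum(frame) / len(frame)   # stands in for a slice-image reconstruction

q_raw, q_norm, q_out = (queue.Queue(maxsize=16) for _ in range(3))
threads = [
    threading.Thread(target=stage, args=(normalize, q_raw, q_norm)),
    threading.Thread(target=stage, args=(reconstruct, q_norm, q_out)),
]
for t in threads:
    t.start()

for i in range(5):                   # feed a few fake detector frames
    q_raw.put([i, 2 * i, 3 * i, 255])
q_raw.put(SENTINEL)

while (result := q_out.get()) is not SENTINEL:
    print("reconstructed value:", result)
for t in threads:
    t.join()
```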

  14. Rapid screening of fatty acid alkyl esters in olive oils by time domain reflectometry.

    Science.gov (United States)

    Berardinelli, Annachiara; Ragni, Luigi; Bendini, Alessandra; Valli, Enrico; Conte, Lanfranco; Guarnieri, Adriano; Toschi, Tullia Gallina

    2013-11-20

    The main aim of the present research is to assess the possibility of quickly screening fatty acid alkyl esters (FAAE) in olive oils using time domain reflectometry (TDR) and partial least-squares (PLS) multivariate statistical analysis. Eighteen virgin olive oil samples with fatty acid alkyl ester contents and fatty acid ethyl ester/methyl ester ratios (FAEE/FAME) ranging from 3 to 100 mg kg(-1) and from 0.3 to 2.6, respectively, were submitted to tests with a time domain resolution of 1 ps. The results obtained in test set validation demonstrated that this new and fast analytical approach is able to predict FAME, FAEE, and FAME + FAEE contents with R(2) values of 0.905, 0.923, and 0.927, respectively. Further measurements on mixtures between olive oil and FAAE standards confirmed that the prediction is based on a direct influence of fatty acid alkyl esters on the TDR signal. The suggested technique appeared potentially suitable for monitoring one of the most important quality attributes of olive oil during the extraction process.
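
    The calibration step, regressing FAAE content on the measured waveforms with PLS, can be sketched as follows. The snippet assumes scikit-learn and uses synthetic waveforms and concentrations; it is not the calibration model of the paper.

```python
# Sketch of a PLS calibration from waveforms to FAAE content, assuming
# scikit-learn; the "spectra" and reference concentrations are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_points = 60, 200
axis = np.linspace(0, 1, n_points)
concentration = rng.uniform(3, 100, n_samples)            # FAAE content, mg/kg (synthetic)
feature = np.exp(-((axis - 0.5) ** 2) / 0.005)            # waveform feature tied to FAAE
spectra = concentration[:, None] * feature + rng.normal(scale=2.0, size=(n_samples, n_points))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=3).fit(X_train, y_train)
print("test-set R^2:", round(pls.score(X_test, y_test), 3))
```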

  15. A rapid detection method for policy-sensitive amines real-time supervision.

    Science.gov (United States)

    Zhang, Haixu; Shu, Jinian; Yang, Bo; Zhang, Peng; Ma, Pengkun

    2018-02-01

    Many organic amines that comprise a benzene ring are policy-sensitive because of their toxicity and links to social harm. However, to date, detection of such compounds mainly relies on offline methods. This study proposes an online pptv-level (parts per trillion by volume) detection method for amines, using the recently built vacuum ultraviolet photoionization mass spectrometer (VUV-PIMS) combined with a new doping technique. Thus, the dichloromethane doping-assisted photoionization mass spectra of aniline, benzylamine, phenethylamine, amphetamine, and their structural isomers were recorded. The dominant characteristic mass peaks for all amines are those afforded by protonated amines and amino-radical loss. The signal intensities of the amines were enhanced by 60-130 times compared to those recorded without doping assistance. With a 10 s detection time, the sensitivities of aniline and benzylamine in the gas phase were determined to be 4.0 and 2.7 counts pptv-1, with limits of detection (LODs) of 36 and 22 pptv, respectively. Notably, the detection efficiency of this method can be tenfold better in future applications, since the ion transmission efficiency of the mass spectrometer was intentionally reduced to ~10% in this study. Therefore, dichloromethane doping-assisted photoionization mass spectrometry has proven to be a highly promising on-line approach to amine detection in environmental and judicial supervision and shows great potential for application in the biological field. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes directly from two discrete time series auto- and cross-spectra, transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith, and perhaps with minor modifications, with any other hardware system. (author)
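
    A present-day equivalent of the quantities CROSAT computes can be obtained with SciPy's FFT-based estimators, as sketched below; the two synthetic time series and the filter linking them are illustrative only.

```python
# Modern equivalent of the CROSAT quantities with SciPy's FFT-based estimators:
# auto-spectra, cross-spectrum, transfer function and coherence of two series.
import numpy as np
from scipy import signal

fs = 100.0                                        # sampling frequency (Hz)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
x = rng.normal(size=t.size)                       # first discrete time series
b, a = signal.butter(4, 0.2)                      # link the second series to the first
y = signal.lfilter(b, a, x) + 0.1 * rng.normal(size=t.size)

f, pxx = signal.welch(x, fs=fs, nperseg=512)      # auto-spectrum of x
_, pyy = signal.welch(y, fs=fs, nperseg=512)      # auto-spectrum of y
_, pxy = signal.csd(x, y, fs=fs, nperseg=512)     # cross-spectrum
_, coh = signal.coherence(x, y, fs=fs, nperseg=512)
transfer = pxy / pxx                              # H1 transfer-function estimate

print("peak coherence:", round(float(coh.max()), 3))
```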

  17. Direct Analysis in Real-time Mass Spectrometry for Rapid Identification of Traditional Chinese Medicines with Coumarins as Primary Characteristics.

    Science.gov (United States)

    Chen, Zhiyong; Yang, Yuanyuan; Tao, Hongxun; Liao, Liping; Li, Ye; Zhang, Zijia

    2017-05-01

    The increasing popularity of traditional Chinese medicines (TCMs) necessitates rapid and reliable methods for controlling their quality. Direct analysis in real-time mass spectrometry (DART-MS) represents a novel approach to analysing TCMs. To develop a quick and reliable method of identifying TCMs with coumarins as primary characteristics. DART-MS coupled with ion trap mass spectrometry was employed to rapidly identify TCMs with coumarins as primary characteristics and to explore the ionisation mechanisms of simple coumarins, furocoumarins and pyranocoumarins in detail. With minimal sample pretreatment, mass spectra of Fraxini Cortex, Angelicae Pubescentis Radix, Peucedani Radix and Psoraleae Fructus samples were obtained within seconds. The operating parameters of the DART ion source (e.g. grid electrode voltage and ionisation gas temperature) were carefully investigated to obtain high-quality mass spectra. The mass spectra of samples and DART-MS/MS spectra of marker compounds were used to identify sample materials. Successful authentication was achieved by analysing the same materials of different origins. Some simple coumarins, furocoumarins and pyranocoumarins can be directly detected by DART-MS as marker compounds. Our results demonstrated that DART-MS can provide a rapid and reliable method for the identification of TCMs containing different configurations of coumarins; the method may also be applicable to other plants. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and the non-adaptive genetic algorithm in terms of solution quality.
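
    A much-simplified sketch of the underlying idea, a genetic algorithm that allocates tasks to virtual machines and adapts its mutation rate when the search stagnates, is given below. It omits precedence constraints and the adaptive choice among several crossover operators described in the paper, and all task lengths and VM speeds are made up.

```python
# Simplified GA for task-to-VM allocation with an adaptive mutation rate.
# Not the paper's AGA: no precedence constraints, single crossover operator.
import random

random.seed(42)
N_TASKS, N_VMS = 20, 4
task_len = [random.randint(5, 50) for _ in range(N_TASKS)]   # hypothetical task lengths
vm_speed = [1.0, 1.5, 2.0, 2.5]                              # hypothetical VM speeds

def makespan(chrom):
    """A chromosome maps each task index to a VM; fitness is the latest VM finish time."""
    finish = [0.0] * N_VMS
    for task, vm in enumerate(chrom):
        finish[vm] += task_len[task] / vm_speed[vm]
    return max(finish)

def crossover(a, b):
    cut = random.randrange(1, N_TASKS)
    return a[:cut] + b[cut:]

def mutate(chrom, rate):
    return [random.randrange(N_VMS) if random.random() < rate else gene for gene in chrom]

pop = [[random.randrange(N_VMS) for _ in range(N_TASKS)] for _ in range(40)]
mutation_rate = 0.10
best_so_far = min(makespan(c) for c in pop)

for generation in range(200):
    pop.sort(key=makespan)
    elite = pop[:10]                                         # truncation selection
    children = [mutate(crossover(random.choice(elite), random.choice(elite)), mutation_rate)
                for _ in range(30)]
    pop = elite + children
    best = min(makespan(c) for c in pop)
    # Adaptive step: raise the mutation rate when the search stagnates, lower it otherwise.
    if best >= best_so_far:
        mutation_rate = min(0.5, mutation_rate * 1.2)
    else:
        mutation_rate = max(0.02, mutation_rate * 0.8)
        best_so_far = best

print("best makespan found:", round(best_so_far, 2))
```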

  19. RAMSES - Rapid Measurement and Special Environment time-of-flight Spectrometer

    International Nuclear Information System (INIS)

    Schober, H.; Koza, M.; Mutka, H.; Zbiri, M.; Andersen, K.

    2011-01-01

    Time-of-flight spectrometers are ideally suited to study the dynamics of complex materials as encountered in all domains of current scientific interest, ranging from health care, biology, earth and environmental sciences, and cultural heritage to energy storage and preservation. Complex materials are often available only in small amounts, or the scientific questions to be studied require environments that limit the sample size (e.g., Paris-Edinburgh cells and levitation furnaces). The proposed instrument would be optimized for these conditions, offering a very high neutron flux over a small beam cross-section in combination with good resolution and an extended dynamical range. The latter calls for a wavelength band extending slightly into the thermal region. This is achieved on a cold guide with super-mirror coating. (authors)

  20. Real-time PCR-based method for rapid detection of Aspergillus niger and Aspergillus welwitschiae isolated from coffee.

    Science.gov (United States)

    von Hertwig, Aline Morgan; Sant'Ana, Anderson S; Sartori, Daniele; da Silva, Josué José; Nascimento, Maristela S; Iamanaka, Beatriz Thie; Pelegrinelli Fungaro, Maria Helena; Taniwaki, Marta Hiromi

    2018-05-01

    Some species from Aspergillus section Nigri are morphologically very similar and together have been called the A. niger aggregate. Although the species included in this group are morphologically very similar, they differ in their ability to produce mycotoxins and other metabolites, and their taxonomic status has evolved continuously. Among them, A. niger and A. welwitschiae are ochratoxin A and fumonisin B2 producers, and their detection and/or identification is of crucial importance for food safety. The aim of this study was the development of a real-time PCR-based method for simultaneous discrimination of A. niger and A. welwitschiae from other species of the A. niger aggregate isolated from coffee beans. One primer pair and a hybridization probe specific for detection of A. niger and A. welwitschiae strains were designed based on the BenA gene sequences and used in a real-time PCR assay for the rapid discrimination of both these species from all others of the A. niger aggregate. The real-time PCR assay was shown to be 100% efficient in discriminating the 73 isolates of A. niger/A. welwitschiae from the other A. niger aggregate species analyzed as a negative control. This result supports the use of this technique as a good tool for the rapid detection of these important toxigenic species. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Rapid detection and typing of pathogenic nonpneumophila Legionella spp. isolates using a multiplex real-time PCR assay.

    Science.gov (United States)

    Benitez, Alvaro J; Winchell, Jonas M

    2016-04-01

    We developed a single tube multiplex real-time PCR assay that allows for the rapid detection and typing of 9 nonpneumophila Legionella spp. isolates that are clinically relevant. The multiplex assay is capable of simultaneously detecting and discriminating L. micdadei, L. bozemanii, L. dumoffii, L. longbeachae, L. feeleii, L. anisa, L. parisiensis, L. tucsonensis serogroup (sg) 1 and 3, and L. sainthelensis sg 1 and 2 isolates. Evaluation of the assay with nucleic acid from each of these species derived from both clinical and environmental isolates and typing strains demonstrated 100% sensitivity and 100% specificity when tested against 43 other Legionella spp. Typing of L. anisa, L. parisiensis, and L. tucsonensis sg 1 and 3 isolates was accomplished by developing a real-time PCR assay followed by high-resolution melt (HRM) analysis targeting the ssrA gene. Further typing of L. bozemanii, L. longbeachae, and L. feeleii isolates to the serogroup level was accomplished by developing a real-time PCR assay followed by HRM analysis targeting the mip gene. When used in conjunction with other currently available diagnostic tests, these assays may aid in rapidly identifying specific etiologies associated with Legionella outbreaks, clusters, sporadic cases, and potential environmental sources. Published by Elsevier Inc.

  2. Time-resolved temperature measurements in a rapid compression machine using quantum cascade laser absorption in the intrapulse mode

    KAUST Repository

    Nasir, Ehson Fawad

    2016-07-16

    A temperature sensor based on the intrapulse absorption spectroscopy technique has been developed to measure in situ temperature time-histories in a rapid compression machine (RCM). Two quantum-cascade lasers (QCLs) emitting near 4.55 μm and 4.89 μm were operated in pulsed mode, causing a frequency "down-chirp" across two ro-vibrational transitions of carbon monoxide. The down-chirp phenomenon resulted in large spectral tuning (δν ∼2.8 cm-1) within a single pulse of each laser at a high pulse repetition frequency (100 kHz). The wide tuning range allowed the application of the two-line thermometry technique, thus making the sensor quantitative and calibration-free. The sensor was first tested in non-reactive CO-N2 gas mixtures in the RCM and then applied to cases of n-pentane oxidation. Experiments were carried out for end-of-compression (EOC) pressures and temperatures ranging over 9.21-15.32 bar and 745-827 K, respectively. Measured EOC temperatures agreed with isentropic calculations within 5%. The temperature rise measured during the first-stage ignition of n-pentane is over-predicted by zero-dimensional kinetic simulations. This work presents, for the first time, highly time-resolved temperature measurements in reactive and non-reactive rapid compression machine experiments. © 2016 Elsevier Ltd.
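
    Two-line thermometry infers temperature from the ratio of absorbances on two transitions with different lower-state energies. The sketch below inverts the standard Boltzmann ratio expression; the lower-state energies, reference line-strength ratio and measured ratio are made-up numbers, not the CO line parameters used in this work.

```python
# Two-line thermometry: the ratio of absorbances on two transitions with
# different lower-state energies is a single-valued function of temperature.
# The energies, reference ratio and measured ratio below are made-up numbers.
import math

C2 = 1.4388          # second radiation constant, cm*K
T0 = 296.0           # reference temperature, K

def temperature_from_ratio(R, R0, E1, E2):
    """Invert R(T) = R0 * exp(-C2*(E1 - E2)*(1/T - 1/T0)) for T.

    R      : measured absorbance ratio (line 1 / line 2)
    R0     : line-strength ratio at the reference temperature T0
    E1, E2 : lower-state energies of the two lines (cm^-1)
    """
    inv_T = 1.0 / T0 + (math.log(R0) - math.log(R)) / (C2 * (E1 - E2))
    return 1.0 / inv_T

# Prints roughly 800 K for these illustrative values.
print(round(temperature_from_ratio(R=0.127, R0=0.8, E1=100.0, E2=700.0), 1))
```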

  3. Real-time in vivo diagnosis of laryngeal carcinoma with rapid fiber-optic Raman spectroscopy

    Science.gov (United States)

    Lin, Kan; Zheng, Wei; Lim, Chwee Ming; Huang, Zhiwei

    2016-01-01

    We assess the clinical utility of a unique simultaneous fingerprint (FP) (i.e., 800-1800 cm−1) and high-wavenumber (HW) (i.e., 2800-3600 cm−1) fiber-optic Raman spectroscopy for in vivo diagnosis of laryngeal cancer at endoscopy. A total of 2124 high-quality in vivo FP/HW Raman spectra (normal = 1321; cancer = 581) were acquired from 101 tissue sites (normal = 71; cancer = 30) of 60 patients (normal = 44; cancer = 16) undergoing routine endoscopic examination. FP/HW Raman spectra differ significantly between normal and cancerous laryngeal tissue, which could be attributed to changes in proteins, lipids, nucleic acids, and the bound water content in the larynx. Partial least squares-discriminant analysis and leave-one-tissue-site-out cross-validation were employed on the in vivo FP/HW tissue Raman spectra acquired, yielding a diagnostic accuracy of 91.1% (sensitivity: 93.3% (28/30); specificity: 90.1% (64/71)) for laryngeal cancer identification, which is superior to using either the FP (accuracy: 86.1%; sensitivity: 86.7% (26/30); specificity: 85.9% (61/71)) or the HW (accuracy: 84.2%; sensitivity: 76.7% (23/30); specificity: 87.3% (62/71)) Raman technique alone. Further receiver operating characteristic analysis reconfirms the best performance of the simultaneous FP/HW Raman technique for laryngeal cancer diagnosis. We demonstrate for the first time that the simultaneous FP/HW Raman spectroscopy technique can be used for improving real-time in vivo diagnosis of laryngeal carcinoma during endoscopic examination. PMID:27699131
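
    The headline figures follow directly from the quoted site counts, as the short calculation below shows (28 of 30 cancer sites and 64 of 71 normal sites correctly classified).

```python
# Sensitivity, specificity and accuracy from the site counts quoted above.
tp, fn = 28, 2      # cancer sites: correctly / incorrectly classified
tn, fp = 64, 7      # normal sites: correctly / incorrectly classified

sensitivity = tp / (tp + fn)                    # 28/30  = 93.3%
specificity = tn / (tn + fp)                    # 64/71  = 90.1%
accuracy = (tp + tn) / (tp + fn + tn + fp)      # 92/101 = 91.1%
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, accuracy={accuracy:.1%}")
```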

  4. Development and validation of real-time PCR for rapid detection of Mecistocirrus digitatus.

    Directory of Open Access Journals (Sweden)

    Subhra Subhadra

    Full Text Available Hematophagous activity of Mecistocirrus digitatus, which causes substantial blood and weight loss in large ruminants, is an emerging challenge due to the economic loss it brings to the livestock industry. Infected animals are treated with anthelmintic drugs, based on the identification of helminth species and the severity of infection; however, traditional methods such as microscopic identification and the counting of eggs for diagnosis and determination of the level of infection are laborious, cumbersome and unreliable. To facilitate the detection of this parasite, a SYBR green-based real-time PCR was standardized and validated for the detection of M. digitatus infection in cattle and buffaloes. Oligonucleotides were designed to amplify a partial Internal Transcribed Spacer (ITS-1) sequence of M. digitatus. The specificity of the primers was confirmed by non-amplification of DNA extracted from other commonly occurring gastrointestinal nematodes in ruminants. Plasmids were ligated with the partial ITS-1 sequence of M. digitatus, serially diluted (hundred-fold) and used as standards in the real-time PCR assay. The quantification cycle (Cq) values were plotted against the standard DNA concentration to produce a standard curve. The assay was sensitive enough to detect one plasmid containing the M. digitatus DNA. Clinical application of this assay was validated by testing the DNA extracted from the faeces of naturally infected cattle (n = 40) and buffaloes (n = 25). The results were compared with our standard curve to calculate the quantity of M. digitatus in each faecal sample. The Cq value of the assay depicted a strong linear relationship with faecal DNA content, with a regression coefficient of 0.984 and an efficiency of 99%. This assay has noteworthy advantages over the conventional methods of diagnosis because it is more specific, sensitive and reliable.

  5. Rapid qualitative urinary tract infection pathogen identification by SeptiFast real-time PCR.

    Directory of Open Access Journals (Sweden)

    Lutz E Lehmann

    2011-02-01

    Full Text Available Urinary tract infections (UTI) are frequent in outpatients. Fast pathogen identification is mandatory for shortening the time of discomfort and preventing serious complications. Urine culture needs up to 48 hours until pathogen identification. Consequently, the initial antibiotic regimen is empirical. To evaluate the feasibility of qualitative urine pathogen identification by a commercially available real-time PCR blood pathogen test (SeptiFast®) and to compare the results with dipslide and microbiological culture. Pilot study with prospectively collected urine samples. University hospital. 82 prospectively collected urine samples from 81 patients with suspected UTI were included. Dipslide urine culture was followed by microbiological pathogen identification in dipslide-positive samples. In parallel, qualitative DNA-based pathogen identification (SeptiFast®) was performed in all samples. 61 samples were SeptiFast® positive, whereas 67 samples were dipslide culture positive. The inter-methodological concordance of positive and negative findings in the gram+, gram- and fungi sectors was 371/410 (90%), 477/492 (97%) and 238/246 (97%), respectively. Sensitivity and specificity of the SeptiFast® test for the detection of an infection were 0.82 and 0.60, respectively. SeptiFast® pathogen identifications were available at least 43 hours prior to culture results. The SeptiFast® platform identified bacterial DNA in urine specimens considerably faster compared to conventional culture. For UTI diagnosis, sensitivity and specificity are limited by its present qualitative setup, which does not allow pathogen quantification. Future quantitative assays may hold promise for PCR-based UTI pathogen identification as a supplement to conventional culture methods.

  6. Rapid detection of Lactobacillus kefiranofaciens in kefir grain and kefir milk using newly developed real-time PCR.

    Science.gov (United States)

    Kim, Dong-Hyeon; Chon, Jung-Whan; Kim, Hong-Seok; Yim, Jin-Hyeok; Kim, Hyunsook; Seo, Kun-Ho

    2015-04-01

    Lactobacillus kefiranofaciens is an indicator microorganism for kefir and a key factor in kefir grain formation and kefiran production. We designed a novel real-time PCR primer and probe set, LKF_KU504, for the rapid detection of L. kefiranofaciens. In inclusivity and exclusivity tests, only 14 L. kefiranofaciens strains were positive among 61 microorganisms, indicating 100 % sensitivity and specificity. The LKF_KU504 set also differentiated kefir milk from 30 commercial nonkefir yogurts. The levels of L. kefiranofaciens in kefir grain and kefir milk were significantly different, indicating L. kefiranofaciens was more concentrated in kefir grain than in kefir milk.

  7. Usefulness of Computed Tomography in pre-surgical evaluation of maxillo-facial pathology with rapid prototyping and surgical pre-planning by virtual reality

    International Nuclear Information System (INIS)

    Toso, Francesco; Zuiani, Chiara; Vergendo, Maurizio; Bazzocchi, Massimo; Salvo, Iolanda; Robiony, Massimo; Politi, Massimo

    2005-01-01

    Purpose. To validate a protocol for creating virtual models to be used in the construction of solid prototypes useful for the planning and simulation of maxillo-facial surgery, in particular for very complex anatomical and pathologic problems, and to optimize communications between the radiology, engineering and surgical laboratories. Methods and materials. We studied 16 patients with different clinical problems of the maxillo-facial district. Exams were performed with multidetector computed tomography (MDCT) and single slice computed tomography (SDCT) with axial scans, collimation of 0.5-2 mm, and a reconstruction interval of 1 mm. Subsequently we performed 2D multiplanar reconstructions and 3D volume-rendering reconstructions. We exported the DICOM images to the engineering laboratory, where the bony structures were recognized and isolated by software. With these data the solid prototypes were generated using stereolithography. To date, surgery has been performed on 12 patients after simulation of the procedure on the stereolithography model. Results. The solid prototypes constructed in the difficult cases were sufficiently detailed despite problems related to the artefacts generated by dental fillings and prostheses. In the remaining cases the MPR/3D images were sufficiently detailed for surgical planning. The surgical results were excellent in all patients who underwent surgery, and the surgeons were satisfied with the improvement in quality and the reduction in time required for the procedure. Conclusions. MDCT enables rapid prototyping using solid replication, which was very helpful in maxillofacial surgery, despite problems related to artefacts due to dental fillings and prostheses within the acquisition field; solutions for this problem are work in progress. The protocol used for communication between the different laboratories was valid and reproducible.

  8. Rapid increases and time-lagged declines in amphibian occupancy after wildfire.

    Science.gov (United States)

    Hossack, Blake R; Lowe, Winsor H; Corn, Paul Stephen

    2013-02-01

    Climate change is expected to increase the frequency and severity of drought and wildfire. Aquatic and moisture-sensitive species, such as amphibians, may be particularly vulnerable to these modified disturbance regimes because large wildfires often occur during extended droughts and thus may compound environmental threats. However, understanding of the effects of wildfires on amphibians in forests with long fire-return intervals is limited. Numerous stand-replacing wildfires have occurred since 1988 in Glacier National Park (Montana, U.S.A.), where we have conducted long-term monitoring of amphibians. We measured responses of 3 amphibian species to fires of different sizes, severity, and age in a small geographic area with uniform management. We used data from wetlands associated with 6 wildfires that burned between 1988 and 2003 to evaluate whether burn extent and severity and interactions between wildfire and wetland isolation affected the distribution of breeding populations. We measured responses with models that accounted for imperfect detection to estimate occupancy during prefire (0-4 years) and different postfire recovery periods. For the long-toed salamander (Ambystoma macrodactylum) and Columbia spotted frog (Rana luteiventris), occupancy was not affected for 6 years after wildfire. But 7-21 years after wildfire, occupancy for both species decreased ≥ 25% in areas where >50% of the forest within 500 m of wetlands burned. In contrast, occupancy of the boreal toad (Anaxyrus boreas) tripled in the 3 years after low-elevation forests burned. This increase in occupancy was followed by a gradual decline. Our results show that accounting for magnitude of change and time lags is critical to understanding population dynamics of amphibians after large disturbances. Our results also inform understanding of the potential threat of increases in wildfire frequency or severity to amphibians in the region. ©2012 Society for Conservation Biology.

  9. Time expenditure in computer aided time studies implemented for highly mechanized forest equipment

    Directory of Open Access Journals (Sweden)

    Elena Camelia Mușat

    2016-06-01

    Full Text Available Time studies represent important tools that are used in forest operations research to produce empirical models or to comparatively assess the performance of two or more operational alternatives, with the general aim of predicting operational performance, choosing the most adequate equipment, or eliminating useless time. There is a long tradition of collecting the needed data manually, but this approach has its limitations, and the use of professional software in such studies is likely to become more widespread, as such tools have already been implemented. However, little to no information is available on the performance of data-analysis tasks when using purpose-built professional time-study software, while the resources needed to conduct time studies, including time itself, may be considerable. Our study aimed to model the relations between the variation of time needed to analyze the video-recorded time study data and the variation of some measured independent variables for a complex organization of a work cycle. The results of our study indicate that the number of work elements separated within a work cycle, the delay-free cycle time, and the software functionalities used during data analysis significantly affected the time expenditure needed to analyze the data (α=0.01, p<0.01). Under the conditions of this study, where the average duration of a work cycle was about 48 seconds and the number of separated work elements was about 14, the speed used to replay the video files significantly affected the mean time expenditure, which averaged about 273 seconds at half of the real speed and about 192 seconds at a replay speed equal to the real speed. We argue that different study designs as well as the parameters used within the software are likely to produce

  10. Rapid and sensitive detection of Yersinia pestis using amplification of plague diagnostic bacteriophages monitored by real-time PCR.

    Directory of Open Access Journals (Sweden)

    Kirill V Sergueev

    Full Text Available BACKGROUND: Yersinia pestis, the agent of plague, has caused many millions of human deaths and still poses a serious threat to global public health. Timely and reliable detection of such a dangerous pathogen is of critical importance. Lysis by specific bacteriophages remains an essential method of Y. pestis detection and plague diagnostics. METHODOLOGY/PRINCIPAL FINDINGS: The objective of this work was to develop an alternative to conventional phage lysis tests: a rapid and highly sensitive method of indirect detection of live Y. pestis cells based on quantitative real-time PCR (qPCR) monitoring of the amplification of reporter Y. pestis-specific bacteriophages. Plague diagnostic phages phiA1122 and L-413C were shown to be highly effective diagnostic tools for the detection and identification of Y. pestis by using qPCR with primers specific for phage DNA. The template DNA extraction step that usually precedes qPCR was omitted. phiA1122-specific qPCR enabled the detection of an initial bacterial concentration of 10^3 CFU/ml (equivalent to as few as one Y. pestis cell per 1-microl sample) in four hours. L-413C-mediated detection of Y. pestis was less sensitive (up to 100 bacteria per sample) but more specific, and thus we propose parallel qPCR for the two phages as a rapid and reliable method of Y. pestis identification. Importantly, phiA1122 propagated in simulated clinical blood specimens containing EDTA, and its titer rise was detected by both a standard plating test and qPCR. CONCLUSIONS/SIGNIFICANCE: Thus, we developed a novel assay for detection and identification of Y. pestis using amplification of specific phages monitored by qPCR. The method is simple, rapid, highly sensitive, and specific and allows the detection of only live bacteria.

  11. Rapid sympatric ecological differentiation of crater lake cichlid fishes within historic times

    Directory of Open Access Journals (Sweden)

    Harrod Chris

    2010-05-01

    Full Text Available Abstract Background After a volcano erupts, a lake may form in the cooled crater and become an isolated aquatic ecosystem. This makes fishes in crater lakes informative for understanding sympatric evolution and ecological diversification in barren environments. From a geological and limnological perspective, such research offers insight about the process of crater lake ecosystem establishment and speciation. In the present study we use genetic and coalescence approaches to infer the colonization history of Midas cichlid fishes (Amphilophus cf. citrinellus) that inhabit a very young crater lake in Nicaragua, the ca. 1800-year-old Lake Apoyeque. This lake holds two sympatric, endemic morphs of Midas cichlid: one with large, hypertrophied lips (~20% of the total population) and another with thin lips. Here we test the associated ecological, morphological and genetic diversification of these two morphs and their potential to represent incipient speciation. Results Gene coalescence analyses [11 microsatellite loci and mitochondrial DNA (mtDNA) sequences] suggest that crater lake Apoyeque was colonized in a single event from the large neighbouring great lake Managua only about 100 years ago. This founding in historic times is also reflected in the extremely low nuclear and mitochondrial genetic diversity in Apoyeque. We found that sympatric adult thin- and thick-lipped fishes occupy distinct ecological trophic niches. Diet, body shape, head width, pharyngeal jaw size and shape and stable isotope values all differ significantly between the two lip-morphs. The eco-morphological features pharyngeal jaw shape, body shape, stomach contents and stable isotopes (δ15N) all show a bimodal distribution of traits, which is compatible with the expectations of an initial stage of ecological speciation under disruptive selection. Genetic differentiation between the thin- and thick-lipped population is weak at mtDNA sequence (FST = 0.018) and absent at nuclear

  12. VNAP2: a computer program for computation of two-dimensional, time-dependent, compressible, turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Cline, M.C.

    1981-08-01

    VNAP2 is a computer program for calculating turbulent (as well as laminar and inviscid), steady, and unsteady flow. VNAP2 solves the two-dimensional, time-dependent, compressible Navier-Stokes equations. The turbulence is modeled with either an algebraic mixing-length model, a one-equation model, or the Jones-Launder two-equation model. The geometry may be a single- or a dual-flowing stream. The interior grid points are computed using the unsplit MacCormack scheme. Two options to speed up the calculations for high Reynolds number flows are included. The boundary grid points are computed using a reference-plane-characteristic scheme with the viscous terms treated as source functions. An explicit artificial viscosity is included for shock computations. The fluid is assumed to be a perfect gas. The flow boundaries may be arbitrary curved solid walls, inflow/outflow boundaries, or free-jet envelopes. Typical problems that can be solved concern nozzles, inlets, jet-powered afterbodies, airfoils, and free-jet expansions. The accuracy and efficiency of the program are shown by calculations of several inviscid and turbulent flows. The program and its use are described completely, and six sample cases and a code listing are included.
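
    The unsplit MacCormack predictor-corrector used for the interior points can be illustrated on a much simpler problem, 1-D linear advection with periodic boundaries; the sketch below is only meant to show the predictor and corrector sweeps, not to reproduce the 2-D Navier-Stokes solver.

```python
# Illustration of the unsplit MacCormack predictor-corrector scheme on 1-D
# linear advection u_t + a*u_x = 0 with periodic boundaries. This only shows
# the two sweeps of the scheme; VNAP2 itself solves the 2-D Navier-Stokes
# equations with turbulence models and general boundaries.
import numpy as np

nx, a, cfl = 200, 1.0, 0.8
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = cfl * dx / abs(a)
u = np.exp(-200.0 * (x - 0.3) ** 2)             # initial Gaussian pulse

def flux(u):
    return a * u

for _ in range(int(round(0.5 / dt))):           # advance to t ~ 0.5
    f = flux(u)
    # Predictor: forward difference of the flux
    u_star = u - dt / dx * (np.roll(f, -1) - f)
    f_star = flux(u_star)
    # Corrector: backward difference of the predicted flux, then average
    u = 0.5 * (u + u_star - dt / dx * (f_star - np.roll(f_star, 1)))

print("pulse peak:", round(float(u.max()), 3), "near x =", round(float(x[u.argmax()]), 3))
```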

  13. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  14. TaqMan MGB probe fluorescence real-time quantitative PCR for rapid detection of Chinese Sacbrood virus.

    Directory of Open Access Journals (Sweden)

    Ma Mingxiao

    Full Text Available Sacbrood virus (SBV) is a picorna-like virus that affects honey bees (Apis mellifera) and results in the death of the larvae. Several procedures are available to detect Chinese SBV (CSBV) in clinical samples, but not to estimate the level of CSBV infection. The aim of this study was to develop an assay for rapid detection and quantification of this virus. Primers and probes were designed that were specific for CSBV structural protein genes. A TaqMan minor groove binder (MGB) probe-based fluorescence real-time quantitative PCR was established. The specificity, sensitivity and stability of the assay were assessed; specificity was high and there was no cross-reactivity with healthy larvae or other bee viruses. The assay was applied to detect CSBV in 37 clinical samples and its efficiency was compared with clinical diagnosis, electron microscopy observation, and conventional RT-PCR. The TaqMan MGB probe-based fluorescence real-time quantitative PCR for CSBV was more sensitive than the other methods tested. This assay was a reliable, fast, and sensitive method that was used successfully to detect CSBV in clinical samples. The technology can provide a useful tool for rapid detection of CSBV. This study has established a useful protocol for CSBV testing, epidemiological investigation, and development of animal models.

  15. Rapid determination of ginkgolic acids in Ginkgo biloba kernels and leaves by direct analysis in real time-mass spectrometry.

    Science.gov (United States)

    Huang, Zhongping; Xu, Yueting; Huang, Yilei; Liu, Charles; Jiang, Kezhi; Wang, Lili

    2017-12-01

    A novel method based on direct analysis in real time integrated with mass spectrometry was established and applied to the rapid determination of ginkgolic acids in Ginkgo biloba kernels and leaves. Instrument parameter settings were optimized to obtain sensitive and accurate determination of ginkgolic acids. At a sample introduction speed of 0.2 mm/s, high-intensity [M-H]- ions for ginkgolic acids were observed in the negative ion mode using high-purity helium gas at 450°C. Two microliters of methanol extract of G. biloba kernels or leaves was dropped on the surface of the Quick-Strip module and analyzed after the solvent had evaporated to dryness. A series of standard solutions of ginkgolic acid 13:0 in the range of 2-50 mg/L was analyzed, giving a correlation coefficient r = 0.9981 and relative standard deviations (n = 5) of 12.5-13.7%. The limit of detection was 0.5 mg/L. The results of direct analysis in real time-mass spectrometry were in agreement with those obtained by thermochemolysis gas chromatography. The proposed method demonstrated significant potential for the high-throughput screening and rapid analysis of ginkgolic acids in dietary supplements. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
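    The quantification above rests on an external calibration curve (ginkgolic acid 13:0, 2-50 mg/L, r = 0.9981). The following minimal sketch shows how such a curve could be fitted and applied; the intensity values are hypothetical placeholders, not data from the study.

```python
import numpy as np

# Hypothetical calibration data for ginkgolic acid 13:0 (concentration in mg/L
# vs. integrated [M-H]- ion intensity); the numbers are illustrative only.
conc = np.array([2.0, 5.0, 10.0, 25.0, 50.0])
intensity = np.array([1.1e4, 2.7e4, 5.3e4, 1.32e5, 2.61e5])

# Least-squares linear fit: intensity = slope * concentration + intercept
slope, intercept = np.polyfit(conc, intensity, 1)
r = np.corrcoef(conc, intensity)[0, 1]
print(f"slope={slope:.1f}, intercept={intercept:.1f}, r={r:.4f}")

# Quantify an unknown sample from its measured intensity
unknown_intensity = 7.8e4
unknown_conc = (unknown_intensity - intercept) / slope
print(f"estimated concentration: {unknown_conc:.1f} mg/L")
```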

  16. Terahertz time-domain attenuated total reflection spectroscopy applied to the rapid discrimination of the botanical origin of honeys

    Science.gov (United States)

    Liu, Wen; Zhang, Yuying; Yang, Si; Han, Donghai

    2018-05-01

    A new technique to identify the botanical origin of honeys is needed. Terahertz time-domain attenuated total reflection spectroscopy combined with chemometric methods was applied to discriminate three categories of honey (Medlar, Vitex, and Acacia). Principal component analysis (PCA), cluster analysis (CA) and partial least squares-discriminant analysis (PLS-DA) were used to extract information on the botanical origins of the honeys. The spectral range was also examined to increase the precision of the PLS-DA model. An accuracy of 88.46% on the validation set was obtained using the PLS-DA model in the 0.5-1.5 THz range. This work indicates that terahertz time-domain attenuated total reflection spectroscopy is a viable approach for rapidly evaluating the quality of honey.
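    As a rough illustration of the chemometric step, the sketch below builds a PLS-DA classifier (PLS regression onto one-hot class labels, prediction by the largest predicted score) with scikit-learn. The spectra are synthetic stand-ins; the class names come from the abstract, while sample counts and the number of components are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer

# Synthetic stand-in for terahertz ATR spectra: 60 honeys x 200 spectral points
# in the 0.5-1.5 THz band, three classes. Real spectra would replace this.
rng = np.random.default_rng(0)
labels = np.repeat(["Medlar", "Vitex", "Acacia"], 20)
X = rng.normal(size=(60, 200)) + np.repeat(np.arange(3), 20)[:, None] * 0.3

lb = LabelBinarizer()
Y = lb.fit_transform(labels)                      # one-hot class membership

X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(
    X, Y, labels, test_size=0.4, random_state=1, stratify=labels)

# PLS-DA: PLS regression onto the class-membership matrix, predict by argmax
plsda = PLSRegression(n_components=5).fit(X_tr, Y_tr)
pred = lb.classes_[np.argmax(plsda.predict(X_te), axis=1)]
print(f"validation accuracy: {np.mean(pred == y_te):.2%}")
```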

  17. Rapid and minimum invasive functional brain mapping by real-time visualization of high gamma activity during awake craniotomy.

    Science.gov (United States)

    Ogawa, Hiroshi; Kamada, Kyousuke; Kapeller, Christoph; Hiroshima, Satoru; Prueckl, Robert; Guger, Christoph

    2014-11-01

    Electrocortical stimulation (ECS) is the gold standard for functional brain mapping during an awake craniotomy. The critical issue is to set aside enough time to identify eloquent cortices by ECS. High gamma activity (HGA) ranging between 80 and 120 Hz on electrocorticogram is assumed to reflect localized cortical processing. In this report, we used real-time HGA mapping and functional neuronavigation integrated with functional magnetic resonance imaging (fMRI) for rapid and reliable identification of motor and language functions. Four patients with intra-axial tumors in their dominant hemisphere underwent preoperative fMRI and lesion resection with an awake craniotomy. All patients showed significant fMRI activation evoked by motor and language tasks. During the craniotomy, we recorded electrocorticogram activity by placing subdural grids directly on the exposed brain surface. Each patient performed motor and language tasks and demonstrated real-time HGA dynamics in hand motor areas and parts of the inferior frontal gyrus. Sensitivity and specificity of HGA mapping were 100% compared with ECS mapping in the frontal lobe, which suggested HGA mapping precisely indicated eloquent cortices. We found different HGA dynamics of language tasks in frontal and temporal regions. Specificities of the motor and language-fMRI did not reach 85%. The results of HGA mapping were mostly consistent with those of ECS mapping, although fMRI tended to overestimate functional areas. This novel technique enables rapid and accurate identification of motor and frontal language areas. Furthermore, real-time HGA mapping sheds light on underlying physiological mechanisms related to human brain functions. Copyright © 2014 Elsevier Inc. All rights reserved.
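    The mapping hinges on extracting 80-120 Hz band power from the electrocorticogram. A minimal offline sketch of that extraction is given below, using a generic band-pass-and-average approach on synthetic data; it is not the real-time pipeline used in the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def high_gamma_power(ecog, fs, band=(80.0, 120.0), win_s=0.25):
    """Band-pass an ECoG channel to the high-gamma range and return its
    short-window power envelope (a simple offline analogue of HGA mapping)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
    hg = filtfilt(b, a, ecog)
    win = int(win_s * fs)
    # moving average of the squared signal = power in consecutive windows
    return np.convolve(hg ** 2, np.ones(win) / win, mode="same")

# Synthetic example: 1200 Hz ECoG with a burst of 100 Hz activity after 1 s
fs = 1200
t = np.arange(0, 2.0, 1 / fs)
sig = np.random.randn(t.size) * 0.5
sig[t > 1.0] += np.sin(2 * np.pi * 100 * t[t > 1.0])
p = high_gamma_power(sig, fs)
print("mean HGA power before/after task onset:", p[t < 1.0].mean(), p[t > 1.0].mean())
```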

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  19. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
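    To make the modeling choice concrete, the sketch below steps a single spike-response-style neuron whose synapses inject charge through an exponentially decaying conductance, the behaviour that makes purely event-driven software approaches awkward and motivates a time-stepped pipeline. All constants and the input pattern are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Discrete-time sketch of a spike-response-style neuron in which each input
# spike injects charge gradually through an exponentially decaying synaptic
# conductance (the behaviour computed per time step in the hardware pipeline).
dt, tau_syn, tau_mem = 0.1e-3, 5e-3, 20e-3   # seconds (illustrative values)
threshold, v_reset = 1.0, 0.0

def simulate(input_spikes, weights, steps):
    g = np.zeros(len(weights))   # per-synapse conductance state
    v, spikes = 0.0, []
    for k in range(steps):
        g = g * np.exp(-dt / tau_syn) + weights * input_spikes[k]
        v = v * np.exp(-dt / tau_mem) + g.sum() * dt / tau_syn
        if v >= threshold:
            spikes.append(k * dt)
            v = v_reset
    return spikes

steps = 5000
rng = np.random.default_rng(0)
inputs = (rng.random((steps, 8)) < 0.02).astype(float)   # 8 Poisson-like inputs
spikes = simulate(inputs, weights=np.full(8, 0.3), steps=steps)
print(len(spikes), "output spikes; first few times:", spikes[:5])
```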

  20. Variation in computer time with geometry prescription in monte carlo code KENO-IV

    International Nuclear Information System (INIS)

    Gopalakrishnan, C.R.

    1988-01-01

    In most studies, the Monte Carlo criticality code KENO-IV has been compared with other Monte Carlo codes, but evaluation of its performance with different box descriptions has not been done so far. In Monte Carlo computations, any fractional savings of computing time is highly desirable. Variation in computation time with box description in KENO for two different fast reactor fuel subassemblies of FBTR and PFBR is studied. The k-eff of an infinite array of fuel subassemblies is calculated by modelling the subassemblies in two different ways: (i) multi-region, (ii) multi-box. In addition to these two cases, excess reactivity calculations of FBTR are also performed in two ways to study this effect in a complex geometry. It is observed that the k-eff values calculated by multi-region and multi-box models agree very well. However the increase in computation time from the multi-box to the multi-region is considerable, while the difference in computer storage requirements for the two models is negligible. This variation in computing time arises from the way the neutron is tracked in the two cases. (author)

  1. Pepsi-SAXS: an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles.

    Science.gov (United States)

    Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei

    2017-05-01

    A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist-Shannon-Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion order, this method has the same quadratic dependence on the number of atoms in the model as the Debye-based approach, but with a much smaller prefactor in the computational complexity. The method has been systematically validated on a large set of over 50 models collected from the BioIsis and SASBDB databases. Using a laptop, it was demonstrated that Pepsi-SAXS is about seven, 29 and 36 times faster compared with CRYSOL, FoXS and the three-dimensional Zernike method in SAStbx, respectively, when tested on data from the BioIsis database, and is about five, 21 and 25 times faster compared with CRYSOL, FoXS and SAStbx, respectively, when tested on data from SASBDB. On average, Pepsi-SAXS demonstrates comparable accuracy in terms of χ² to CRYSOL and FoXS when tested on BioIsis and SASBDB profiles. Together with a small allowed variation of adjustable parameters, this demonstrates the effectiveness of the method. Pepsi-SAXS is available at http://team.inria.fr/nano-d/software/pepsi-saxs.
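    For context, the quadratic-cost reference the abstract compares against is the Debye formula, I(q) = Σ_i Σ_j f_i f_j sin(q r_ij)/(q r_ij). A small NumPy sketch of that O(N²) evaluation is given below with made-up coordinates and unit form factors; it is not the Pepsi-SAXS multipole implementation.

```python
import numpy as np

def debye_profile(coords, f, q):
    """Debye formula I(q) = sum_ij f_i f_j sinc(q r_ij): the O(N^2) reference
    against which multipole-based methods such as Pepsi-SAXS are compared."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    qd = q[:, None, None] * d[None, :, :]
    with np.errstate(invalid="ignore", divide="ignore"):
        sinc = np.where(qd == 0.0, 1.0, np.sin(qd) / qd)
    return np.einsum("i,j,qij->q", f, f, sinc)

# Toy example: 50 random "atoms" with unit form factors
rng = np.random.default_rng(1)
coords = rng.normal(scale=10.0, size=(50, 3))      # angstroms
f = np.ones(50)
q = np.linspace(0.0, 0.5, 51)                       # 1/angstrom
print(debye_profile(coords, f, q)[:5])
```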

  2. Computing the Maximum Detour of a Plane Graph in Subquadratic Time

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    Let G be a plane graph where each edge is a line segment. We consider the problem of computing the maximum detour of G, defined as the maximum over all pairs of distinct points p and q of G of the ratio between the distance between p and q in G and the distance |pq|. The fastest previously known algorithm for this problem has O(n^2) running time. We show how to obtain O(n^{3/2}*(log n)^3) expected running time. We also show that if G has bounded treewidth, its maximum detour can be computed in O(n*(log n)^3) expected time.
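    To make the quantity concrete, the sketch below computes the detour brute-force over vertex pairs only, using networkx. The true maximum detour ranges over all points of the graph (including edge interiors), so this vertex-restricted baseline is a simplifying assumption and is not the subquadratic algorithm of the paper.

```python
import itertools
import math
import networkx as nx

def vertex_detour(G):
    """Brute-force detour restricted to vertex pairs: ratio of graph distance
    (edge weights = Euclidean lengths) to straight-line distance."""
    dist = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
    best = 0.0
    for (u, pu), (v, pv) in itertools.combinations(G.nodes(data="pos"), 2):
        euclid = math.dist(pu, pv)
        if euclid > 0:
            best = max(best, dist[u][v] / euclid)
    return best

# Small plane graph: a unit square with no diagonals
G = nx.Graph()
pos = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}
for n, p in pos.items():
    G.add_node(n, pos=p)
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    G.add_edge(u, v, weight=math.dist(pos[u], pos[v]))
print(vertex_detour(G))   # opposite corners: 2 / sqrt(2) ≈ 1.414
```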

  3. Continuous-variable quantum computing in optical time-frequency modes using quantum memories.

    Science.gov (United States)

    Humphreys, Peter C; Kolthammer, W Steven; Nunn, Joshua; Barbieri, Marco; Datta, Animesh; Walmsley, Ian A

    2014-09-26

    We develop a scheme for time-frequency encoded continuous-variable cluster-state quantum computing using quantum memories. In particular, we propose a method to produce, manipulate, and measure two-dimensional cluster states in a single spatial mode by exploiting the intrinsic time-frequency selectivity of Raman quantum memories. Time-frequency encoding enables the scheme to be extremely compact, requiring a number of memories that are a linear function of only the number of different frequencies in which the computational state is encoded, independent of its temporal duration. We therefore show that quantum memories can be a powerful component for scalable photonic quantum information processing architectures.

  4. Computation of transit times using the milestoning method with applications to polymer translocation

    Science.gov (United States)

    Hawk, Alexander T.; Konda, Sai Sriharsha M.; Makarov, Dmitrii E.

    2013-08-01

    Milestoning is an efficient approximation for computing long-time kinetics and thermodynamics of large molecular systems, which are inaccessible to brute-force molecular dynamics simulations. A common use of milestoning is to compute the mean first passage time (MFPT) for a conformational transition of interest. However, the MFPT is not always the experimentally observed timescale. In particular, the duration of the transition path, or the mean transit time, can be measured in single-molecule experiments, such as studies of polymers translocating through pores and fluorescence resonance energy transfer studies of protein folding. Here we show how to use milestoning to compute transit times and illustrate our approach by applying it to the translocation of a polymer through a narrow pore.
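    The distinction between the mean first-passage time and the mean transit time can be illustrated with a toy one-dimensional random walk, which is all the sketch below does; the milestoning machinery of the paper is not reproduced, and the boundary positions and trajectory count are arbitrary assumptions.

```python
import numpy as np

# Toy illustration: for a 1D random walk between a reflecting boundary at x=0
# and an absorbing boundary at x=L, the mean first-passage time (start at 0,
# absorb at L) is much longer than the mean transit time (duration of the
# final direct excursion from 0 to L).
rng = np.random.default_rng(2)
L, n_traj = 20, 2000
fpt, transit = [], []
for _ in range(n_traj):
    x, t, last_leave = 0, 0, 0
    while x < L:
        t += 1
        x = max(x + rng.choice((-1, 1)), 0)   # reflecting lower boundary
        if x == 0:
            last_leave = t                     # back at the start: reset transit clock
    fpt.append(t)
    transit.append(t - last_leave)
print("mean first-passage time:", np.mean(fpt))
print("mean transit time:      ", np.mean(transit))
```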

  5. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    International Nuclear Information System (INIS)

    Lent, Wineke A.M. van; Deetman, Joost W.; Teertstra, H. Jelle; Muller, Sara H.; Hans, Erwin W.; Harten, Wim H. van

    2012-01-01

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case study hospital, three scenarios were evaluated by computer simulation on access time, overtime and idle time of the CT; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while the utilization remained equal at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.
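    As a flavour of how such a study can be set up, the sketch below runs a minimal discrete-event queueing model of CT access time with SimPy. The arrival rate, slot capacity and single-scanner assumption are illustrative choices, not the hospital's actual parameters or the authors' model.

```python
import random
import simpy

# Minimal discrete-event sketch of CT access time: patients request a slot on
# a single scanner and we record their waiting times under assumed rates.
RANDOM_SEED, SIM_DAYS = 42, 200
ARRIVALS_PER_DAY, SCAN_SLOTS_PER_DAY = 9, 10   # assumed demand and capacity

def patient(env, scanner, waits):
    arrival = env.now
    with scanner.request() as req:
        yield req
        waits.append(env.now - arrival)          # access time in days
        yield env.timeout(1.0 / SCAN_SLOTS_PER_DAY)

def generator(env, scanner, waits):
    while True:
        yield env.timeout(random.expovariate(ARRIVALS_PER_DAY))
        env.process(patient(env, scanner, waits))

random.seed(RANDOM_SEED)
waits = []
env = simpy.Environment()
scanner = simpy.Resource(env, capacity=1)
env.process(generator(env, scanner, waits))
env.run(until=SIM_DAYS)
print(f"mean access time: {sum(waits) / len(waits):.2f} days ({len(waits)} patients)")
```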

  6. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lent, Wineke A.M. van, E-mail: w.v.lent@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands); Deetman, Joost W., E-mail: j.deetman@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Teertstra, H. Jelle, E-mail: h.teertstra@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Muller, Sara H., E-mail: s.muller@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Hans, Erwin W., E-mail: e.w.hans@utwente.nl [University of Twente, School of Management and Governance, Dept. of Industrial Engineering and Business Intelligence Systems, Enschede (Netherlands); Harten, Wim H. van, E-mail: w.v.harten@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands)

    2012-11-15

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case study hospital, three scenarios were evaluated by computer simulation on access time, overtime and idle time of the CT; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while the utilization remained equal at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  7. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Energy Technology Data Exchange (ETDEWEB)

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  8. Real-time data acquisition and feedback control using Linux Intel computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Piglowski, D.A.; Johnson, R.D.; Walker, M.L.

    2006-01-01

    This paper describes the experiences of the DIII-D programming staff in adapting Linux based Intel computing hardware for use in real-time data acquisition and feedback control systems. Due to the highly dynamic and unstable nature of magnetically confined plasmas in tokamak fusion experiments, real-time data acquisition and feedback control systems are in routine use with all major tokamaks. At DIII-D, plasmas are created and sustained using a real-time application known as the digital plasma control system (PCS). During each experiment, the PCS periodically samples data from hundreds of diagnostic signals and provides these data to control algorithms implemented in software. These algorithms compute the necessary commands to send to various actuators that affect plasma performance. The PCS consists of a group of rack mounted Intel Xeon computer systems running an in-house customized version of the Linux operating system tailored specifically to meet the real-time performance needs of the plasma experiments. This paper provides a more detailed description of the real-time computing hardware and custom developed software, including recent work to utilize dual Intel Xeon equipped computers within the PCS

  9. Study and Implementation of a Real-Time Operating System on an ARM-Based Single Board Computer

    OpenAIRE

    A, Wiedjaja; M, Handi; L, Jonathan; Christian, Benyamin; Kristofel, Luis

    2014-01-01

    An operating system is an important piece of software in a computer system. For personal and office use a general-purpose operating system is sufficient. However, mission-critical applications such as nuclear power plants and automotive braking systems (auto braking systems), which need a high level of reliability, require an operating system that operates in real time. This study assesses the implementation of a Linux-based real-time operating system on an ARM-based Single Board Computer (SBC), namely the Pandaboard ES with ...

  10. Y2K issues for real time computer systems for fast breeder test reactor

    International Nuclear Information System (INIS)

    Swaminathan, P.

    1999-01-01

    The presentation shows the classification of real-time systems related to the operation, control and monitoring of the fast breeder test reactor. The software life cycle includes software requirement specification, software design description, coding, commissioning, operation and management. A software scheme in the supervisory computer of the fast breeder test reactor is described, drawing on twenty years of experience in the design, development, installation, commissioning, operation and maintenance of computer-based supervisory control systems for nuclear installations, with a particular emphasis on solving the Y2K problem.

  11. Near real-time digital holographic microscope based on GPU parallel computing

    Science.gov (United States)

    Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan

    2018-01-01

    A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which parallel computing based on the compute unified device architecture (CUDA) is combined with digital holographic microscopy. Compared to other holographic microscopes, which have to implement reconstruction in multiple focal planes and are therefore time-consuming, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved with CUDA-based parallel computing, so it is especially suitable for measurements of particle fields at the micrometer and nanometer scale. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at the micrometer scale, and the average velocity error is lower than 10%. With the graphics processing unit (GPU), the computing time for 100 reconstruction planes (512×512 grids) is lower than 120 ms, whereas it is 4.9 s using the traditional CPU-based reconstruction method. The reconstruction speed is thus increased by a factor of about 40; in other words, the system can handle holograms at 8.3 frames per second, realizing near real-time measurement and display of the particle velocity field. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved by further optimization of software and hardware.
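    The per-plane reconstruction that the GPU parallelizes can be sketched with the angular spectrum method; the NumPy version below reconstructs 100 planes from a 512×512 hologram on the CPU. Wavelength, pixel pitch and propagation distances are assumed for illustration, and the random "hologram" stands in for real data.

```python
import numpy as np

def angular_spectrum(hologram, wavelength, dx, z):
    """Reconstruct one focal plane from an in-line hologram with the angular
    spectrum method (NumPy FFT stand-in for GPU kernels)."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)          # evanescent waves discarded
    return np.fft.ifft2(np.fft.fft2(hologram) * H)

# 512x512 toy hologram reconstructed at 100 planes, as in the timing test
holo = np.random.rand(512, 512)
planes = [angular_spectrum(holo, 632.8e-9, 3.45e-6, z)
          for z in np.linspace(1e-3, 5e-3, 100)]
print(len(planes), planes[0].shape)
```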

  12. Real-Time Continuous Response Spectra Exceedance Calculation Displayed in a Web-Browser Enables Rapid and Robust Damage Evaluation by First Responders

    Science.gov (United States)

    Franke, M.; Skolnik, D. A.; Harvey, D.; Lindquist, K.

    2014-12-01

    A novel and robust approach is presented that provides near real-time earthquake alarms for critical structures at distributed locations and large facilities using real-time estimation of response spectra obtained from near free-field motions. Influential studies dating back to the 1980s identified spectral response acceleration as a key ground motion characteristic that correlates well with observed damage in structures. Thus, monitoring and reporting on exceedance of spectra-based thresholds are useful tools for assessing the potential for damage to facilities or multi-structure campuses based on input ground motions only. With as little as one strong-motion station per site, this scalable approach can provide rapid alarms on the damage status of remote towns, critical infrastructure (e.g., hospitals, schools) and points of interest (e.g., bridges) for a very large number of locations, enabling better rapid decision making during critical and difficult immediate post-earthquake response actions. Details on the novel approach are presented along with an example implementation for a large energy company. Real-time calculation of PSA exceedance and alarm dissemination are enabled with Bighorn, an extension module based on the Antelope software package that combines real-time spectral monitoring and alarm capabilities with a robust built-in web display server. Antelope is an environmental data collection software package from Boulder Real Time Technologies (BRTT) typically used for very large seismic networks and real-time seismic data analyses. The primary processing engine produces continuous time-dependent response spectra for incoming acceleration streams. It utilizes expanded floating-point data representations within object ring-buffer packets and waveform files in a relational database. This leads to a very fast method for computing response spectra for a large number of channels. A Python script evaluates these response spectra for exceedance of one or more
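    To make "response spectra exceedance" concrete, the sketch below integrates a handful of damped single-degree-of-freedom oscillators over a synthetic acceleration record, converts the peak displacements to pseudo-spectral accelerations, and checks them against per-period thresholds. The ground motion, periods and thresholds are illustrative assumptions; this is not the Bighorn/Antelope processing engine.

```python
import numpy as np

def psa(accel, dt, period, damping=0.05):
    """Pseudo-spectral acceleration of one SDOF oscillator driven by a ground
    acceleration record, via simple central-difference time stepping."""
    w = 2 * np.pi / period
    u_prev, u, peak = 0.0, 0.0, 0.0
    for a in accel:
        # u'' = -a_g - 2*zeta*w*u' - w^2*u, with u' ~ (u - u_prev)/dt
        u_next = (2 * u - u_prev
                  + dt**2 * (-a - 2 * damping * w * (u - u_prev) / dt - w**2 * u))
        peak = max(peak, abs(u_next))
        u_prev, u = u, u_next
    return w**2 * peak                      # PSA = omega^2 * max displacement

# Synthetic ground motion and per-period alarm thresholds (both assumptions)
dt = 0.01
t = np.arange(0, 10, dt)
accel = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)
periods = np.array([0.2, 0.5, 1.0, 2.0])          # seconds
thresholds = np.array([4.0, 3.0, 2.0, 1.0])       # m/s^2, illustrative
spectrum = np.array([psa(accel, dt, T) for T in periods])
print({f"T={T}s": bool(spectrum[i] > thresholds[i]) for i, T in enumerate(periods)})
```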

  13. OpenVX-based Python Framework for real-time cross platform acceleration of embedded computer vision applications

    Directory of Open Access Journals (Sweden)

    Ori Heimlich

    2016-11-01

    Full Text Available Embedded real-time vision applications are being rapidly deployed in a large realm of consumer electronics, ranging from automotive safety to surveillance systems. However, the relatively limited computational power of embedded platforms is considered a bottleneck for many vision applications, necessitating optimization. OpenVX is a standardized interface, released in late 2014, in an attempt to provide both system- and kernel-level optimization to vision applications. With OpenVX, vision processing is modeled with coarse-grained data flow graphs, which can be optimized and accelerated by the platform implementer. Current full implementations of OpenVX are given in the programming language C, which does not support advanced programming paradigms such as object-oriented and functional programming, nor does it provide runtime type-checking. Here we present a Python-based full implementation of OpenVX, which eliminates much of the discrepancy between the object-oriented paradigm used by many modern applications and the native C implementations. Our open-source implementation can be used for rapid development of OpenVX applications on embedded platforms. Demonstrations include static and real-time image acquisition and processing using a Raspberry Pi and a GoPro camera. Code is given as supplementary information. The code project and a linked deployable virtual machine are located on GitHub: https://github.com/NBEL-lab/PythonOpenVX.

  14. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints.

    Science.gov (United States)

    Sako, Shunji; Sugiura, Hiromichi; Tanoue, Hironori; Kojima, Makoto; Kono, Mitsunobu; Inaba, Ryoichi

    2014-08-01

    This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale - VAS). Oxygen consumption (VO2), the ratio of inspiration time to total respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rate (VCO2/VE), and oxygen extraction fraction (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  15. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale – VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to total respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rate (VCO2/VE), and oxygen extraction fraction (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  16. A Rapid, Onsite, Ultrasensitive Melamine Quantitation Method for Protein Beverages Using Time-Resolved Fluorescence Detection Paper.

    Science.gov (United States)

    Li, Guanghua; Wang, Du; Zhou, Aijun; Sun, Yimin; Zhang, Qi; Poapolathep, Amnart; Zhang, Li; Fan, Zhiyong; Zhang, Zhaowei; Li, Peiwu

    2018-05-02

    To ensure protein beverage safety and prevent the illegal use of melamine to artificially inflate apparent protein content, a rapid, onsite, ultrasensitive detection method for melamine must be developed, because melamine is detrimental to human health and life. Herein, an ultrasensitive time-resolved fluorescence detection paper (TFDP) was developed to detect melamine in protein beverages within 15 min using a one-step sample preparation. The lower limits of detection were 0.89, 0.94, and 1.05 ng/mL, and the linear ranges were 2.67-150, 2.82-150, and 3.15-150 ng/mL (R² > 0.982) for peanut, walnut, and coconut beverages, respectively. The recovery rates were 85.86-110.60% with an acceptable coefficient of variation. For beverage samples, the TFDP and ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) results were consistent. This method is a promising alternative for rapid, onsite detection of melamine in beverages.

  17. Rapid detection and differentiation of Clonorchis sinensis and Opisthorchis viverrini using real-time PCR and high resolution melting analysis.

    Science.gov (United States)

    Cai, Xian-Quan; Yu, Hai-Qiong; Li, Rong; Yue, Qiao-Yun; Liu, Guo-Hua; Bai, Jian-Shan; Deng, Yan; Qiu, De-Yi; Zhu, Xing-Quan

    2014-01-01

    Clonorchis sinensis and Opisthorchis viverrini are both important fish-borne pathogens, causing serious public health problems in Asia. The present study developed an assay integrating real-time PCR and high resolution melting (HRM) analysis for the specific detection and rapid identification of C. sinensis and O. viverrini. Primers targeting the COX1 gene were highly specific for these liver flukes, as evidenced by the negative amplification of closely related trematodes. Assays using genomic DNA extracted from the two flukes yielded specific amplification, and their identity was confirmed by sequencing, with an accuracy of 100% relative to conventional methods. The assay proved to be highly sensitive, with a detection limit below 1 pg of purified genomic DNA, 5 EPG, or 1 metacercaria of C. sinensis. Moreover, C. sinensis and O. viverrini could be differentiated by their HRM profiles. The method reduces the labor of microscopic examination and avoids the contamination associated with agarose gel electrophoresis. Moreover, it can differentiate these two flukes, which are difficult to distinguish using other methods. The established method provides an alternative tool for rapid, simple, and duplex detection of C. sinensis and O. viverrini.

  18. Rapid Detection and Differentiation of Clonorchis sinensis and Opisthorchis viverrini Using Real-Time PCR and High Resolution Melting Analysis

    Directory of Open Access Journals (Sweden)

    Xian-Quan Cai

    2014-01-01

    Full Text Available Clonorchis sinensis and Opisthorchis viverrini are both important fish-borne pathogens, causing serious public health problems in Asia. The present study developed an assay integrating real-time PCR and high resolution melting (HRM) analysis for the specific detection and rapid identification of C. sinensis and O. viverrini. Primers targeting the COX1 gene were highly specific for these liver flukes, as evidenced by the negative amplification of closely related trematodes. Assays using genomic DNA extracted from the two flukes yielded specific amplification, and their identity was confirmed by sequencing, with an accuracy of 100% relative to conventional methods. The assay proved to be highly sensitive, with a detection limit below 1 pg of purified genomic DNA, 5 EPG, or 1 metacercaria of C. sinensis. Moreover, C. sinensis and O. viverrini could be differentiated by their HRM profiles. The method reduces the labor of microscopic examination and avoids the contamination associated with agarose gel electrophoresis. Moreover, it can differentiate these two flukes, which are difficult to distinguish using other methods. The established method provides an alternative tool for rapid, simple, and duplex detection of C. sinensis and O. viverrini.

  19. Rapid detection of sugar alcohol precursors and corresponding nitrate ester explosives using direct analysis in real time mass spectrometry.

    Science.gov (United States)

    Sisco, Edward; Forbes, Thomas P

    2015-04-21

    This work highlights the rapid detection of nitrate ester explosives and their sugar alcohol precursors by direct analysis in real time mass spectrometry (DART-MS) using an off-axis geometry. Demonstration of the effect of various parameters, such as ion polarity and in-source collision induced dissociation (CID) on the detection of these compounds is presented. Sensitivity of sugar alcohols and nitrate ester explosives was found to be greatest in negative ion mode with sensitivities ranging from hundreds of picograms to hundreds of nanograms, depending on the characteristics of the particular molecule. Altering the in-source CID potential allowed for acquisition of characteristic molecular ion spectra as well as fragmentation spectra. Additional studies were completed to identify the role of different experimental parameters on the sensitivity for these compounds. Variables that were examined included the DART gas stream temperature, the presence of a related compound (i.e., the effect of a precursor on the detection of a nitrate ester explosive), incorporation of dopant species and the role of the analysis surface. It was determined that each variable affected the response and detection of both sugar alcohols and the corresponding nitrate ester explosives. From this work, a rapid and sensitive method for the detection of individual sugar alcohols and corresponding nitrate ester explosives, or mixtures of the two, has been developed, providing a useful tool in the real-world identification of homemade explosives.

  20. Non-Invasive Rapid Harvest Time Determination of Oil-Producing Microalgae Cultivations for Biodiesel Production by Using Chlorophyll Fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Qiao, Yaqin [Key Laboratory of Algal Biology, Institute of Hydrobiology, Chinese Academy of Sciences, Wuhan (China); University of Chinese Academy of Sciences, Beijing (China); Rong, Junfeng [SINOPEC Research Institute of Petroleum Processing, Beijing (China); Chen, Hui; He, Chenliu; Wang, Qiang, E-mail: wangqiang@ihb.ac.cn [Key Laboratory of Algal Biology, Institute of Hydrobiology, Chinese Academy of Sciences, Wuhan (China)

    2015-10-05

    For the large-scale cultivation of microalgae for biodiesel production, one of the key problems is the determination of the optimum time for algal harvest when algae cells are saturated with neutral lipids. In this study, a method to determine the optimum harvest time in oil-producing microalgal cultivations by measuring the maximum photochemical efficiency of photosystem II, also called Fv/Fm, was established. When oil-producing Chlorella strains were cultivated and then treated with nitrogen starvation, it not only stimulated neutral lipid accumulation, but also affected the photosynthesis system, with the neutral lipid contents in all four algae strains – Chlorella sorokiniana C1, Chlorella sp. C2, C. sorokiniana C3, and C. sorokiniana C7 – correlating negatively with the Fv/Fm values. Thus, for the given oil-producing algae, in which a significant relationship between the neutral lipid content and Fv/Fm value under nutrient stress can be established, the optimum harvest time can be determined by measuring the value of Fv/Fm. It is hoped that this method can provide an efficient way to determine the harvest time rapidly and expediently in large-scale oil-producing microalgae cultivations for biodiesel production.

  1. Explicit time marching methods for the time-dependent Euler computations

    International Nuclear Information System (INIS)

    Tai, C.H.; Chiang, D.C.; Su, Y.P.

    1997-01-01

    Four explicit-type time-marching methods, including one proposed by the authors, are examined. The TVD conditions of this method are analyzed with the linear conservation law as the model equation. The performance of these methods when applied to the Euler equations is numerically tested. Seven examples are tested; the main concern is the performance of the methods when discontinuities of different strengths are encountered. As the discontinuity becomes stronger, spurious oscillations show up for the three existing methods, while the method proposed by the authors always gives satisfactory results. The effect of the limiter is also investigated. To put these methods on the same basis for the comparison, the same spatial discretization is used. Roe's solver is used to evaluate the fluxes at the cell interface; spatially second-order accuracy is achieved by MUSCL reconstruction. 19 refs., 8 figs
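    The role of the limiter can be illustrated with a toy scheme: MUSCL reconstruction with a minmod limiter and an upwind flux applied to linear advection of a square wave. This is a generic textbook TVD sketch under an assumed CFL number, not one of the four methods compared in the paper.

```python
import numpy as np

def minmod(a, b):
    """Slope limiter used in MUSCL reconstruction; returns 0 at extrema so the
    reconstructed slopes do not create new oscillations."""
    return np.where(a * b <= 0.0, 0.0, np.where(np.abs(a) < np.abs(b), a, b))

def advect(u, c, steps):
    """MUSCL (minmod-limited) reconstruction + upwind flux + explicit Euler for
    u_t + u_x = 0 on a periodic grid; c = dt/dx, kept at 0.5 here so the update
    stays within the scheme's TVD bound."""
    for _ in range(steps):
        du = minmod(np.roll(u, -1) - u, u - np.roll(u, 1))
        flux = u + 0.5 * du                   # upwind (left) state at i+1/2, f(u) = u
        u = u - c * (flux - np.roll(flux, 1))
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u0 = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)   # square wave (discontinuities)
u = advect(u0.copy(), c=0.5, steps=200)
print("min/max after transport:", u.min(), u.max())   # stays within [0, 1]: no spurious overshoot
```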

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  7. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)
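    The scatter/gather pattern described above (a master hands independent histories to workers and aggregates their tallies) can be sketched on a single machine with Python's multiprocessing module, as below. The "transport" is a toy exponential path-length sample and the worker count stands in for the cloud nodes; the actual system distributes EGS5 runs over the message passing interface.

```python
import multiprocessing as mp
import random

def mc_chunk(args):
    """Worker: toy Monte Carlo 'transport' estimating a mean path length for a
    chunk of histories (stand-in for an EGS5 run on one worker node)."""
    n_histories, seed = args
    rng = random.Random(seed)
    total = sum(rng.expovariate(1.0) for _ in range(n_histories))
    return total, n_histories

if __name__ == "__main__":
    n_total, n_workers = 1_000_000, 8          # assumed history and node counts
    chunks = [(n_total // n_workers, seed) for seed in range(n_workers)]
    with mp.Pool(n_workers) as pool:
        results = pool.map(mc_chunk, chunks)    # scatter work, gather results
    total = sum(r[0] for r in results)
    n = sum(r[1] for r in results)
    print(f"mean free path estimate from {n} histories: {total / n:.4f}")
```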

  8. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  9. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.

  10. Two-dimensional time-resolved x-ray diffraction study of dual phase rapid solidification in steels

    Science.gov (United States)

    Yonemura, Mitsuharu; Osuki, Takahiro; Terasaki, Hidenori; Komizo, Yuichi; Sato, Masugu; Toyokawa, Hidenori; Nozaki, Akiko

    2010-01-01

    The high-intensity heat source used for fusion welding creates steep thermal gradients, with cooling rates of 100 °C/s from 1800 °C. Further, the influence of preferred orientation is important for observing a directional solidification that follows dendrite growth along the ⟨100⟩ direction toward the moving heat source. In the present study, we observed the rapid solidification of weld metal at a time resolution of 0.01-0.1 s with a two-dimensional time-resolved x-ray diffraction (2DTRXRD) system during real welding. The diffraction rings were dynamically observed by 2DTRXRD at a synchrotron energy of 18 keV while the arc passed over the x-ray irradiation area. The arc power output was 10 V-150 A, and the scan speed of the arc was 1.0 mm/s. The temperature rise in the instruments was suppressed by a water-cooled copper plate under the specimen. Further, the temperature distribution of the weld metal was measured by a thermocouple and correlated with the diffraction patterns. Consequently, solidification and solid-phase transformation of low carbon steels and stainless steels were observed during rapid cooling by 2DTRXRD. In the low carbon steel, the microstructure is formed in a two-step process: (i) formation of crystallites and (ii) an increase in crystallinity. In stainless steel, the irregular δ/γ interface layer in the quenched metal after solidification suggests easy movement of dendrites at lower temperature. In the carbide-precipitating stainless steel, NbC grows readily on the δ phase with a little undercooling. Further, a mist-like pattern in the fusion zone, distinct from the halo pattern, indicated the possibility of observing nucleation and early solidification by 2DTRXRD.

  11. Rapid and sensitive detection of canine distemper virus by real-time reverse transcription recombinase polymerase amplification.

    Science.gov (United States)

    Wang, Jianchang; Wang, Jinfeng; Li, Ruiwen; Liu, Libing; Yuan, Wanzhe

    2017-08-15

    Canine distemper, caused by Canine distemper virus (CDV), is a highly contagious and fatal systemic disease in free-living and captive carnivores worldwide. Recombinase polymerase amplification (RPA), an isothermal gene amplification technique, has been explored for the molecular detection of diverse pathogens. A real-time reverse transcription RPA (RT-RPA) assay for the detection of CDV was developed using primers and an exo probe targeting the CDV nucleocapsid protein gene. A series of other viruses was tested by the RT-RPA. Thirty-two field samples were further tested by RT-RPA, and the results were compared with those obtained by real-time RT-PCR. The RT-RPA assay was performed successfully at 40 °C, and the results were obtained within 3-12 min. The assay could detect CDV but did not show cross-detection of canine parvovirus-2 (CPV-2), canine coronavirus (CCoV), canine parainfluenza virus (CPIV), pseudorabies virus (PRV) or Newcastle disease virus (NDV), demonstrating high specificity. The analytical sensitivity of RT-RPA was 31.8 copies of in vitro-transcribed CDV RNA, which is 10 times lower than that of real-time RT-PCR. The assay performance was validated by testing 32 field samples and comparing the results to real-time RT-PCR. The results indicated an excellent correlation between RT-RPA and the reference real-time RT-PCR method. Both assays provided the same results, and the R² value of the positive results was 0.947. The results demonstrated that the RT-RPA assay offers an alternative tool for simple, rapid, and reliable detection of CDV both in the laboratory and at the point of care, especially in resource-limited settings.

  12. A computationally simple and robust method to detect determinism in a time series

    DEFF Research Database (Denmark)

    Lu, Sheng; Ju, Ki Hwan; Kanters, Jørgen K.

    2006-01-01

    We present a new, simple, and fast computational technique, termed the incremental slope (IS), that can accurately distinguish deterministic from stochastic systems even when the variance of the noise is as large as or greater than that of the signal, and that remains robust for time-varying signals.

  13. SLMRACE: a noise-free RACE implementation with reduced computational time

    Science.gov (United States)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  14. Ultrasonic divergent-beam scanner for time-of-flight tomography with computer evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Glover, G H

    1978-03-02

    The rotatable ultrasonic divergent-beam scanner is designed for time-of-flight tomography with computer evaluation. It can measure parameters that are of importance for the structure of soft tissues, e.g. the time of flight as a function of the velocity distribution along a certain path (the method is analogous to transaxial X-ray tomography). Moreover, it permits quantitative measurement of two-dimensional velocity distributions and may therefore be applied to serial examinations for detecting cancer of the breast. Digital memories as well as analog-digital hybrid systems are suitable as computers.

  15. A computationally simple model for determining the time dependent spectral neutron flux in a nuclear reactor core

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, E.A. [Department of Mechanical Engineering, University of Texas, Austin, TX (United States); Deinert, M.R. [Theoretical and Applied Mechanics, Cornell University, 219 Kimball Hall, Ithaca, NY 14853 (United States)]. E-mail: mrd6@cornell.edu; Cady, K.B. [Theoretical and Applied Mechanics, Cornell University, 219 Kimball Hall, Ithaca, NY 14853 (United States)

    2006-10-15

    The balance of isotopes in a nuclear reactor core is key to understanding the overall performance of a given fuel cycle. This balance is in turn most strongly affected by the time- and energy-dependent neutron flux. While many large and involved computer packages exist for determining this spectrum, a simplified approach amenable to rapid computation is missing from the literature. We present such a model, which accepts as inputs the fuel element/moderator geometry and composition, reactor geometry, fuel residence time and target burnup, and we compare it to OECD/NEA benchmarks for homogeneous MOX and UOX LWR cores. Collision probability approximations to the neutron transport equation are used to decouple the spatial and energy variables. The lethargy-dependent neutron flux, governed by coupled integral equations for the fuel and moderator/coolant regions, is treated by multigroup thermalization methods, and the transport of neutrons through space is modeled by fuel-to-moderator transport and escape probabilities. Reactivity control is achieved through use of a burnable poison or adjustable control medium. The model calculates the buildup of 24 actinides, as well as fission products, along with the lethargy-dependent neutron flux, and the results of several simulations are compared with benchmarked standards.

  16. Qualitative and Quantitative Analysis of Andrographis paniculata by Rapid Resolution Liquid Chromatography/Time-of-Flight Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Jian-Fei Qin

    2013-09-01

    A rapid resolution liquid chromatography/time-of-flight tandem mass spectrometry (RRLC-TOF/MS) method was developed for qualitative and quantitative analysis of the major chemical constituents in Andrographis paniculata. Fifteen compounds, including flavonoids and diterpenoid lactones, were unambiguously or tentatively identified in 10 min by comparing their retention times and accurate masses with standards or literature data. The characteristic fragmentation patterns of flavonoids and diterpenoid lactones were summarized, and the structures of the unknown compounds were predicted. Andrographolide, dehydroandrographolide and neoandrographolide were further quantified as marker substances. It was found that the calibration curves for all analytes showed good linearity (R² > 0.9995) within the test ranges. The overall limits of detection (LODs) and limits of quantification (LOQs) were 0.02 μg/mL to 0.06 μg/mL and 0.06 μg/mL to 0.2 μg/mL, respectively. The relative standard deviations (RSDs) for intra- and inter-day precisions were below 3.3% and 4.2%, respectively. The mean recovery rates ranged from 96.7% to 104.5% with relative standard deviations (RSDs) less than 2.72%. It is concluded that RRLC-TOF/MS is powerful and practical for qualitative and quantitative analysis of complex plant samples owing to its time savings, sensitivity, precision, accuracy and low solvent consumption.
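
    As an illustration of how calibration linearity and detection limits of the kind reported above are commonly derived from a standard series, the following sketch fits a least-squares calibration line and applies the usual 3.3·σ/slope and 10·σ/slope conventions for LOD and LOQ. The concentrations, peak areas, and the choice of convention are illustrative assumptions, not data or procedures from the paper.

```python
# Minimal sketch (not the authors' code) of calibration linearity and detection
# limits from a standard series; all numbers and the 3.3*sigma/slope and
# 10*sigma/slope conventions are illustrative assumptions.
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])                # standards, ug/mL (hypothetical)
area = np.array([1.2e3, 6.1e3, 1.19e4, 6.0e4, 1.22e5, 6.05e5])   # peak areas (hypothetical)

slope, intercept = np.polyfit(conc, area, 1)   # least-squares calibration line
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot              # linearity of the calibration curve

sigma = np.std(area - pred, ddof=2)            # residual standard deviation (2 fitted params)
lod = 3.3 * sigma / slope                      # limit of detection (ICH-style convention)
loq = 10.0 * sigma / slope                     # limit of quantification

print(f"R^2 = {r_squared:.4f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```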

  17. Rapid detection of Salmonella in pet food: design and evaluation of integrated methods based on real-time PCR detection.

    Science.gov (United States)

    Balachandran, Priya; Friberg, Maria; Vanlandingham, V; Kozak, K; Manolis, Amanda; Brevnov, Maxim; Crowley, Erin; Bird, Patrick; Goins, David; Furtado, Manohar R; Petrauskene, Olga V; Tebbs, Robert S; Charbonneau, Duane

    2012-02-01

    Reducing the risk of Salmonella contamination in pet food is critical for both companion animals and humans, and its importance is reflected by the substantial increase in the demand for pathogen testing. Accurate and rapid detection of foodborne pathogens improves food safety, protects the public health, and benefits food producers by assuring product quality while facilitating product release in a timely manner. Traditional culture-based methods for Salmonella screening are laborious and can take 5 to 7 days to obtain definitive results. In this study, we developed two methods for the detection of low levels of Salmonella in pet food using real-time PCR: (i) detection of Salmonella in 25 g of dried pet food in less than 14 h with an automated magnetic bead-based nucleic acid extraction method and (ii) detection of Salmonella in 375 g of composite dry pet food matrix in less than 24 h with a manual centrifugation-based nucleic acid preparation method. Both methods included a preclarification step using a novel protocol that removes food matrix-associated debris and PCR inhibitors and improves the sensitivity of detection. Validation studies revealed no significant differences between the two real-time PCR methods and the standard U.S. Food and Drug Administration Bacteriological Analytical Manual (chapter 5) culture confirmation method.

  18. Matching time and spatial scales of rapid solidification: dynamic TEM experiments coupled to CALPHAD-informed phase-field simulations

    Science.gov (United States)

    Perron, Aurelien; Roehling, John D.; Turchi, Patrice E. A.; Fattebert, Jean-Luc; McKeown, Joseph T.

    2018-01-01

    A combination of dynamic transmission electron microscopy (DTEM) experiments and CALPHAD-informed phase-field simulations was used to study rapid solidification in Cu-Ni thin-film alloys. Experiments (conducted in the DTEM) consisted of in situ laser melting and determination of the solidification kinetics by monitoring the solid-liquid interface and the overall microstructure evolution (time-resolved measurements) during the solidification process. Modelling of the Cu-Ni alloy microstructure evolution was based on a phase-field model that included realistic Gibbs energies and diffusion coefficients from the CALPHAD framework (thermodynamic and mobility databases). DTEM and post mortem experiments highlighted the formation of microsegregation-free columnar grains with interface velocities varying from ~0.1 to ~0.6 m/s. After an 'incubation' time, the velocity of the planar solid-liquid interface accelerated until solidification was complete. In addition, a decrease of the temperature gradient induced a decrease in the interface velocity. The modelling strategy permitted the simulation (in 1D and 2D) of the solidification process from the initially diffusion-controlled to the nearly partitionless regimes. Finally, results of DTEM experiments and phase-field simulations (grain morphology, solute distribution, and solid-liquid interface velocity) were consistent at similar time (μs) and spatial scales (μm).

  19. Computer-games for gravitational wave science outreach: Black Hole Pong and Space Time Quest

    International Nuclear Information System (INIS)

    Carbone, L; Bond, C; Brown, D; Brückner, F; Grover, K; Lodhia, D; Mingarelli, C M F; Fulda, P; Smith, R J E; Unwin, R; Vecchio, A; Wang, M; Whalley, L; Freise, A

    2012-01-01

    We have established a program aimed at developing computer applications and web applets to be used for educational purposes as well as gravitational wave outreach activities. These applications and applets teach gravitational wave physics and technology. The computer programs are generated in collaboration with undergraduates and summer students as part of our teaching activities, and are freely distributed on a dedicated website. As part of this program, we have developed two computer-games related to gravitational wave science: 'Black Hole Pong' and 'Space Time Quest'. In this article we present an overview of our computer related outreach activities and discuss the games and their educational aspects, and report on some positive feedback received.

  20. A user's manual for a computer program which calculates time optimal geocentric transfers using solar or nuclear electric and high thrust propulsion

    Science.gov (United States)

    Sackett, L. L.; Edelbaum, T. N.; Malchow, H. L.

    1974-01-01

    This manual is a guide for using a computer program which calculates time optimal trajectories for high- and low-thrust geocentric transfers. Either SEP or NEP may be assumed, and a one- or two-impulse, fixed total delta-V, initial high thrust phase may be included. Also, a single impulse of specified delta-V may be included after the low thrust phase. The low thrust phase utilizes equinoctial orbital elements to avoid the classical singularities and Kryloff-Boguliuboff averaging to help ensure more rapid computation. The program is written in FORTRAN 4 in double precision for use on an IBM 360 computer. The manual includes a description of the problem treated, input/output information, examples of runs, and source code listings.

  1. Computational Procedures for a Class of GI/D/k Systems in Discrete Time

    Directory of Open Access Journals (Sweden)

    Md. Mostafizur Rahman

    2009-01-01

    A class of discrete time GI/D/k systems is considered for which the interarrival times have finite support and customers are served in first-in first-out (FIFO) order. The system is formulated as a single server queue with new general independent interarrival times and constant service duration by assuming cyclic assignment of customers to the identical servers. The queue length is then set up as a quasi-birth-death (QBD) type Markov chain. It is shown that this transformed GI/D/1 system has special structures which make the computation of the matrix R simple and efficient, thereby reducing the number of multiplications in each iteration significantly. As a result we were able to keep the computation time very low. Moreover, use of the resulting structural properties makes the computation of the distribution of queue length of the transformed system efficient. The computation of the distribution of waiting time is also shown to be simple by exploiting the special structures.
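
    The abstract refers to computing the QBD rate matrix R iteratively. A minimal sketch of the standard fixed-point iteration R = A0 + R·A1 + R²·A2 (one common block-labelling convention) is shown below; the small 2x2 blocks are illustrative placeholders and do not exploit the special structure of the transformed GI/D/1 chain that the paper uses to reduce the number of multiplications.

```python
# Minimal sketch of the standard fixed-point iteration for the QBD rate matrix R,
#   R = A0 + R*A1 + R^2*A2,
# where A0, A1, A2 are the level-up, within-level and level-down blocks.
# The 2x2 blocks below are illustrative placeholders, not the matrices of the
# transformed GI/D/1 chain from the paper.
import numpy as np

A0 = np.array([[0.10, 0.05],
               [0.05, 0.10]])   # transitions one level up
A1 = np.array([[0.30, 0.10],
               [0.10, 0.30]])   # transitions within a level
A2 = np.array([[0.25, 0.10],
               [0.10, 0.25]])   # transitions one level down
# rows of A0 + A1 + A2 sum to <= 1 (substochastic blocks)

def compute_R(A0, A1, A2, tol=1e-12, max_iter=10_000):
    R = np.zeros_like(A0)
    for _ in range(max_iter):
        R_next = A0 + R @ A1 + R @ R @ A2      # one iteration of the fixed point
        if np.max(np.abs(R_next - R)) < tol:
            return R_next
        R = R_next
    raise RuntimeError("R iteration did not converge")

R = compute_R(A0, A1, A2)
print(R)
```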

  2. Television viewing, computer use and total screen time in Canadian youth.

    Science.gov (United States)

    Mark, Amy E; Boyce, William F; Janssen, Ian

    2006-11-01

    Research has linked excessive television viewing and computer use in children and adolescents to a variety of health and social problems. Current recommendations are that screen time in children and adolescents should be limited to no more than 2 h per day. To determine the percentage of Canadian youth meeting the screen time guideline recommendations. The representative study sample consisted of 6942 Canadian youth in grades 6 to 10 who participated in the 2001/2002 World Health Organization Health Behaviour in School-Aged Children survey. Only 41% of girls and 34% of boys in grades 6 to 10 watched 2 h or less of television per day. Once the time of leisure computer use was included and total daily screen time was examined, only 18% of girls and 14% of boys met the guidelines. The prevalence of those meeting the screen time guidelines was higher in girls than boys. Fewer than 20% of Canadian youth in grades 6 to 10 met the total screen time guidelines, suggesting that increased public health interventions are needed to reduce the number of leisure time hours that Canadian youth spend watching television and using the computer.

  3. Rapid diagnostic tests as a source of DNA for Plasmodium species-specific real-time PCR

    Directory of Open Access Journals (Sweden)

    Van Esbroeck Marjan

    2011-03-01

    Background: This study describes the use of malaria rapid diagnostic tests (RDTs) as a source of DNA for Plasmodium species-specific real-time PCR. Methods: First, the best method to recover DNA from RDTs was investigated, and then the applicability of this DNA extraction method was assessed on 12 different RDT brands. Finally, two RDT brands (OptiMAL Rapid Malaria Test and SDFK60 malaria Ag Plasmodium falciparum/Pan test) were comprehensively evaluated on a panel of clinical samples submitted for routine malaria diagnosis at ITM. DNA amplification was done with the 18S rRNA real-time PCR targeting the four Plasmodium species. Results of PCR on RDT were compared to those obtained by PCR on whole blood samples. Results: Best results were obtained by isolating DNA from the proximal part of the nitrocellulose component of the RDT strip with a simple DNA elution method. The PCR on RDT showed a detection limit of 0.02 asexual parasites/μl, which was identical to that of the same PCR on whole blood. For all 12 RDT brands tested, DNA was detected except for one brand when a low parasite density sample was applied. In RDTs with a plastic seal covering the nitrocellulose strip, DNA extraction was hampered. PCR analysis on clinical RDT samples demonstrated correct identification of single species infections for all RDT samples with asexual parasites of P. falciparum (n = 60), Plasmodium vivax (n = 10), Plasmodium ovale (n = 10) and Plasmodium malariae (n = 10). Samples with only gametocytes were detected in all OptiMAL tests and in 10 of the 11 SDFK60 tests. None of the negative samples (n = 20) gave a signal by PCR on RDT. With PCR on RDT, higher Ct-values were observed than with PCR on whole blood, with a mean difference of 2.68 for OptiMAL and 3.53 for SDFK60. Mixed infections were correctly identified with PCR on RDT in 4/5 OptiMAL tests and 2/5 SDFK60 tests. Conclusions: RDTs are a reliable source of DNA for Plasmodium real-time PCR.

  4. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    Science.gov (United States)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R² = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R² = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  5. Transnasal Humidified Rapid-Insufflation Ventilatory Exchange (THRIVE): a physiological method of increasing apnoea time in patients with difficult airways.

    Science.gov (United States)

    Patel, A; Nouraei, S A R

    2015-03-01

    Emergency and difficult tracheal intubations are hazardous undertakings where successive laryngoscopy-hypoxaemia-re-oxygenation cycles can escalate to airway loss and the 'can't intubate, can't ventilate' scenario. Between 2013 and 2014, we extended the apnoea times of 25 patients with difficult airways who were undergoing general anaesthesia for hypopharyngeal or laryngotracheal surgery. This was achieved through continuous delivery of transnasal high-flow humidified oxygen, initially to provide pre-oxygenation, and continuing as post-oxygenation during intravenous induction of anaesthesia and neuromuscular blockade until a definitive airway was secured. Apnoea time commenced at administration of neuromuscular blockade and ended with commencement of jet ventilation, positive-pressure ventilation or recommencement of spontaneous ventilation. During this time, upper airway patency was maintained with jaw-thrust. Transnasal Humidified Rapid-Insufflation Ventilatory Exchange (THRIVE) was used in 15 males and 10 females. Mean (SD [range]) age at treatment was 49 (15 [25-81]) years. The median (IQR [range]) Mallampati grade was 3 (2-3 [2-4]) and direct laryngoscopy grade was 3 (3-3 [2-4]). There were 12 obese patients and nine patients were stridulous. The median (IQR [range]) apnoea time was 14 (9-19 [5-65]) min. No patient experienced arterial desaturation. THRIVE combines the benefits of classical apnoeic oxygenation with continuous positive airway pressure and gaseous exchange through flow-dependent deadspace flushing. It has the potential to transform the practice of anaesthesia by changing the nature of securing a definitive airway in emergency and difficult intubations from a pressured stop-start process to a smooth and unhurried undertaking. © 2014 The Authors Anaesthesia published by John Wiley & Sons Ltd on behalf of Association of Anaesthetists of Great Britain and Ireland.

  6. Efficient Buffer Capacity and Scheduler Setting Computation for Soft Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Bekooij, Marco; Bekooij, Marco Jan Gerrit; Wiggers, M.H.; van Meerbergen, Jef

    2007-01-01

    Soft real-time applications that process data streams can often be intuitively described as dataflow process networks. In this paper we present a novel analysis technique to compute conservative estimates of the required buffer capacities in such process networks. With the same analysis technique, suitable scheduler settings can also be computed.

  7. Computational time analysis of the numerical solution of 3D electrostatic Poisson's equation

    Science.gov (United States)

    Kamboh, Shakeel Ahmed; Labadin, Jane; Rigit, Andrew Ragai Henri; Ling, Tech Chaw; Amur, Khuda Bux; Chaudhary, Muhammad Tayyab

    2015-05-01

    The 3D Poisson's equation is solved numerically to simulate the electric potential in a prototype design of an electrohydrodynamic (EHD) ion-drag micropump. The finite difference method (FDM) is employed to discretize the governing equation. The system of linear equations resulting from FDM is solved iteratively using the sequential Jacobi (SJ) and sequential Gauss-Seidel (SGS) methods, and the simulation results are compared to examine the difference between them. The main objective was to analyze the computational time required by both methods for different grid sizes and to parallelize the Jacobi method in order to reduce the computational time. In general, the SGS method is faster than the SJ method, but the data parallelism of the Jacobi method may produce a good speedup over the SGS method. In this study, the feasibility of using the parallel Jacobi (PJ) method is assessed relative to the SGS method. The MATLAB Parallel/Distributed computing environment is used and a parallel code for the SJ method is implemented. It was found that for small grid sizes the SGS method remains dominant over the SJ and PJ methods, while for large grid sizes both sequential methods may require excessive processing time to converge. The PJ method, however, reduces the computational time to some extent for large grid sizes.
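
    For concreteness, the sketch below shows a Jacobi sweep loop for the 3D Poisson equation on a uniform grid with Dirichlet boundaries, of the kind compared above. The grid size, source term, and tolerance are illustrative assumptions, and the code is a plain serial sketch rather than the authors' MATLAB parallel implementation.

```python
# Minimal serial sketch (not the authors' code) of Jacobi iteration for the
# 3D Poisson equation on a uniform grid with zero Dirichlet boundaries.
import numpy as np

n, h = 32, 1.0 / 31                     # grid points per axis and spacing (assumed)
phi = np.zeros((n, n, n))               # potential; boundary values stay at 0
rho = np.zeros((n, n, n))
rho[n // 2, n // 2, n // 2] = 1.0       # point-like source term (illustrative)

diff = np.inf
for it in range(5000):
    # 7-point stencil: new value = (sum of 6 neighbours + h^2 * source) / 6
    phi_new = phi.copy()
    phi_new[1:-1, 1:-1, 1:-1] = (
        phi[2:, 1:-1, 1:-1] + phi[:-2, 1:-1, 1:-1] +
        phi[1:-1, 2:, 1:-1] + phi[1:-1, :-2, 1:-1] +
        phi[1:-1, 1:-1, 2:] + phi[1:-1, 1:-1, :-2] +
        h * h * rho[1:-1, 1:-1, 1:-1]
    ) / 6.0
    diff = np.max(np.abs(phi_new - phi))   # size of the Jacobi update
    phi = phi_new
    if diff < 1e-8:                        # convergence check
        break

print(f"stopped after {it + 1} Jacobi sweeps, last update = {diff:.2e}")
```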

  8. Math modeling and computer mechanization for real time simulation of rotary-wing aircraft

    Science.gov (United States)

    Howe, R. M.

    1979-01-01

    Mathematical modeling and computer mechanization for real time simulation of rotary wing aircraft are discussed. Error analysis in the digital simulation of dynamic systems, such as rotary wing aircraft, is described. The method for digital simulation of nonlinearities with discontinuities, such as those that exist in typical flight control systems and rotor blade hinges, is discussed.

  9. Efficient Geo-Computational Algorithms for Constructing Space-Time Prisms in Road Networks

    Directory of Open Access Journals (Sweden)

    Hui-Ping Chen

    2016-11-01

    The space-time prism (STP) is a key concept in time geography for analyzing human activity-travel behavior under various space-time constraints. Most existing time-geographic studies use a straightforward algorithm to construct STPs in road networks by using two one-to-all shortest path searches. However, this straightforward algorithm can introduce considerable computational overhead, given the fact that accessible links in an STP are generally a small portion of the whole network. To address this issue, an efficient geo-computational algorithm, called NTP-A*, is proposed. The proposed NTP-A* algorithm employs the A* and branch-and-bound techniques to discard inaccessible links during the two shortest path searches, and thereby improves the STP construction performance. Comprehensive computational experiments are carried out to demonstrate the computational advantage of the proposed algorithm. Several implementation techniques, including the label-correcting technique and the hybrid link-node labeling technique, are discussed and analyzed. Experimental results show that the proposed NTP-A* algorithm can significantly improve STP construction performance in large-scale road networks by a factor of 100, compared with existing algorithms.
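
    The straightforward baseline mentioned above (two one-to-all shortest-path searches plus a per-node time-budget test) can be sketched as follows. The tiny network and travel times are illustrative assumptions; the proposed NTP-A* additionally prunes inaccessible links with A* and branch-and-bound, which is not shown here.

```python
# Minimal sketch of the baseline space-time prism construction: a forward
# one-to-all search from the origin, a backward one-to-all search to the
# destination, and a time-budget test per node. Network data are illustrative.
import heapq

def dijkstra(graph, source):
    """One-to-all shortest travel times from `source`."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# directed road network: node -> [(neighbour, travel time in minutes)]
graph = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}
reverse = {}
for u, edges in graph.items():
    for v, w in edges:
        reverse.setdefault(v, []).append((u, w))

origin, destination, budget = "A", "D", 12.0        # leave A, arrive at D within 12 min
from_origin = dijkstra(graph, origin)
to_destination = dijkstra(reverse, destination)     # reverse search: node -> destination times

prism_nodes = {
    n for n in graph
    if from_origin.get(n, float("inf")) + to_destination.get(n, float("inf")) <= budget
}
print("nodes inside the space-time prism:", sorted(prism_nodes))
```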

  10. Computer-generated versus nurse-determined strategy for incubator humidity and time to regain birthweight

    NARCIS (Netherlands)

    Helder, Onno K.; Mulder, Paul G. H.; van Goudoever, Johannes B.

    2008-01-01

    To compare the effects on premature infants' weight gain of a computer-generated and a nurse-determined incubator humidity strategy. An optimal humidity protocol is thought to reduce the time to regain birthweight. Prospective randomized controlled design in a level IIIC neonatal intensive care unit.

  11. Computing Camps for Girls : A First-Time Experience at the University of Limerick

    NARCIS (Netherlands)

    McInerney, Clare; Lamprecht, A.L.; Margaria, Tiziana

    2018-01-01

    Increasing the number of females in ICT-related university courses has been a major concern for several years. In 2015, we offered a girls-only computing summer camp for the first time, as a new component in our education and outreach activities to foster students' interest in our discipline.

  12. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Science.gov (United States)

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…

  13. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control and control logic for fuel handling systems. The most frequent cause of faults in safety critical real time computer systems is traced to fuzziness in the requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring safety are summarized.

  14. Green computing: power optimisation of VFI-based real-time multiprocessor dataflow applications (extended version)

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to a low power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor frequency) are used to realize these energy savings.

  15. Green computing: power optimisation of vfi-based real-time multiprocessor dataflow applications

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to a low power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor frequency) are used to realize these energy savings.

  16. Effect of exposure time reduction towards sensitivity and SNR for computed radiography (CR) application in NDT

    International Nuclear Information System (INIS)

    Sapizah Rahim; Khairul Anuar Mohd Salleh; Noorhazleena Azaman; Shaharudin Sayuti; Siti Madiha Muhammad Amir; Arshad Yassin; Abdul Razak Hamzah

    2010-01-01

    A signal-to-noise ratio (SNR) and sensitivity study of a Computed Radiography (CR) system with reduced exposure time is presented. The purposes of this research are to determine the behavior of the SNR for three different thicknesses (step wedge; 5, 10 and 15 mm) and the ability of the CR system to recognize a hole-type penetrameter when the exposure time is decreased by up to 80% relative to the exposure chart (D7; ISOVOLT Titan E). It is shown that the SNR decreases as the exposure time is reduced, but a high quality image is still achieved up to an 80% reduction in exposure time. (author)
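
    A minimal sketch of the SNR measure typically used in such studies (mean pixel value of a region of interest divided by its standard deviation, evaluated as exposure time is reduced) is given below. The simulated Poisson-noise detector model and all numbers are illustrative assumptions, not the study's measurements.

```python
# Illustrative sketch (not the study's software) of ROI-based SNR versus exposure time.
import numpy as np

rng = np.random.default_rng(3)

def simulated_roi(exposure_fraction, mean_full=4000.0, size=(64, 64)):
    """Detector ROI whose signal scales with exposure and whose noise is Poisson-like."""
    signal = mean_full * exposure_fraction
    return rng.poisson(signal, size=size).astype(float)

for fraction in (1.0, 0.8, 0.6, 0.4, 0.2):      # up to an 80% reduction in exposure time
    roi = simulated_roi(fraction)
    snr = roi.mean() / roi.std()                # SNR = mean signal / noise in the same ROI
    print(f"exposure at {fraction:4.0%} of nominal: SNR ~ {snr:5.1f}")
```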

  17. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    Science.gov (United States)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) which services background applications in the presence of time critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
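
    A simple simulation of the situation described above, assuming FIFO service of the background jobs and a CPU that is unavailable during a deterministic interrupt window at the start of every period, can look like the following sketch. The rates, period, and interrupt duration are illustrative assumptions rather than parameters from the paper.

```python
# Illustrative sketch (an assumption, not the paper's model): Poisson background
# arrivals, exponential service, FIFO order, and an interrupt of length D at the
# start of every period P during which no background work runs.
import random

random.seed(1)
LAM, MU = 0.5, 1.0      # background arrival and service rates (assumed)
P, D = 2.0, 0.4         # interrupt period and duration; time-critical load = D/P

def advance(t, work):
    """Earliest time the server can finish `work` units started at time t,
    working only outside the interrupt windows [k*P, k*P + D)."""
    while work > 1e-12:
        k = int(t // P)
        window_end = k * P + D
        if t < window_end:                     # inside an interrupt window: wait it out
            t = window_end
        next_window = (k + 1) * P
        slice_ = min(work, next_window - t)    # CPU time available before the next interrupt
        t += slice_
        work -= slice_
    return t

t_arrival, server_free, total_resp, n_jobs = 0.0, 0.0, 0.0, 100_000
for _ in range(n_jobs):
    t_arrival += random.expovariate(LAM)
    start = max(t_arrival, server_free)        # FIFO: wait for the previous job to finish
    server_free = advance(start, random.expovariate(MU))
    total_resp += server_free - t_arrival

print(f"mean background response time ~ {total_resp / n_jobs:.2f}")
```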

  18. Real-Time PCR Typing of Escherichia coli Based on Multiple Single Nucleotide Polymorphisms--a Convenient and Rapid Method.

    Science.gov (United States)

    Lager, Malin; Mernelius, Sara; Löfgren, Sture; Söderman, Jan

    2016-01-01

    Healthcare-associated infections caused by Escherichia coli and antibiotic resistance due to extended-spectrum beta-lactamase (ESBL) production constitute a threat against patient safety. To identify, track, and control outbreaks and to detect emerging virulent clones, typing tools of sufficient discriminatory power that generate reproducible and unambiguous data are needed. A probe-based real-time PCR method targeting multiple single nucleotide polymorphisms (SNPs) was developed. The method was based on the multilocus sequence typing scheme of the Institut Pasteur and on adaptation of previously described typing assays. An 8-SNP panel that reached a Simpson's diversity index of 0.95 was established, based on analysis of sporadic E. coli cases (ESBL n = 27 and non-ESBL n = 53). This multi-SNP assay was used to identify the sequence type 131 (ST131) complex according to Achtman's multilocus sequence typing scheme. However, it did not fully discriminate within the complex, but provided a diagnostic signature that outperformed a previously described detection assay. Pulsed-field gel electrophoresis typing of isolates from a presumed outbreak (n = 22) identified two outbreaks (ST127 and ST131) and three different non-outbreak-related isolates. Multi-SNP typing generated congruent data except for one non-outbreak-related ST131 isolate. We consider multi-SNP real-time PCR typing an accessible primary generic E. coli typing tool for rapid and uniform type identification.
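
    Simpson's diversity index, the discrimination measure quoted above, is commonly computed with the Hunter-Gaston formulation sketched below; the isolate type counts are hypothetical and only illustrate the calculation.

```python
# Minimal sketch of Simpson's diversity index for a typing scheme
# (Hunter-Gaston formulation); the type counts are hypothetical.
from collections import Counter

def simpson_diversity(type_labels):
    """D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)), summed over the types j."""
    counts = Counter(type_labels)
    n_total = sum(counts.values())
    return 1.0 - sum(n * (n - 1) for n in counts.values()) / (n_total * (n_total - 1))

isolate_types = ["ST131"] * 10 + ["ST127"] * 6 + ["ST95"] * 4 + ["ST73"] * 3 + ["ST69"] * 2
print(f"Simpson's diversity index = {simpson_diversity(isolate_types):.3f}")
```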

  19. Rapid and Accurate Identification by Real-Time PCR of Biotoxin-Producing Dinoflagellates from the Family Gymnodiniaceae

    Directory of Open Access Journals (Sweden)

    Kirsty F. Smith

    2014-03-01

    The identification of toxin-producing dinoflagellates for monitoring programmes and bio-compound discovery requires considerable taxonomic expertise. It can also be difficult to morphologically differentiate toxic and non-toxic species or strains. Various molecular methods have been used for dinoflagellate identification and detection, and this study describes the development of eight real-time polymerase chain reaction (PCR) assays targeting the large subunit ribosomal RNA (LSU rRNA) gene of species from the genera Gymnodinium, Karenia, Karlodinium, and Takayama. Assays proved to be highly specific and sensitive, and the assay for G. catenatum was further developed for quantification in response to a bloom in Manukau Harbour, New Zealand. The assay estimated cell densities from environmental samples as low as 0.07 cells per PCR reaction, which equated to three cells per litre. This assay not only enabled conclusive species identification but also detected the presence of cells below the limit of detection for light microscopy. This study demonstrates the usefulness of real-time PCR as a sensitive and rapid molecular technique for the detection and quantification of micro-algae from environmental samples.

  20. Rapid and accurate identification by real-time PCR of biotoxin-producing dinoflagellates from the family gymnodiniaceae.

    Science.gov (United States)

    Smith, Kirsty F; de Salas, Miguel; Adamson, Janet; Rhodes, Lesley L

    2014-03-07

    The identification of toxin-producing dinoflagellates for monitoring programmes and bio-compound discovery requires considerable taxonomic expertise. It can also be difficult to morphologically differentiate toxic and non-toxic species or strains. Various molecular methods have been used for dinoflagellate identification and detection, and this study describes the development of eight real-time polymerase chain reaction (PCR) assays targeting the large subunit ribosomal RNA (LSU rRNA) gene of species from the genera Gymnodinium, Karenia, Karlodinium, and Takayama. Assays proved to be highly specific and sensitive, and the assay for G. catenatum was further developed for quantification in response to a bloom in Manukau Harbour, New Zealand. The assay estimated cell densities from environmental samples as low as 0.07 cells per PCR reaction, which equated to three cells per litre. This assay not only enabled conclusive species identification but also detected the presence of cells below the limit of detection for light microscopy. This study demonstrates the usefulness of real-time PCR as a sensitive and rapid molecular technique for the detection and quantification of micro-algae from environmental samples.

  1. Soil Baiting, Rapid PCR Assay and Quantitative Real Time PCR to Diagnose Late Blight of Potato in Quarantine Programs

    Directory of Open Access Journals (Sweden)

    Touseef Hussain

    2018-05-01

    Phytophthora infestans (Mont.) de Bary is a pathogen of great concern across the globe, and accurate detection is an important component in responding to outbreaks of potential disease. Although the molecular diagnostic protocols used in regulatory programs have been evaluated, methods involving direct comparison have rarely been used to date. In this study, soil samples from a known area of potato fields where late blight appears every year (both A1 and A2 mating types) were assayed by the soil bait method and by PCR-based detection and quantification of the inoculum. Suspected disease symptoms that appeared on bait tubers were further confirmed by rapid PCR, and the inoculum was quantified by real-time PCR, confirming the presence of P. infestans. These diagnostic methods can be highly correlated with one another. Potato tuber baiting increased the sensitivity of the assay compared with direct extraction of DNA from tuber and soil samples. Our study determined the diagnostic sensitivity and specificity of the assays to establish the performance of each method. Overall, molecular techniques based on different types of PCR amplification and real-time PCR can lead to high-throughput, faster and more accurate detection methods which can be used in quarantine programmes in the potato industry and in diagnostic laboratories.

  2. A computer-based time study system for timber harvesting operations

    Science.gov (United States)

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on the MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  3. Cochrane Rapid Reviews Methods Group to play a leading role in guiding the production of informed high-quality, timely research evidence syntheses.

    Science.gov (United States)

    Garritty, Chantelle; Stevens, Adrienne; Gartlehner, Gerald; King, Valerie; Kamel, Chris

    2016-10-28

    Policymakers and healthcare stakeholders are increasingly seeking evidence to inform the policymaking process, and often use existing or commissioned systematic reviews to inform decisions. However, the methodologies that make systematic reviews authoritative take time, typically 1 to 2 years to complete. Outside the traditional systematic review timeline, "rapid reviews" have emerged as an efficient tool to get evidence to decision-makers more quickly. However, the use of rapid reviews does present challenges. To date, there has been limited published empirical information about this approach to compiling evidence. Thus, it remains a poorly understood and ill-defined set of diverse methodologies with various labels. In recent years, the need to further explore rapid review methods, characteristics, and their use has been recognized by a growing network of healthcare researchers, policymakers, and organizations, several with ties to Cochrane, which is recognized as representing an international gold standard for high-quality, systematic reviews. In this commentary, we introduce the newly established Cochrane Rapid Reviews Methods Group, developed to play a leading role in guiding the production of rapid reviews given that they are increasingly employed as a research synthesis tool to support timely evidence-informed decision-making. We discuss how the group was formed and outline the group's structure and remit. We also discuss the need to establish a more robust evidence base for rapid reviews in the published literature, and the importance of promoting registration of rapid review protocols in an effort to promote efficiency and transparency in research. As with standard systematic reviews, the core principles of evidence-based synthesis should apply to rapid reviews in order to minimize bias to the extent possible. The Cochrane Rapid Reviews Methods Group will serve to establish a network of rapid review stakeholders and provide a forum for discussion and training.

  4. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Directory of Open Access Journals (Sweden)

    Yeqing Zhang

    2018-02-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully.
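
    The acquisition metric described above (circular correlation over code phase and a detection ratio of the highest to the second-highest peak) can be sketched as follows for a single Doppler bin, here using an FFT-based correlation. The code length, noise level, and decision threshold are illustrative assumptions, not receiver parameters from the paper.

```python
# Illustrative sketch of code-phase acquisition for one Doppler bin:
# circular correlation via FFT and a highest/second-highest peak ratio test.
import numpy as np

rng = np.random.default_rng(0)
N = 2046                                   # samples per code period (assumed)
code = rng.choice([-1.0, 1.0], size=N)     # stand-in for a local PRN replica
true_shift = 517
received = np.roll(code, true_shift) + 0.8 * rng.standard_normal(N)

# circular correlation over all code phases at once: IFFT(FFT(x) * conj(FFT(code)))
corr = np.abs(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))))

peak_idx = int(np.argmax(corr))
sorted_peaks = np.sort(corr)[::-1]
ratio = sorted_peaks[0] / sorted_peaks[1]  # highest / second-highest correlation result

THRESHOLD = 1.8                            # assumed decision threshold
print(f"estimated code phase = {peak_idx}, peak ratio = {ratio:.2f}, "
      f"acquired = {ratio > THRESHOLD}")
```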

  5. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Science.gov (United States)

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301

  6. Computationally determining the salience of decision points for real-time wayfinding support

    Directory of Open Access Journals (Sweden)

    Makoto Takemiya

    2012-06-01

    This study introduces the concept of computational salience to explain the discriminatory efficacy of decision points, which in turn may have applications to providing real-time assistance to users of navigational aids. This research compared algorithms for calculating the computational salience of decision points and validated the results via three methods: high-salience decision points were used to classify wayfinders; salience scores were used to weight a conditional probabilistic scoring function for real-time wayfinder performance classification; and salience scores were correlated with wayfinding-performance metrics. As an exploratory step to linking computational and cognitive salience, a photograph-recognition experiment was conducted. Results reveal a distinction between algorithms useful for determining computational and cognitive saliences. For computational salience, information about the structural integration of decision points is effective, while information about the probability of decision-point traversal shows promise for determining cognitive salience. Limitations from only using structural information and motivations for future work that include non-structural information are elicited.

  7. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo; Abdelaziz, Ibrahim; Aldilaijan, Abdulla; Canini, Marco; Kalnis, Panos

    2017-01-01

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.

  8. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo

    2017-11-27

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.
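
    As a conceptual illustration of why such in-network aggregation helps (not the DAIET implementation, which runs on programmable data-plane hardware), the sketch below folds the workers' partial updates into a single element-wise sum, so the receiving server gets one vector instead of one per worker. The worker count and update sizes are illustrative assumptions.

```python
# Conceptual sketch of the in-network aggregation idea: the "switch" sums the
# workers' updates so only one aggregate travels to the server. Sizes are assumed.
import numpy as np

W, DIM, BYTES_PER_VALUE = 8, 1_000_000, 4        # workers, update length, float32 bytes

rng = np.random.default_rng(42)
worker_updates = [rng.standard_normal(DIM).astype(np.float32) for _ in range(W)]

# what the aggregation point would do per round: element-wise sum of the updates
aggregated = np.sum(worker_updates, axis=0)

bytes_without = W * DIM * BYTES_PER_VALUE        # every worker sends its update to the server
bytes_with = DIM * BYTES_PER_VALUE               # only the aggregate leaves the switch
print(f"traffic reduction toward the server: {1 - bytes_with / bytes_without:.1%}")

# the aggregate equals what a central server would have computed anyway
assert np.allclose(aggregated, np.add.reduce(worker_updates))
```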

  9. Real-time computer treatment of THz passive device images with the high image quality

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate a real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not designed only for passive THz devices: it can be applied to any such device and to active THz imaging systems as well. We applied our code to computer processing of images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that computer processing of images produced by different devices usually requires different spatial filters. The performance of the current version of the computer code is greater than one image per second for a THz image having more than 5000 pixels and 24-bit number representation. Processing of a single THz image produces about 20 images simultaneously, corresponding to various spatial filters. The computer code allows the number of pixels in processed images to be increased without noticeable reduction of image quality. The performance of the computer code can be increased many times by using parallel algorithms for processing the images. We developed original spatial filters which allow one to see objects with sizes less than 2 cm. The imagery is produced by passive THz imaging devices which captured images of objects hidden under opaque clothes. For images with high noise we developed an approach which suppresses the noise during computer processing and yields a good quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of a liquid explosive, an ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are a very promising solution for the security problem.

  10. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. Quantification of Artifact Reduction With Real-Time Cine Four-Dimensional Computed Tomography Acquisition Methods

    International Nuclear Information System (INIS)

    Langner, Ulrich W.; Keall, Paul J.

    2010-01-01

    Purpose: To quantify the magnitude and frequency of artifacts in simulated four-dimensional computed tomography (4D CT) images using three real-time acquisition methods (direction-dependent displacement acquisition, simultaneous displacement and phase acquisition, and simultaneous displacement and velocity acquisition) and to compare these methods with commonly used retrospective phase sorting. Methods and Materials: Image acquisition for the four 4D CT methods was simulated with different displacement and velocity tolerances for spheres with radii of 0.5 cm, 1.5 cm, and 2.5 cm, using 58 patient-measured tumors and respiratory motion traces. The magnitude and frequency of artifacts, CT doses, and acquisition times were computed for each method. Results: The mean artifact magnitude was 50% smaller for the three real-time methods than for retrospective phase sorting. The dose was ∼50% lower, but the acquisition time was 20% to 100% longer for the real-time methods than for retrospective phase sorting. Conclusions: Real-time acquisition methods can reduce the frequency and magnitude of artifacts in 4D CT images, as well as the imaging dose, but they increase the image acquisition time. The results suggest that direction-dependent displacement acquisition is the preferred real-time 4D CT acquisition method, because on average, the lowest dose is delivered to the patient and the acquisition time is the shortest for the resulting number and magnitude of artifacts.

  13. An Efficient Integer Coding and Computing Method for Multiscale Time Segment

    Directory of Open Access Journals (Sweden)

    TONG Xiaochong

    2016-12-01

    This article focuses on the existing problems and status of current time segment coding and proposes a new approach: multi-scale time segment integer coding (MTSIC). The approach utilizes the tree structure and the ordering by magnitude formed among integers, so that the codes reflect the relationships among multi-scale time segments (order, inclusion/containment, intersection, etc.) and finally achieve a unified integer coding process for multi-scale time. On this foundation, the research also studies computing methods for calculating the time relationships of MTSIC, to support efficient calculation and query based on time segments, and preliminarily discusses the application and prospects of MTSIC. Tests indicate that the implementation of MTSIC is convenient and reliable, that transformation between it and the traditional method is straightforward, and that it achieves very high efficiency in query and calculation.
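
    The paper's coding scheme itself is not reproduced here, but the idea of assigning integers to multi-scale time segments so that order and containment can be decided by integer arithmetic can be illustrated with a dyadic (halving) hierarchy and heap-style codes, as in the following sketch; the 2**level + index convention is an assumption made for illustration only.

```python
# Illustrative sketch (not the MTSIC scheme itself): dyadic time segments coded as
# integers 2**level + index, so containment and order reduce to integer arithmetic.

def code(level, index):
    return (1 << level) + index          # unique integer per segment

def level_of(c):
    return c.bit_length() - 1

def contains(a, b):
    """True if segment a contains segment b (a is an ancestor of b in the tree)."""
    shift = level_of(b) - level_of(a)
    return shift >= 0 and (b >> shift) == a

def precedes(a, b):
    """True if segment a lies entirely before segment b on the time axis."""
    la, lb = level_of(a), level_of(b)
    l = max(la, lb)                                # compare at the finer of the two scales
    start_a = (a - (1 << la)) << (l - la)          # first fine-scale slot of a
    end_a = start_a + (1 << (l - la))              # one past its last slot
    start_b = (b - (1 << lb)) << (l - lb)
    return end_a <= start_b

day = code(0, 0)            # whole span
morning = code(1, 0)        # first half
late_morning = code(2, 1)   # second quarter
afternoon = code(1, 1)      # second half
print(contains(day, late_morning), contains(morning, late_morning))    # True True
print(precedes(morning, afternoon), precedes(afternoon, late_morning)) # True False
```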

  14. Impact of rapid molecular diagnostic tests on time to treatment initiation and outcomes in patients with multidrug-resistant tuberculosis, Tamil Nadu, India.

    Science.gov (United States)

    Nair, Dina; Navneethapandian, Pooranaganga D; Tripathy, Jaya Prasad; Harries, Anthony D; Klinton, Joel S; Watson, Basilea; Sivaramakrishnan, Gomathi N; Reddy, Devarajulu S; Murali, Lakshmi; Natrajan, Mohan; Swaminathan, Soumya

    2016-09-01

    India is replacing culture and drug sensitivity testing (CDST) with rapid molecular tests for diagnosing MDR-TB. We assessed the impact of rapid tests on time to initiation of treatment and outcomes in patients with MDR-TB compared with CDST. A retrospective cohort study involving MDR-TB patients from six districts in Tamil Nadu state, who underwent CDST (2010-2011) and rapid tests (2012-2013). There were 135 patients in the CDST group and 389 in the rapid diagnostic test group. Median time from sputum receipt at the laboratory to initiation of MDR-TB treatment was 130 days (IQR 75-213) in the CDST group and 22 days (IQR 14-38) in the rapid diagnostic test group. Unfavourable treatment outcomes exceeded 30% in both groups, and missing data were higher in CDST (13%) compared with rapid tests (3%). There were significantly higher risks of unfavourable treatment outcomes in males (aRR 1.3, 95% CI 1.1-1.5) and those with treatment initiation delays >30 days (aRR 1.3, 95% CI 1.0-1.6). Rapid molecular diagnostic tests shortened the time to initiate treatment, which was associated with reduced unfavourable outcomes in MDR-TB patients. This supports the policy to scale up these tests in India. © The Author 2016. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. CNR considerations for rapid real-time MRI tumor tracking in radiotherapy hybrid devices: Effects of B0 field strength

    International Nuclear Information System (INIS)

    Wachowicz, K.; De Zanche, N.; Yip, E.; Volotovskyy, V.; Fallone, B. G.

    2016-01-01

    Purpose: This work examines the subject of contrast-to-noise ratio (CNR), specifically between tumor and tissue background, and its dependence on the MRI field strength, B0. This examination is motivated by the recent interest and developments in MRI/radiotherapy hybrids where real-time imaging can be used to guide treatment beams. The ability to distinguish a tumor from background tissue is of primary importance in this field, and this work seeks to elucidate the complex relationship between the CNR and B0 that is too often assumed to be purely linear. Methods: Experimentally based models of B0-dependent relaxation for various tumor and normal tissues from the literature were used in conjunction with signal equations for MR sequences suitable for rapid real-time imaging to develop field-dependent predictions for CNR. These CNR models were developed for liver, lung, breast, glioma, and kidney tumors for spoiled gradient-echo, balanced steady-state free precession (bSSFP), and single-shot half-Fourier fast spin echo sequences. Results: Due to the pattern in which the relaxation properties of tissues are found to vary with B0 field (specifically the T1 time), there was always an improved CNR at lower fields compared to a linear dependency. Further, in some tumor sites, the CNR at lower fields was found to be comparable to, or sometimes higher than, that at higher fields (i.e., bSSFP CNR for glioma, kidney, and liver tumors). Conclusions: In terms of CNR, lower B0 fields have been shown to perform as well as or better than higher fields for some tumor sites due to superior T1 contrast. In other sites this effect was less pronounced, reversing the CNR advantage. This complex relationship between CNR and B0 reveals both low and high magnetic fields as viable options for tumor tracking in MRI/radiotherapy hybrids.
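
    The kind of calculation behind this argument can be sketched with the standard spoiled gradient-echo steady-state signal equation and an assumed linear scaling of signal amplitude with B0, as below. All relaxation times, sequence parameters, and the linear-B0 assumption are illustrative, not the paper's measured tissue models.

```python
# Illustrative sketch using the standard spoiled gradient-echo steady-state signal
#   S = M0 * sin(a) * (1 - E1) / (1 - cos(a) * E1) * exp(-TE/T2*),  E1 = exp(-TR/T1),
# with an assumed linear signal-versus-B0 scaling. All tissue values are hypothetical.
import math

def spgr_signal(t1, t2s, tr=0.004, te=0.002, flip_deg=10.0, m0=1.0):
    e1 = math.exp(-tr / t1)
    a = math.radians(flip_deg)
    return m0 * math.sin(a) * (1 - e1) / (1 - math.cos(a) * e1) * math.exp(-te / t2s)

# hypothetical (T1, T2*) pairs in seconds for tumour and background at each field
tissue_models = {
    0.5: {"tumour": (0.60, 0.060), "background": (0.45, 0.045)},
    3.0: {"tumour": (1.40, 0.045), "background": (1.10, 0.035)},
}

for b0, tissues in tissue_models.items():
    s_tum = spgr_signal(*tissues["tumour"])
    s_bkg = spgr_signal(*tissues["background"])
    cnr = b0 * abs(s_tum - s_bkg)      # assumed linear SNR-with-field scaling
    print(f"B0 = {b0:.1f} T: relative tumour-background CNR ~ {cnr:.4f}")
```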

  16. Time-of-night variations in the story-like organization of dream experience developed during rapid eye movement sleep.

    Science.gov (United States)

    Cipolli, Carlo; Guazzelli, Mario; Bellucci, Claudia; Mazzetti, Michela; Palagini, Laura; Rosenlicht, Nicholas; Feinberg, Irwin

    2015-04-01

    This study aimed to investigate time-of-night (2nd versus 4th cycle) and duration-related (5 versus 10 min) variations in the story-like organization of the dream experience elaborated during rapid eye movement (REM) sleep. Dream reports were analysed using story grammar rules. Reports were provided by those subjects (14 of 22) capable of reporting a dream after each of the four awakenings provoked over 2 consecutive nights during REM sleep of the 2nd and 4th cycles, after periods of either 5 or 10 min, counterbalanced across the nights. Two researchers who were blind to the sleep condition scored the dream reports independently. The indicators of report length (measured as total word count) and of story-like organization of dream reports were compared taking time-of-night (2nd and 4th cycles) and REM duration (5 versus 10 min) as factors. Two-way analyses of variance showed that report length increased significantly in 4th-cycle REM sleep and nearly significantly for longer REM duration, whereas the number of dream-stories per report did not vary. The indices of sequential (number of statements describing the event structure developed in the story) and hierarchical (number of episodes per story) organization increased significantly only in dream-stories reported after 10 min of 4th-cycle REM sleep. These findings indicate that the characteristics of the structural organization of dream-stories vary with time of night, and suggest that the elaboration of a long and complex dream-story requires a fairly long time and the availability of a large amount of cognitive resources to maintain its continuity and coherence. © 2014 European Sleep Research Society.

  17. PRO-QUEST: a rapid assessment method based on progressive saturation for quantifying exchange rates using saturation times in CEST.

    Science.gov (United States)

    Demetriou, Eleni; Tachrount, Mohamed; Zaiss, Moritz; Shmueli, Karin; Golay, Xavier

    2018-03-05

    To develop a new MRI technique to rapidly measure exchange rates in CEST MRI. A novel pulse sequence for measuring chemical exchange rates through a progressive saturation recovery process, called PRO-QUEST (progressive saturation for quantifying exchange rates using saturation times), has been developed. Using this method, the water magnetization is sampled under non-steady-state conditions, and off-resonance saturation is interleaved with the acquisition of images obtained through a Look-Locker type of acquisition. A complete theoretical framework has been set up, and simple equations to obtain the exchange rates have been derived. A reduction of scan time from 58 to 16 minutes has been obtained using PRO-QUEST versus the standard QUEST. Maps of both the T1 of water and B1 can simply be obtained by repetition of the sequence without off-resonance saturation pulses. Simulations and calculated exchange rates from experimental data using amino acids such as glutamate, glutamine, taurine, and alanine were compared and found to be in good agreement. The PRO-QUEST sequence was also applied on healthy and infarcted rats after 24 hours, and revealed that imaging specificity to ischemic acidification during stroke was substantially increased relative to standard amide proton transfer-weighted imaging. Because of the reduced scan time and insensitivity to nonchemical exchange factors such as direct water saturation, PRO-QUEST can serve as an excellent alternative for researchers and clinicians interested in mapping pH changes in vivo. © 2018 International Society for Magnetic Resonance in Medicine.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  19. An assessment of the real-time application capabilities of the SIFT computer system

    Science.gov (United States)

    Butler, R. W.

    1982-01-01

    The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed static stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research, it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.

  20. Theory and computation of disturbance invariant sets for discrete-time linear systems

    Directory of Open Access Journals (Sweden)

    Kolmanovsky Ilya

    1998-01-01

    Full Text Available This paper considers the characterization and computation of invariant sets for discrete-time, time-invariant, linear systems with disturbance inputs whose values are confined to a specified compact set but are otherwise unknown. The emphasis is on determining maximal disturbance-invariant sets X that belong to a specified subset Γ of the state space. Such d-invariant sets have important applications in control problems where there are pointwise-in-time state constraints of the form x(t) ∈ Γ. One purpose of the paper is to unite and extend in a rigorous way disparate results from the prior literature. In addition there are entirely new results. Specific contributions include: exploitation of the Pontryagin set difference to clarify conceptual matters and simplify mathematical developments, special properties of maximal invariant sets and conditions for their finite determination, algorithms for generating concrete representations of maximal invariant sets, practical computational questions, extension of the main results to general Lyapunov stable systems, applications of the computational techniques to the bounding of state and output response. Results on Lyapunov stable systems are applied to the implementation of a logic-based, nonlinear multimode regulator. For plants with disturbance inputs and state-control constraints it enlarges the constraint-admissible domain of attraction. Numerical examples illustrate the various theoretical and computational results.
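
    To make the backward-recursion idea concrete, here is a minimal one-dimensional sketch (a scalar system with an interval constraint and a bounded disturbance, assumed for illustration; the paper itself treats general polyhedral sets via the Pontryagin difference):

```python
# Minimal 1-D illustration (assumed scalar system, not the paper's general polytope
# algorithms): maximal disturbance-invariant interval inside |x| <= gamma for
# x(t+1) = a*x(t) + w(t), |w(t)| <= w_max.
def maximal_invariant_interval(a, w_max, gamma, tol=1e-12, max_iter=1000):
    bound = gamma                      # Omega_0 = Gamma = [-gamma, gamma]
    for _ in range(max_iter):
        # x must stay in Gamma and map into Omega_k for every admissible disturbance:
        # |a*x| + w_max <= bound  =>  |x| <= (bound - w_max)/|a|
        new_bound = min(gamma, (bound - w_max) / abs(a))
        if new_bound < 0:
            return None                # no disturbance-invariant subset of Gamma exists
        if abs(new_bound - bound) < tol:
            return new_bound           # finitely determined / converged
        bound = new_bound
    return bound

print(maximal_invariant_interval(a=0.5, w_max=0.2, gamma=1.0))  # Gamma itself is invariant
print(maximal_invariant_interval(a=0.9, w_max=0.2, gamma=1.0))  # None: nothing invariant fits in Gamma
```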

  1. Modifications of ORNL's computer programs MSF-21 and VTE-21 for the evaluation and rapid optimization of multistage flash and vertical tube evaporators

    Energy Technology Data Exchange (ETDEWEB)

    Glueckstern, P.; Wilson, J.V.; Reed, S.A.

    1976-06-01

    Design and cost modifications were made to ORNL's Computer Programs MSF-21 and VTE-21 originally developed for the rapid calculation and design optimization of multistage flash (MSF) and multieffect vertical tube evaporator (VTE) desalination plants. The modifications include additional design options to make possible the evaluation of desalting plants based on current technology (the original programs were based on conceptual designs applying advanced and not yet proven technological developments and design features) and new materials and equipment costs updated to mid-1975.

  2. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are mature and widely applied in engineering. To enable high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. First, the main module is designed by analyzing the common processing steps and numerical calculations. Then, to improve generality, an iteration control module is planned for iterative algorithms. In addition, to reduce computational cost and memory requirements, optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and the complex-number calculations. Finally, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The results show that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm generality, hardware realizability and high efficiency.
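
    As a software reference for the kind of processing such an architecture accelerates (not the authors' hardware design), the following sketch performs frequency-domain image restoration with a Wiener filter built from the 2-D FFT/IFFT operations named above:

```python
# Minimal software sketch (not the authors' hardware design) of frequency-domain
# image restoration: Wiener deconvolution built from 2-D FFT/IFFT blocks.
import numpy as np

def wiener_deconvolve(blurred, psf, noise_to_signal=1e-2):
    """Restore an image given its point-spread function using a Wiener filter."""
    psf_padded = np.zeros_like(blurred, dtype=float)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf
    H = np.fft.fft2(psf_padded)                           # frequency response of the blur
    G = np.fft.fft2(blurred)                              # observed image spectrum
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)   # Wiener filter
    return np.real(np.fft.ifft2(W * G))

# Toy usage: blur a random "image" with a 3x3 box PSF, then restore it.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
psf = np.ones((3, 3)) / 9.0
psf_padded = np.zeros_like(image)
psf_padded[:3, :3] = psf
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf_padded)))
restored = wiener_deconvolve(blurred, psf, noise_to_signal=1e-4)
print(np.abs(restored - image).mean())   # small residual error
```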

  3. Joint Time-Frequency-Space Classification of EEG in a Brain-Computer Interface Application

    Directory of Open Access Journals (Sweden)

    Molina Gary N Garcia

    2003-01-01

    Full Text Available Brain-computer interfacing is a growing field of interest in human-computer interaction, with diverse applications ranging from medicine to entertainment. In this paper, we present a system which allows for classification of mental tasks based on a joint time-frequency-space decorrelation, in which mental tasks are measured via electroencephalogram (EEG) signals. The efficiency of this approach was evaluated by means of real-time experiments with two subjects performing three different mental tasks. To do so, a number of protocols for visualization, as well as for training with and without feedback, were also developed. The results obtained show that it is possible to achieve good classification of simple mental tasks, in view of command and control, after a relatively small amount of training, with accuracies around 80%, and in real time.

  4. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    Science.gov (United States)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation is essentially an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and testing of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  5. Real time computer control of a nonlinear Multivariable System via Linearization and Stability Analysis

    International Nuclear Information System (INIS)

    Raza, K.S.M.

    2004-01-01

    This paper demonstrates that if a complicated nonlinear, non-square, state-coupled multivariable system is smartly linearized and subjected to a thorough stability analysis, then we can achieve our design objectives via a controller which is quite simple (in terms of resource usage and execution time) and very efficient (in terms of robustness). A further aim is to implement this controller via computer in a real-time environment. Therefore, a nonlinear mathematical model of the system is first derived. An intelligent effort is made to decouple the multivariable system. Linearization and stability analysis techniques are employed for the development of a linearized and mathematically sound control law. Nonlinearities, such as saturation in the actuators, are also catered for. The controller is then discretized using Runge-Kutta integration. Finally, the discretized control law is programmed on a computer in a real-time environment. The program is written in RT-Linux using GNU C for the real-time realization of the control scheme. The real-time processes, such as sampling and controlled actuation, and the non-real-time processes, such as the graphical user interface and display, are programmed as different tasks. The issue of inter-process communication between real-time and non-real-time tasks is addressed quite carefully. The results of this research pursuit are presented graphically. (author)

  6. Rapid Identification of Intact Staphylococcal Bacteriophages Using Matrix-Assisted Laser Desorption Ionization-Time-of-Flight Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Dana Štveráková

    2018-04-01

    Full Text Available Staphylococcus aureus is a major causative agent of infections associated with hospital environments, where antibiotic-resistant strains have emerged as a significant threat. Phage therapy could offer a safe and effective alternative to antibiotics. Phage preparations should comply with quality and safety requirements; therefore, it is important to develop efficient production control technologies. This study was conducted to develop and evaluate a rapid and reliable method for identifying staphylococcal bacteriophages, based on detecting their specific proteins using matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) profiling, which is among the methods suggested for meeting the regulations of pharmaceutical authorities. Five different phage purification techniques were tested in combination with two MALDI-TOF MS matrices. Phages, either purified by CsCl density gradient centrifugation or as resuspended phage pellets, yielded mass spectra with the highest information value if ferulic acid was used as the MALDI matrix. Phage tail and capsid proteins yielded the strongest signals, whereas the culture conditions had no effect on mass spectral quality. Thirty-seven phages from the Myoviridae, Siphoviridae or Podoviridae families were analysed, including 23 siphophages belonging to the International Typing Set for human strains of S. aureus, as well as phages in preparations produced by Microgen, Bohemia Pharmaceuticals and MB Pharma. The data obtained demonstrate that MALDI-TOF MS can be used to effectively distinguish between Staphylococcus-specific bacteriophages.

  7. Standardisation and evaluation of a quantitative multiplex real-time PCR assay for the rapid identification of Streptococcus pneumoniae

    Directory of Open Access Journals (Sweden)

    Feroze Ahmed Ganaie

    2015-01-01

    Full Text Available Rapid diagnosis of Streptococcus pneumoniae can play a significant role in decreasing the morbidity and mortality of infection. The accurate diagnosis of pneumococcal disease is hampered by the difficulties in growing the isolates from clinical specimens and also by misidentification. Molecular methods have gained popularity as they offer improvement in the detection of causative pathogens with speed and ease. The present study aims at validating and standardising the use of 4 oligonucleotide primer-probe sets (pneumolysin [ply], autolysin [lytA], pneumococcal surface adhesion A [psaA] and Spn9802 [DNA fragment]) in a single-reaction mixture for the detection and discrimination of S. pneumoniae. Here, we validate a quantitative multiplex real-time PCR (qmPCR) assay with a panel consisting of 43 S. pneumoniae and 29 non-pneumococcal isolates, 20 culture-positive, 26 culture-negative and 30 spiked serum samples. A standard curve was obtained using the S. pneumoniae ATCC 49619 strain, and the glyceraldehyde 3-phosphate dehydrogenase (GAPDH) gene was used as an endogenous internal control. The experiment showed high sensitivity, with a lower limit of detection equivalent to 4 genome copies/µl. The efficiency of the reaction was 100% for ply, lytA and Spn9802 and 97% for psaA. The test showed sensitivity and specificity of 100% with culture isolates and serum specimens. This study demonstrates that qmPCR analysis of sera using 4 oligonucleotide primers appears to be an appropriate method for the genotypic identification of S. pneumoniae infection.
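
    For readers unfamiliar with how the reported 97-100% amplification efficiencies are derived from a standard curve, here is a small sketch of the standard arithmetic (the Ct values below are invented, not data from this study):

```python
# Sketch of the standard-curve arithmetic behind reported qPCR efficiencies
# (illustrative Ct values, not data from the study).
import numpy as np

log10_copies = np.array([6, 5, 4, 3, 2, 1], dtype=float)   # serial 10-fold dilutions
ct = np.array([15.1, 18.5, 21.8, 25.2, 28.5, 31.9])        # hypothetical Ct values

slope, intercept = np.polyfit(log10_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                     # 1.0 corresponds to 100% efficiency
print(f"slope = {slope:.2f}, efficiency = {efficiency * 100:.1f}%")

# Back-calculate copy number for an unknown sample from its Ct:
ct_unknown = 23.0
copies = 10 ** ((ct_unknown - intercept) / slope)
print(f"estimated copies per reaction ~ {copies:.0f}")
```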

  8. Copyright and Computer Generated Materials – Is it Time to Reboot the Discussion About Authorship?

    Directory of Open Access Journals (Sweden)

    Anne Fitzgerald

    2013-12-01

    Full Text Available Computer generated materials are ubiquitous and we encounter them on a daily basis, even though most people are unaware that this is the case. Blockbuster movies, television weather reports and telephone directories all include material that is produced by utilising computer technologies. Copyright protection for materials generated by a programmed computer was considered by the Federal Court and Full Court of the Federal Court in Telstra Corporation Limited v Phone Directories Company Pty Ltd. The court held that the White and Yellow pages telephone directories produced by Telstra and its subsidiary, Sensis, were not protected by copyright because they were computer-generated works which lacked the requisite human authorship. The Copyright Act 1968 (Cth) does not contain specific provisions on the subsistence of copyright in computer-generated materials. Although the issue of copyright protection for computer-generated materials has been examined in Australia on two separate occasions by independently-constituted Copyright Law Review Committees over a period of 10 years (1988 to 1998), the Committees' recommendations for legislative clarification by the enactment of specific amendments to the Copyright Act have not yet been implemented and the legal position remains unclear. In the light of the decision of the Full Federal Court in Telstra v Phone Directories it is timely to consider whether specific provisions should be enacted to clarify the position of computer-generated works under copyright law and, in particular, whether the requirement of human authorship for original works protected under Part III of the Copyright Act should now be reconceptualised to align with the realities of how copyright materials are created in the digital era.

  9. Rapid prototyping of SoC-based real-time vision system: application to image preprocessing and face detection

    Science.gov (United States)

    Jridi, Maher; Alfalou, Ayman

    2017-05-01

    The major goal of this paper is to investigate the Multi-CPU/FPGA SoC (System on Chip) design flow and to transfer know-how and skills for rapidly designing embedded real-time vision systems. Our aim is to show how the use of these devices can benefit system-level integration, since they make simultaneous hardware and software development possible. We take facial detection and preprocessing as a case study, since they have great potential to be used in several applications such as video surveillance, building access control and criminal identification. The designed system uses the Xilinx Zedboard platform, which is the central element of the developed vision system. Video acquisition is performed using either a standard webcam connected to the Zedboard via a USB interface or several IP camera devices. Visualization of the video content and intermediate results is possible with an HDMI interface connected to an HD display. The treatments embedded in the system are as follows: (i) pre-processing such as edge detection, implemented on the ARM and in the reconfigurable logic; (ii) software implementation of motion detection and face detection using either Viola-Jones or LBP (Local Binary Pattern); and (iii) an application layer to select the processing application and to display results in a web page. One uniquely interesting feature of the proposed system is that two functions have been developed to transmit data from and to the VDMA port. With the proposed optimization, the hardware implementation of the Sobel filter takes 27 ms and 76 ms for 640x480 and 720p resolutions, respectively. Hence, with the FPGA implementation, an acceleration of 5 times is obtained, which allows the processing of 37 fps and 13 fps for 640x480 and 720p resolutions, respectively.
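
    As a plain software reference for the edge-detection stage mentioned above (not the FPGA implementation, and with no claim about its timing), a 3x3 Sobel gradient-magnitude filter can be written as:

```python
# Software reference for the Sobel edge-detection stage (not the FPGA design).
import numpy as np

def sobel_magnitude(gray):
    """Gradient magnitude of a 2-D grayscale image using 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                                 # vertical-gradient kernel
    h, w = gray.shape
    out = np.zeros((h, w))
    padded = np.pad(gray.astype(float), 1, mode="edge")
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 3, x:x + 3]
            gx = np.sum(window * kx)
            gy = np.sum(window * ky)
            out[y, x] = np.hypot(gx, gy)
    return out

frame = np.random.default_rng(1).random((120, 160))   # small stand-in frame
edges = sobel_magnitude(frame)
print(edges.shape, edges.max())
```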

  10. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry for rapid identification of fungal rhinosinusitis pathogens.

    Science.gov (United States)

    Huang, Yanfei; Wang, Jinglin; Zhang, Mingxin; Zhu, Min; Wang, Mei; Sun, Yufeng; Gu, Haitong; Cao, Jingjing; Li, Xue; Zhang, Shaoya; Lu, Xinxin

    2017-03-01

    Filamentous fungi are among the most important pathogens causing fungal rhinosinusitis (FRS). Current laboratory diagnosis of FRS pathogens mainly relies on phenotypic identification by culture and microscopic examination, which is time consuming and expertise dependent. Although matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS has been employed to identify various fungi, its efficacy in the identification of FRS fungi is less clear. A total of 153 FRS isolates obtained from patients were analysed at the Clinical Laboratory of the Beijing Tongren Hospital, affiliated to the Capital Medical University, between January 2014 and December 2015. They were identified by traditional phenotypic methods and by Bruker MALDI-TOF MS (Bruker Biotyper version 3.1), respectively. Discrepancies between the two methods were further validated by sequencing. Among the 153 isolates, 151 had correct species identification using MALDI-TOF MS (Bruker Biotyper version 3.1, score ≥2.0 or 2.3). MALDI-TOF MS enabled identification of some very closely related species that were indistinguishable by conventional phenotypic methods, including 1/10 Aspergillus versicolor, 3/20 Aspergillus flavus, 2/30 Aspergillus fumigatus and 1/20 Aspergillus terreus, which were misidentified by conventional phenotypic methods as Aspergillus nidulans, Aspergillus oryzae, Aspergillus japonicus and Aspergillus nidulans, respectively. In addition, 2/2 Rhizopus oryzae and 1/1 Rhizopus stolonifer that were identified only to the genus level by the phenotypic method were correctly identified by MALDI-TOF MS. MALDI-TOF MS is a rapid and accurate technique, and could replace the conventional phenotypic method for routine identification of FRS fungi in clinical microbiology laboratories.

  11. Control bandwidth improvements in GRAVITY fringe tracker by switching to a synchronous real time computer architecture

    Science.gov (United States)

    Abuter, Roberto; Dembet, Roderick; Lacour, Sylvestre; di Lieto, Nicola; Woillez, Julien; Eisenhauer, Frank; Fedou, Pierre; Phan Duc, Than

    2016-08-01

    The new VLTI (Very Large Telescope Interferometer) instrument GRAVITY is equipped with a fringe tracker able to stabilize the K-band fringes on six baselines at the same time. It has been designed to achieve, for average seeing conditions, a residual OPD (Optical Path Difference) lower than 300 nm with objects brighter than K = 10. The control loop implementing the tracking is composed of a four-stage real-time computer system comprising: a sensor, where the detector pixels are read in and the OPD and GD (Group Delay) are calculated; a controller, receiving the computed sensor quantities and producing commands for the piezo actuators; a concentrator, which combines the OPD commands with the real-time tip/tilt corrections and offloads them to the piezo actuator; and finally a Kalman parameter estimator. This last stage is used to monitor current measurements over a window of a few seconds and estimate new values for the main parameters of the Kalman control loop. The hardware and software implementation of this design runs asynchronously, and the four computers communicate for data transfer via the Reflective Memory Network. With the purpose of improving the performance of the GRAVITY fringe-tracking control loop, a deviation from the standard asynchronous communication mechanism has been proposed and implemented. This new scheme operates the four independent real-time computers involved in the tracking loop synchronously, using the Reflective Memory Interrupts as the coordination signal. This synchronous mechanism had the effect of reducing the total pure delay of the loop from 3.5 ms to 2.0 ms, which then translates into better stabilization of the fringes as the bandwidth of the system is substantially improved. This paper explains in detail the real-time architecture of the fringe tracker in both its asynchronous and synchronous implementations. The achieved improvements on reducing the delay via this mechanism will be

  12. Flood Foresight: A near-real time flood monitoring and forecasting tool for rapid and predictive flood impact assessment

    Science.gov (United States)

    Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles

    2017-04-01

    The hours and days immediately after a major flood event are often chaotic and confusing, with first responders rushing to mobilise emergency responders, provide alleviation assistance and assess loss to assets of interest (e.g., population, buildings or utilities). Preparations in advance of a forthcoming event are becoming increasingly important; early warning systems have been demonstrated to be useful tools for decision makers. The extent of damage, human casualties and economic loss estimates can vary greatly during an event, and the timely availability of an accurate flood extent allows emergency response and resources to be optimised, reduces impacts, and helps prioritise recovery. In the insurance sector, for example, insurers are under pressure to respond in a proactive manner to claims rather than waiting for policyholders to report losses. Even though there is a great demand for flood inundation extent and severity information in different sectors, generating flood footprints for large areas from hydraulic models in real time remains a challenge. While such footprints can be produced in real time using remote sensing, weather conditions and sensor availability limit their ability to capture every single flood event across the globe. In this session, we will present Flood Foresight (www.floodforesight.com), an operational tool developed to meet the universal requirement for rapid geographic information, before, during and after major riverine flood events. The tool provides spatial data with which users can measure their current or predicted impact from an event - at building, basin, national or continental scales. Within Flood Foresight, the Screening component uses global rainfall predictions to provide a regional- to continental-scale view of heavy rainfall events up to a week in advance, alerting the user to potentially hazardous situations relevant to them. The Forecasting component enhances the predictive suite of tools by providing a local

  13. A novel photoinduced electron transfer (PET) primer technique for rapid real-time PCR detection of Cryptosporidium spp

    International Nuclear Information System (INIS)

    Jothikumar, N.; Hill, Vincent R.

    2013-01-01

    Highlights: •Uses a single-labeled fluorescent primer for real-time PCR. •The detection sensitivity of PET PCR was comparable to TaqMan PCR. •Melt curve analysis can be performed to confirm target amplicon production. •Conventional PCR primers can be converted to PET PCR primers. -- Abstract: We report the development of a fluorescently labeled oligonucleotide primer that can be used to monitor real-time PCR. The primer has two parts: the 3′-end of the primer is complementary to the target, and a universal 17-mer stem loop at the 5′-end forms a hairpin structure. A fluorescent dye is attached to the 5′-end of either the forward or reverse primer. The presence of guanosine residues at the first and second positions of the 3′ dangling end effectively quenches the fluorescence due to the photoinduced electron transfer (PET) mechanism. During the synthesis of nucleic acid, the hairpin structure is linearized and the fluorescence of the incorporated primer increases several-fold due to release of the fluorescently labeled tail and the absence of guanosine quenching. As amplicons are synthesized during nucleic acid amplification, the fluorescence increase in the reaction mixture can be measured with commercially available real-time PCR instruments. In addition, a melting procedure can be performed to denature the double-stranded amplicons, thereby generating fluorescence peaks that can differentiate primer dimers and other non-specific amplicons if formed during the reaction. We demonstrated the application of PET-PCR for the rapid detection and quantification of Cryptosporidium parvum DNA. Comparison with a previously published TaqMan® assay demonstrated that the two real-time PCR assays exhibited similar sensitivity for a dynamic range of detection of 6000–0.6 oocysts per reaction. PET PCR primers are simple to design and less expensive than dual-labeled probe PCR methods, and should be of interest for use by laboratories operating in resource

  14. A novel photoinduced electron transfer (PET) primer technique for rapid real-time PCR detection of Cryptosporidium spp

    Energy Technology Data Exchange (ETDEWEB)

    Jothikumar, N., E-mail: jin2@cdc.gov; Hill, Vincent R.

    2013-06-28

    Highlights: •Uses a single-labeled fluorescent primer for real-time PCR. •The detection sensitivity of PET PCR was comparable to TaqMan PCR. •Melt curve analysis can be performed to confirm target amplicon production. •Conventional PCR primers can be converted to PET PCR primers. -- Abstract: We report the development of a fluorescently labeled oligonucleotide primer that can be used to monitor real-time PCR. The primer has two parts: the 3′-end of the primer is complementary to the target, and a universal 17-mer stem loop at the 5′-end forms a hairpin structure. A fluorescent dye is attached to the 5′-end of either the forward or reverse primer. The presence of guanosine residues at the first and second positions of the 3′ dangling end effectively quenches the fluorescence due to the photoinduced electron transfer (PET) mechanism. During the synthesis of nucleic acid, the hairpin structure is linearized and the fluorescence of the incorporated primer increases several-fold due to release of the fluorescently labeled tail and the absence of guanosine quenching. As amplicons are synthesized during nucleic acid amplification, the fluorescence increase in the reaction mixture can be measured with commercially available real-time PCR instruments. In addition, a melting procedure can be performed to denature the double-stranded amplicons, thereby generating fluorescence peaks that can differentiate primer dimers and other non-specific amplicons if formed during the reaction. We demonstrated the application of PET-PCR for the rapid detection and quantification of Cryptosporidium parvum DNA. Comparison with a previously published TaqMan® assay demonstrated that the two real-time PCR assays exhibited similar sensitivity for a dynamic range of detection of 6000–0.6 oocysts per reaction. PET PCR primers are simple to design and less expensive than dual-labeled probe PCR methods, and should be of interest for use by laboratories operating in resource

  15. Rapid, Real-time Methane Detection in Ground Water Using a New Gas-Water Equilibrator Design

    Science.gov (United States)

    Ruybal, C. J.; DiGiulio, D. C.; Wilkin, R. T.; Hargrove, K. D.; McCray, J. E.

    2014-12-01

    Recent increases in unconventional gas development have been accompanied by public concern for methane contamination in drinking water wells near production areas. Although not a regulated pollutant, methane may be a marker contaminant for others that are less mobile in groundwater and thus may be detected later, or at a location closer to the source. In addition, methane poses an explosion hazard if exsolved concentrations reach 5 - 15% volume in air. Methods for determining dissolved gases, such as methane, have evolved over 60 years. However, the response time of these methods is insufficient to monitor trends in methane concentration in real-time. To enable rapid, real-time monitoring of aqueous methane concentrations during ground water purging, a new gas-water equilibrator (GWE) was designed that increases gas-water mass exchange rates of methane for measurement. Monitoring of concentration trends allows a comparison of temporal trends between sampling events and comparison of baseline conditions with potential post-impact conditions. These trends may be a result of removal of stored casing water, pre-purge ambient borehole flow, formation physical and chemical heterogeneity, or flow outside of well casing due to inadequate seals. Real-time information in the field can help focus an investigation, aid in determining when to collect a sample, save money by limiting costs (e.g. analytical, sample transport and storage), and provide an immediate assessment of local methane concentrations. Four domestic water wells, one municipal water well, and one agricultural water well were sampled for traditional laboratory analysis and compared to the field GWE results. Aqueous concentrations measured on the GWE ranged from non-detect to 1,470 μg/L methane. Some trends in aqueous methane concentrations measured on the GWE were observed during purging. Applying a paired t-test comparing the new GWE method and traditional laboratory analysis yielded a p-value 0
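
    The comparison described above is a paired test between two measurement methods on the same wells; a minimal sketch of that calculation (with invented concentrations, not the study's data) looks like:

```python
# Sketch of the paired comparison described (hypothetical concentrations in ug/L;
# the study's actual values are not reproduced here).
from scipy import stats

gwe = [5.0, 120.0, 640.0, 980.0, 1470.0, 0.0]          # field equilibrator results
lab = [4.2, 131.0, 655.0, 1010.0, 1390.0, 0.0]         # traditional laboratory results

t_stat, p_value = stats.ttest_rel(gwe, lab)             # paired (related-samples) t-test
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```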

  16. Characterization of a Reconfigurable Free-Space Optical Channel for Embedded Computer Applications with Experimental Validation Using Rapid Prototyping Technology

    Directory of Open Access Journals (Sweden)

    Rafael Gil-Otero

    2007-02-01

    Full Text Available Free-space optical interconnects (FSOIs) are widely seen as a potential solution to current and future bandwidth bottlenecks for parallel processors. In this paper, an FSOI system called the optical highway (OH) is proposed. The OH uses polarizing beam splitter-liquid crystal plate (PBS/LC) assemblies to perform reconfigurable beam combination functions. The properties of the OH make it suitable for embedding complex network topologies such as a completely connected mesh or hypercube. This paper proposes the use of rapid prototyping technology for implementing an optomechanical system suitable for studying the reconfigurable characteristics of a free-space optical channel. Additionally, it reports how the limited contrast ratio of the optical components can affect the attenuation of the optical signal and the crosstalk caused by misdirected signals. Different techniques are also proposed in order to increase the optical modulation amplitude (OMA) of the system.

  17. Characterization of a Reconfigurable Free-Space Optical Channel for Embedded Computer Applications with Experimental Validation Using Rapid Prototyping Technology

    Directory of Open Access Journals (Sweden)

    Lim Theodore

    2007-01-01

    Full Text Available Free-space optical interconnects (FSOIs) are widely seen as a potential solution to current and future bandwidth bottlenecks for parallel processors. In this paper, an FSOI system called the optical highway (OH) is proposed. The OH uses polarizing beam splitter-liquid crystal plate (PBS/LC) assemblies to perform reconfigurable beam combination functions. The properties of the OH make it suitable for embedding complex network topologies such as a completely connected mesh or hypercube. This paper proposes the use of rapid prototyping technology for implementing an optomechanical system suitable for studying the reconfigurable characteristics of a free-space optical channel. Additionally, it reports how the limited contrast ratio of the optical components can affect the attenuation of the optical signal and the crosstalk caused by misdirected signals. Different techniques are also proposed in order to increase the optical modulation amplitude (OMA) of the system.

  18. Rapid communication: Computational simulation and analysis of a candidate for the design of a novel silk-based biopolymer.

    Science.gov (United States)

    Golas, Ewa I; Czaplewski, Cezary

    2014-09-01

    This work theoretically investigates the mechanical properties of a novel silk-derived biopolymer as polymerized in silico from sericin and elastin-like monomers. Molecular Dynamics simulations and Steered Molecular Dynamics were the principal computational methods used, the latter of which applies an external force onto the system and thereby enables an observation of its response to stress. The models explored herein are single-molecule approximations, and primarily serve as tools in a rational design process for the preliminary assessment of properties in a new material candidate. © 2014 Wiley Periodicals, Inc.

  19. Rapid computation of single PET scan rest-stress myocardial blood flow parametric images by table look up.

    Science.gov (United States)

    Guehl, Nicolas J; Normandin, Marc D; Wooten, Dustin W; Rozen, Guy; Ruskin, Jeremy N; Shoup, Timothy M; Woo, Jonghye; Ptaszek, Leon M; Fakhri, Georges El; Alpert, Nathaniel M

    2017-09-01

    We have recently reported a method for measuring rest-stress myocardial blood flow (MBF) using a single, relatively short, PET scan session. The method requires two IV tracer injections, one to initiate rest imaging and one at peak stress. We previously validated absolute flow quantitation in mL/min/cc for standard bull's eye, segmental analysis. In this work, we extend the method for fast computation of rest-stress MBF parametric images. We provide an analytic solution to the single-scan rest-stress flow model, which is then solved using a two-dimensional table lookup method (LM). Simulations were performed to compare the accuracy and precision of the lookup method with the original nonlinear method (NLM). Then the method was applied to 16 single-scan rest/stress measurements made in 12 pigs: seven studied after infarction of the left anterior descending artery (LAD) territory, and nine imaged in the native state. Parametric maps of rest and stress MBF as well as maps of left (fLV) and right (fRV) ventricular spill-over fractions were generated. Regions of interest (ROIs) for 17 myocardial segments were defined in bull's eye fashion on the parametric maps. The mean of each ROI was then compared to the rest (K1r) and stress (K1s) MBF estimates obtained from fitting the 17 regional TACs with the NLM. In simulation, the LM performed as well as the NLM in terms of precision and accuracy. The simulation did not show that bias was introduced by the use of a predefined two-dimensional lookup table. In experimental data, parametric maps demonstrated good statistical quality and the LM was computationally much more efficient than the original NLM. Very good agreement was obtained between the mean MBF calculated on the parametric maps for each of the 17 ROIs and the regional MBF values estimated by the NLM (K1map,LM = 1.019 × K1ROI,NLM + 0.019, R² = 0.986; mean difference = 0.034 ± 0.036 mL/min/cc). We developed a table lookup method for fast
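
    The core idea of such a lookup inversion can be sketched generically as follows; the forward model below is a placeholder, not the paper's single-scan rest/stress kinetic model with spill-over terms:

```python
# Generic sketch of a 2-D table lookup inversion (the paper's actual flow model
# and spill-over terms are not reproduced here).
import numpy as np

def forward_model(k1r, k1s):
    """Placeholder forward model mapping (rest, stress) flow to two measured features."""
    return np.stack([k1r + 0.3 * k1s, 0.2 * k1r + k1s], axis=-1)

# Precompute the lookup table once over a grid of candidate parameters.
k1r_grid, k1s_grid = np.meshgrid(np.linspace(0.2, 2.5, 200),
                                 np.linspace(0.2, 4.0, 200), indexing="ij")
table = forward_model(k1r_grid, k1s_grid)                  # shape (200, 200, 2)

def lookup(measured):
    """Invert a measurement by nearest-neighbour search in the precomputed table."""
    d2 = np.sum((table - measured) ** 2, axis=-1)
    i, j = np.unravel_index(np.argmin(d2), d2.shape)
    return k1r_grid[i, j], k1s_grid[i, j]

truth = (0.9, 2.1)
print(lookup(forward_model(*truth)))    # recovers approximately (0.9, 2.1) per voxel
```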

  20. Computer-aided detection (CAD) of lung nodules in CT scans: radiologist performance and reading time with incremental CAD assistance

    International Nuclear Information System (INIS)

    Roos, Justus E.; Paik, David; Olsen, David; Liu, Emily G.; Leung, Ann N.; Mindelzun, Robert; Choudhury, Kingshuk R.; Napel, Sandy; Rubin, Geoffrey D.; Chow, Lawrence C.; Naidich, David P.

    2010-01-01

    The diagnostic performance of radiologists using incremental CAD assistance for lung nodule detection on CT, and the temporal variation in their performance during CAD evaluation, were assessed. CAD was applied to 20 chest multidetector-row computed tomography (MDCT) scans containing 190 non-calcified ≥3-mm nodules. After free search, three radiologists independently evaluated a maximum of up to 50 CAD detections/patient. Multiple free-response ROC curves were generated for free search and successive CAD evaluation, by incrementally adding CAD detections one at a time to the radiologists' performance. The sensitivity for free search was 53% (range, 44%-59%) at 1.15 false positives (FP)/patient and increased with CAD to 69% (range, 59-82%) at 1.45 FP/patient. CAD evaluation initially resulted in a sharp rise in sensitivity of 14% with a minimal increase in FP over a time period of 100 s, followed by a flattening of the sensitivity increase to only 2%. This transition resulted from a greater prevalence of true positive (TP) versus FP detections early in CAD evaluation and not from a temporal change in the readers' performance. The time spent on TP (9.5 s ± 4.5 s) and false negative (FN) (8.4 s ± 6.7 s) detections was similar; FP decisions took two to three times longer (14.4 s ± 8.7 s) than true negative (TN) decisions (4.7 s ± 1.3 s). When CAD output is ordered by CAD score, an initial period of rapid performance improvement slows significantly over time because of non-uniformity in the distribution of TP CAD output, not because of a change in reader performance over time. (orig.)
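
    To illustrate why ordering CAD marks by score produces an early steep gain that later flattens, the following sketch tallies sensitivity and FP/patient as synthetic marks (invented scores, not the study's detections) are added one at a time in descending score order:

```python
# Synthetic illustration (not the study data): sensitivity and FP/patient as CAD
# marks are added one at a time in descending CAD-score order.
import numpy as np

rng = np.random.default_rng(42)
n_patients, n_nodules = 20, 190
# Each CAD mark: (score, is_true_positive); true positives tend to score higher.
tp_scores = rng.normal(2.0, 1.0, size=130)
fp_scores = rng.normal(0.0, 1.0, size=400)
marks = [(s, True) for s in tp_scores] + [(s, False) for s in fp_scores]
marks.sort(key=lambda m: m[0], reverse=True)

found, false_pos = 0, 0
for k, (score, is_tp) in enumerate(marks, start=1):
    found += is_tp
    false_pos += not is_tp
    if k % 100 == 0:
        print(f"after {k} marks: sensitivity = {found / n_nodules:.2f}, "
              f"FP/patient = {false_pos / n_patients:.2f}")
```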

  1. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
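
    For reference, the trace distance used here as the figure of merit between the real and ideal logical states is the standard quantity (textbook definition, not quoted from the paper):

```latex
D(\rho,\sigma) \;=\; \tfrac{1}{2}\,\lVert \rho-\sigma \rVert_{1}
             \;=\; \tfrac{1}{2}\,\operatorname{Tr}\sqrt{(\rho-\sigma)^{\dagger}(\rho-\sigma)} .
```

    On this reading, the available computation time is the latest time at which this distance between the real and ideal logical states remains below the user-specified threshold.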

  2. Real-time computing in environmental monitoring of a nuclear power plant

    International Nuclear Information System (INIS)

    Deme, S.; Lang, E.; Nagy, Gy.

    1987-06-01

    A real-time computing method is described for calculating the environmental radiation exposure due to a nuclear power plant, both during normal operation and in case of an accident. The effects of the Gaussian plume are recalculated every ten minutes based on meteorological parameters measured at heights of 20 and 120 m as well as on emission data. During normal operation the quantity of radioactive materials released through the stacks is measured and registered while, during an accident, the source strength is unknown and the calculated relative data are normalized to the values measured at the eight environmental monitoring stations. The doses due to noble gases and to dry and wet deposition, as well as the time integral of the 131I concentration, are calculated and stored by a professional personal computer for 720 points in the environment within an 11 km radius. (author)
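
    The dispersion step described above is the classic Gaussian plume calculation; a textbook sketch of the ground-level concentration formula (with crude, assumed dispersion coefficients rather than the plant's operational parameterization) is:

```python
# Textbook Gaussian plume sketch (assumed dispersion coefficients, not the plant's
# operational model): ground-level air concentration downwind of a stack.
import numpy as np

def plume_concentration(q, u, y, stack_height, sigma_y, sigma_z):
    """Ground-level concentration (Bq/m^3) for release rate q (Bq/s) and wind speed u (m/s)."""
    lateral = np.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = 2 * np.exp(-stack_height ** 2 / (2 * sigma_z ** 2))   # includes ground reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical 10-minute conditions: 1e6 Bq/s release from a 120 m stack, 3 m/s wind.
x = 2000.0                                   # metres downwind
sigma_y, sigma_z = 0.08 * x, 0.06 * x        # crude neutral-stability coefficients (assumed)
print(plume_concentration(q=1e6, u=3.0, y=0.0,
                          stack_height=120.0, sigma_y=sigma_y, sigma_z=sigma_z))
```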

  3. Rapid and efficient radiosynthesis of [123I]I-PK11195, a single photon emission computed tomography tracer for peripheral benzodiazepine receptors

    Energy Technology Data Exchange (ETDEWEB)

    Pimlott, Sally L. [Department of Clinical Physics, West of Scotland Radionuclide Dispensary, Western Infirmary, G11 6NT Glasgow (United Kingdom)], E-mail: s.pimlott@clinmed.gla.ac.uk; Stevenson, Louise [Department of Chemistry, WestCHEM, University of Glasgow, G12 8QQ Glasgow (United Kingdom); Wyper, David J. [Institute of Neurological Sciences, Southern General Hospital, G51 4TF Glasgow (United Kingdom); Sutherland, Andrew [Department of Chemistry, WestCHEM, University of Glasgow, G12 8QQ Glasgow (United Kingdom)

    2008-07-15

    Introduction: [123I]I-PK11195 is a high-affinity single photon emission computed tomography radiotracer for peripheral benzodiazepine receptors that has previously been used to measure activated microglia and to assess neuroinflammation in the living human brain. This study investigates the radiosynthesis of [123I]I-PK11195 in order to develop a rapid and efficient method that obtains [123I]I-PK11195 with a high specific activity for in vivo animal and human imaging studies. Methods: The synthesis of [123I]I-PK11195 was evaluated using a solid-state interhalogen exchange method and an electrophilic iododestannylation method, where bromine and trimethylstannyl derivatives were used as precursors, respectively. In the electrophilic iododestannylation method, the oxidants peracetic acid and chloramine-T were both investigated. Results: Electrophilic iododestannylation produced [123I]I-PK11195 with a higher isolated radiochemical yield and a higher specific activity than achievable using the halogen exchange method investigated. Using chloramine-T as oxidant provided a rapid and efficient method of choice for the synthesis of [123I]I-PK11195. Conclusions: [123I]I-PK11195 has been successfully synthesized via a rapid and efficient electrophilic iododestannylation method, producing [123I]I-PK11195 with a higher isolated radiochemical yield and a higher specific activity than previously achieved.

  4. Rapid and efficient radiosynthesis of [123I]I-PK11195, a single photon emission computed tomography tracer for peripheral benzodiazepine receptors

    International Nuclear Information System (INIS)

    Pimlott, Sally L.; Stevenson, Louise; Wyper, David J.; Sutherland, Andrew

    2008-01-01

    Introduction: [123I]I-PK11195 is a high-affinity single photon emission computed tomography radiotracer for peripheral benzodiazepine receptors that has previously been used to measure activated microglia and to assess neuroinflammation in the living human brain. This study investigates the radiosynthesis of [123I]I-PK11195 in order to develop a rapid and efficient method that obtains [123I]I-PK11195 with a high specific activity for in vivo animal and human imaging studies. Methods: The synthesis of [123I]I-PK11195 was evaluated using a solid-state interhalogen exchange method and an electrophilic iododestannylation method, where bromine and trimethylstannyl derivatives were used as precursors, respectively. In the electrophilic iododestannylation method, the oxidants peracetic acid and chloramine-T were both investigated. Results: Electrophilic iododestannylation produced [123I]I-PK11195 with a higher isolated radiochemical yield and a higher specific activity than achievable using the halogen exchange method investigated. Using chloramine-T as oxidant provided a rapid and efficient method of choice for the synthesis of [123I]I-PK11195. Conclusions: [123I]I-PK11195 has been successfully synthesized via a rapid and efficient electrophilic iododestannylation method, producing [123I]I-PK11195 with a higher isolated radiochemical yield and a higher specific activity than previously achieved.

  5. The Educator's Approach to Media Training and Computer Games within Leisure Time of School-children

    OpenAIRE

    MORAVCOVÁ, Dagmar

    2009-01-01

    The paper describes possible ways of approaching computer game playing as part of the leisure time of school-children and deals with the significance of media training in leisure time. It first specifies the concept of leisure time and its functions, then shows some positive and negative effects of the media. It further describes classical computer games, the problem of excessive computer game playing and means of prevention. The paper deals with the educator's personality and the importance of ...

  6. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

    Full Text Available The short-time linear canonical transform (STLCT), which maps a time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications are still limited by the fact that the selection of the coefficients of the short-time linear canonical series (STLCS) is not unique, because the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual-window solution. First, the non-orthogonality problem suffered by the original window is resolved by an orthogonality condition with a dual window. Then, based on the obtained condition, a dual-window computation approach for the GT is extended to the STLCS. In addition, simulations verify the validity of the proposed condition and solutions. Furthermore, some possible directions of application are discussed.
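
    For orientation, a naive direct-quadrature discretization of a short-time LCT (windowed segments transformed with the standard three-parameter LCT kernel) can be written as below; this is only a brute-force illustration and does not implement the paper's dual-window coefficient construction:

```python
# Naive direct-sum discretization of a short-time linear canonical transform
# (standard LCT kernel applied to windowed segments); illustration only.
import numpy as np

def stlct(signal, t, u, window, centers, a, b, d):
    """Short-time LCT with parameters (a, b, ., d), b != 0, by direct quadrature."""
    dt = t[1] - t[0]
    amp = 1.0 / np.sqrt(2j * np.pi * b)
    out = np.zeros((len(centers), len(u)), dtype=complex)
    for i, tc in enumerate(centers):
        seg = signal * window(t - tc)                       # windowed segment
        for k, uk in enumerate(u):
            kernel = amp * np.exp(1j * (a * t ** 2 - 2 * uk * t + d * uk ** 2) / (2 * b))
            out[i, k] = np.sum(seg * kernel) * dt
    return out

t = np.linspace(0, 1, 512, endpoint=False)
x = np.cos(2 * np.pi * 40 * t ** 2)                          # chirp test signal
gauss = lambda tau: np.exp(-(tau / 0.05) ** 2)               # analysis window
U = np.linspace(-200, 200, 128)
tfr = stlct(x, t, U, gauss, centers=np.linspace(0.1, 0.9, 9), a=1.0, b=1.0, d=1.0)
print(tfr.shape)   # (9, 128) joint time-"frequency" representation
```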

  7. Memristive Computational Architecture of an Echo State Network for Real-Time Speech Emotion Recognition

    Science.gov (United States)

    2015-05-28

    Speech-based emotion recognition is simpler and requires less computational resources compared to other inputs such as facial expressions. The Berlin database of emotional speech is used for evaluation.

  8. Computational model for real-time determination of tritium inventory in a detritiation installation

    International Nuclear Information System (INIS)

    Bornea, Anisia; Stefanescu, Ioan; Zamfirache, Marius; Stefan, Iuliana; Sofalca, Nicolae; Bidica, Nicolae

    2008-01-01

    Full text: At ICIT Rm.Valcea an experimental pilot plant was built with the main objective of developing a technology for the detritiation of heavy water processed in the CANDU-type reactors of the nuclear power plant at Cernavoda, Romania. Since the aspects related to safeguards and safety are of great importance for such a detritiation installation, a complex computational model has been developed. The model allows real-time calculation of the tritium inventory in a working installation. The detritiation technology applied is catalyzed isotopic exchange coupled with cryogenic distillation. Computational models for non-steady working conditions have been developed for each isotopic exchange process. By coupling these processes, the tritium inventory can be determined in real time. The computational model was developed based on the experience gained with the pilot installation. The model uses a set of parameters specific to the isotopic exchange processes. These parameters were experimentally determined in the pilot installation. The model is included in the monitoring system and uses as input data the parameters acquired in real time from the automation system of the pilot installation. A friendly interface has been created to visualize the final results as data or graphs. (authors)

  9. A real-time computational model for estimating kinematics of ankle ligaments.

    Science.gov (United States)

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Sheng Quan

    2016-01-01

    An accurate assessment of ankle ligament kinematics is crucial to understanding injury mechanisms and can help to improve the treatment of an injured ankle, especially when used in conjunction with robot-assisted therapy. A number of computational models have been developed and validated for assessing the kinematics of ankle ligaments. However, few of them can perform real-time assessment to allow for an input into robotic rehabilitation programs. An ankle computational model was proposed and validated to quantify the kinematics of ankle ligaments as the foot moves in real time. This model consists of three bone segments with three rotational degrees of freedom (DOFs) and 12 ankle ligaments. The model takes as inputs three position variables that can be measured from sensors in many ankle robotic devices that detect postures within the foot-ankle environment, and outputs the kinematics of the ankle ligaments. Validation of this model in terms of ligament length and strain was conducted by comparing it with published data on cadaver anatomy and magnetic resonance imaging. The ligament lengths and strains predicted by the model are in agreement with those from the published studies but are sensitive to the ligament attachment positions. This ankle computational model has the potential to be used in robot-assisted therapy for real-time assessment of ligament kinematics. The results provide information regarding the quantification of kinematics associated with ankle ligaments related to the disability level and can be used for optimizing the robotic training trajectory.
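
    The core geometric computation in such a model is the change in distance between ligament attachment points under a rotation of the moving bone segment; a minimal sketch with made-up attachment coordinates (not the validated anatomy or the full three-segment model) is:

```python
# Core kinematic idea only (made-up attachment coordinates, not the validated
# anatomy): ligament length and strain after a 3-DOF ankle rotation.
import numpy as np

def rotation_xyz(rx, ry, rz):
    """Rotation matrix composed from rotations about x, y and z (radians)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Hypothetical attachment points (metres) in the neutral position.
origin_on_tibia = np.array([0.010, 0.020, 0.000])       # fixed segment
insertion_on_talus = np.array([0.015, -0.015, -0.030])  # moving segment
rest_length = np.linalg.norm(insertion_on_talus - origin_on_tibia)

R = rotation_xyz(np.deg2rad(15), np.deg2rad(5), 0.0)    # e.g. 15 deg dorsiflexion, 5 deg eversion
new_length = np.linalg.norm(R @ insertion_on_talus - origin_on_tibia)
strain = (new_length - rest_length) / rest_length
print(f"length {new_length * 1000:.1f} mm, strain {strain * 100:.1f}%")
```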

  10. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time and non-polynomial-time reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.
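
    For context on what such formulas compute, here is the brute-force definition of all-terminal edge reliability by enumerating edge subsets; this exponential baseline is what recursive evaluation via Gilbert's formula and invariant-based grouping is meant to avoid, and the code is not Gilbert's recursion itself:

```python
# Brute-force all-terminal reliability by edge-subset enumeration -- the exponential
# baseline that invariant-based evaluation of Gilbert's formula aims to avoid.
from itertools import combinations

def connected(n, edges):
    """Union-find connectivity check on vertices 0..n-1."""
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, v in edges:
        parent[find(u)] = find(v)
    return len({find(v) for v in range(n)}) == 1

def all_terminal_reliability(n, edges, p):
    """Probability that the graph stays connected when each edge survives with prob. p."""
    total = 0.0
    for k in range(len(edges) + 1):
        for surviving in combinations(edges, k):
            if connected(n, surviving):
                total += p ** k * (1 - p) ** (len(edges) - k)
    return total

# 4-cycle with one chord, each edge up with probability 0.9.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(all_terminal_reliability(4, edges, p=0.9))
```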

  11. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Anthony B., E-mail: acosta@northwestern.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Green, Jason R., E-mail: jason.green@umb.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Department of Chemistry, University of Massachusetts Boston, Boston, MA 02125 (United States)

    2013-08-01

    Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N^2 (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using Scalapack. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.
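
    The Gram-Schmidt/QR procedure itself is compact; the sketch below applies it to the two-dimensional Hénon map as a small stand-in for the much larger tangent-space matrices of the Lennard-Jones systems treated in the paper:

```python
# Small, single-node sketch of the Gram-Schmidt/QR procedure for Lyapunov exponents
# (Henon map standing in for the Lennard-Jones dynamics the paper actually treats).
import numpy as np

def henon_step(x, y, a=1.4, b=0.3):
    return 1 - a * x * x + y, b * x

def henon_jacobian(x, y, a=1.4, b=0.3):
    return np.array([[-2 * a * x, 1.0],
                     [b,          0.0]])

def lyapunov_spectrum(n_steps=100000):
    x, y = 0.1, 0.1
    Q = np.eye(2)                      # orthonormal Gram-Schmidt vectors
    sums = np.zeros(2)
    for _ in range(n_steps):
        J = henon_jacobian(x, y)
        x, y = henon_step(x, y)
        Q, R = np.linalg.qr(J @ Q)     # re-orthonormalize the tangent vectors
        sums += np.log(np.abs(np.diag(R)))
    return sums / n_steps

print(lyapunov_spectrum())             # roughly (+0.42, -1.62) for the Henon map
```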

  12. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    International Nuclear Information System (INIS)

    Costa, Anthony B.; Green, Jason R.

    2013-01-01

    Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N² (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using ScaLAPACK. The second uses the newly released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To the best of our knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.
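
    Independent of the ScaLAPACK and MAGMA implementations described above, the underlying Gram–Schmidt/QR reorthonormalization procedure (Benettin-style) can be illustrated on a toy system. The sketch below is a standard textbook construction, not the authors' code; the Hénon map and its parameters are chosen only for illustration.

```python
import numpy as np

def henon_lyapunov(a=1.4, b=0.3, n_steps=100_000):
    """Lyapunov spectrum of the Henon map by repeated QR reorthonormalization of
    the tangent vectors; a tiny stand-in for the large molecular-dynamics systems
    treated in the paper."""
    x, y = 0.1, 0.1
    Q = np.eye(2)                      # orthonormal set of tangent vectors
    log_r = np.zeros(2)
    for _ in range(n_steps):
        J = np.array([[-2.0 * a * x, 1.0],
                      [b,            0.0]])   # Jacobian of the map at (x, y)
        x, y = 1.0 - a * x * x + y, b * x
        Q, R = np.linalg.qr(J @ Q)     # propagate tangent vectors, re-orthonormalize
        log_r += np.log(np.abs(np.diag(R)))
    return log_r / n_steps

print(henon_lyapunov())                # roughly [ 0.42, -1.62 ] for these parameters
```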

  13. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural networks (ANN) and fuzzy logic approaches have proven to be efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining both these approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, and is illustrated by an application to model the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, and peak flow estimation. It was observed that the ANFIS model preserves the potential of the ANN approach fully, and eases the model-building process.
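
    For readers unfamiliar with the ANFIS structure, the sketch below implements a minimal first-order Sugeno fuzzy inference system with fixed Gaussian membership functions and least-squares estimation of the consequent parameters only. A real ANFIS additionally tunes the premise parameters (e.g., by backpropagation), and the toy lagged-flow data here are invented purely for illustration.

```python
import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def anfis_fit(X, y, n_mf=2):
    """Minimal first-order Sugeno fuzzy system: grid partition of n_mf Gaussian
    membership functions per input (fixed premise parameters), rule firing strengths
    as products of memberships, linear consequents fitted by least squares."""
    n, d = X.shape
    centres = [np.linspace(X[:, j].min(), X[:, j].max(), n_mf) for j in range(d)]
    sigmas = [(X[:, j].max() - X[:, j].min()) / (2.0 * (n_mf - 1)) + 1e-9 for j in range(d)]
    rules = np.array(np.meshgrid(*[range(n_mf)] * d)).T.reshape(-1, d)

    def firing(Xq):
        W = np.ones((Xq.shape[0], len(rules)))
        for k, rule in enumerate(rules):
            for j, m in enumerate(rule):
                W[:, k] *= gauss(Xq[:, j], centres[j][m], sigmas[j])
        return W / W.sum(axis=1, keepdims=True)        # normalized firing strengths

    def design(Xq):
        Wq = firing(Xq)
        Xa = np.hstack([Xq, np.ones((Xq.shape[0], 1))])  # inputs + bias per rule
        return np.hstack([Wq[:, [k]] * Xa for k in range(len(rules))])

    theta, *_ = np.linalg.lstsq(design(X), y, rcond=None)  # consequent parameters
    return lambda Xq: design(Xq) @ theta

# Toy daily-flow example: predict flow(t) from two lagged flows (hypothetical data)
flow = np.sin(np.linspace(0, 20, 300)) ** 2 * 50 + 10
X = np.column_stack([flow[:-2], flow[1:-1]])             # lags t-2, t-1
y = flow[2:]
model = anfis_fit(X, y)
print("calibration RMSE:", np.sqrt(np.mean((model(X) - y) ** 2)))
```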

  14. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1990-01-01

    At the first workshop in 1985 we reported on the real-time dose computing method used at the Paks Nuclear Power Plant and on the telemetric system developed for the normalization of the computed data. At present, the computing method normalized for the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. In this connection we analyzed the reliability of the results obtained in this manner. The points of the analysis were: how the results are influenced by the choice of certain parameters that cannot be determined by direct methods, and how improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (¹³¹I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. A further source of errors may be that, when determining the level of gamma radiation, the radionuclide doses in the cloud and on the ground surface are measured together by the environmental monitoring stations, whereas these doses appear separately in the computations. At the Paks NPP it is the time integral of the airborne activity concentration of vapour-form ¹³¹I which is determined. This quantity includes neither the other physical and chemical forms of ¹³¹I nor the other isotopes of radioiodine. We gave numerical examples for the uncertainties due to the above factors. As a result, we concluded that, when accident-related measures must be decided on the basis of the computing method, the dose uncertainties may reach one order of magnitude for points lying far from the monitoring stations. Different measures are discussed to make the uncertainties significantly lower.

  15. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine the reaction time of human subjects processing information presented in the visual channel under both a direct vision and a virtual rehabilitation environment while walking. The visual stimuli consisted of eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment (CAREN)) and a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and their reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program in a virtual training environment. Implications for rehabilitation: Eye tracking is a reliable tool that can be employed in virtual rehabilitation environments. Reaction time differs between direct vision and virtual environments.

  16. Real-time DNA barcoding in a rainforest using nanopore sequencing: opportunities for rapid biodiversity assessments and local capacity building.

    Science.gov (United States)

    Pomerantz, Aaron; Peñafiel, Nicolás; Arteaga, Alejandro; Bustamante, Lucas; Pichardo, Frank; Coloma, Luis A; Barrio-Amorós, César L; Salazar-Valenzuela, David; Prost, Stefan

    2018-04-01

    Advancements in portable scientific instruments provide promising avenues to expedite field work in order to understand the diverse array of organisms that inhabit our planet. Here, we tested the feasibility for in situ molecular analyses of endemic fauna using a portable laboratory fitting within a single backpack in one of the world's most imperiled biodiversity hotspots, the Ecuadorian Chocó rainforest. We used portable equipment, including the MinION nanopore sequencer (Oxford Nanopore Technologies) and the miniPCR (miniPCR), to perform DNA extraction, polymerase chain reaction amplification, and real-time DNA barcoding of reptile specimens in the field. We demonstrate that nanopore sequencing can be implemented in a remote tropical forest to quickly and accurately identify species using DNA barcoding, as we generated consensus sequences for species resolution with an accuracy of >99% in less than 24 hours after collecting specimens. The flexibility of our mobile laboratory further allowed us to generate sequence information at the Universidad Tecnológica Indoamérica in Quito for rare, endangered, and undescribed species. This includes the recently rediscovered Jambato toad, which was thought to be extinct for 28 years. Sequences generated on the MinION required as few as 30 reads to achieve high accuracy relative to Sanger sequencing, and with further multiplexing of samples, nanopore sequencing can become a cost-effective approach for rapid and portable DNA barcoding. Overall, we establish how mobile laboratories and nanopore sequencing can help to accelerate species identification in remote areas to aid in conservation efforts and be applied to research facilities in developing countries. This opens up possibilities for biodiversity studies by promoting local research capacity building, teaching nonspecialists and students about the environment, tackling wildlife crime, and promoting conservation via research-focused ecotourism.

  17. Computer simulation of the time evolution of a quenched model alloy in the nucleation region

    International Nuclear Information System (INIS)

    Marro, J.; Lebowitz, J.L.; Kalos, M.H.

    1979-01-01

    The time evolution of the structure function and of the cluster (or grain) distribution following quenching in a model binary alloy with a small concentration of minority atoms is obtained from computer simulations. The structure function S̄(k,t) obeys a simple scaling relation, S̄(k,t) = K⁻³F(k/K) with K(t) ∝ t^(−a), a ≈ 0.25, during the latter and larger part of the evolution. During the same period, the mean cluster size grows approximately linearly with time.

  18. Manual cross check of computed dose times for motorised wedged fields

    International Nuclear Information System (INIS)

    Porte, J.

    2001-01-01

    If a mass of tissue equivalent material is exposed in turn to wedged and open radiation fields of the same size, for equal times, it is incorrect to assume that the resultant isodose pattern will be effectively that of a wedge having half the angle of the wedged field. Computer programs have been written to address the problem of creating an intermediate wedge field, commonly known as a motorized wedge. The total exposure time is apportioned between the open and wedged fields, to produce a beam modification equivalent to that of a wedged field of a given wedge angle. (author)
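
    The record does not give the apportionment rule, so the sketch below uses the commonly quoted tangent-weighting approximation tan(θ_eff) ≈ w·tan(θ_wedge) as an assumption (w being the fraction of dose delivered through the wedged field); it also ignores the wedge transmission factor, which a real manual cross check of computed times must include.

```python
import math

def motorized_wedge_split(total_time, target_angle_deg, wedge_angle_deg=60.0):
    """Split a total beam-on time between the open and the wedged field so that the
    combination approximates a target effective wedge angle, using the rough
    tangent-weighting approximation tan(theta_eff) ~ w * tan(theta_wedge).
    Illustrative only; not the dose-distribution-based method of the paper."""
    w = math.tan(math.radians(target_angle_deg)) / math.tan(math.radians(wedge_angle_deg))
    if not 0.0 <= w <= 1.0:
        raise ValueError("target angle must lie between 0 and the physical wedge angle")
    return (1.0 - w) * total_time, w * total_time   # (open time, wedged time)

open_t, wedge_t = motorized_wedge_split(total_time=60.0, target_angle_deg=30.0)
print(f"open: {open_t:.1f} s, wedged: {wedge_t:.1f} s")
```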

  19. Computations of concentration of radon and its decay products against time. Computer program; Obliczanie koncentracji radonu i jego produktow rozpadu w funkcji czasu. Program komputerowy

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    1996-12-31

    This research aims to develop a device for continuous monitoring of radon in the air by measuring the alpha activity of radon and its short-lived decay products. The influence of the variation of this alpha activity on the measured results is important and requires knowledge of the variation with time. Measuring the alpha radiation of radon and of its short-lived decay products therefore requires knowledge of how the concentrations of radon and its decay products vary with time. A computer program in the Turbo Pascal language was developed to perform these computations using the known relations involved, the program being adapted for IBM PC computers. The program enables computation of the activity of ²²²Rn and its daughter products ²¹⁸Po, ²¹⁴Pb, ²¹⁴Bi and ²¹⁴Po every 1 min within the period of 0-255 min for any state of radioactive equilibrium between the radon and its daughter products. The program also permits computing the alpha activity of ²²²Rn + ²¹⁸Po + ²¹⁴Po against time and the total alpha activity over a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and can be used by a graphics program, e.g. DrawPerfect, to produce diagrams. The equations employed for computing the alpha activity of radon and its decay products, as well as a description of the program functions, are given. (author). 2 refs, 4 figs.
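
    A present-day equivalent of the described computation can be written in a few lines; the sketch below integrates the Bateman equations for the ²²²Rn chain with a matrix exponential. The half-lives are approximate, and ²¹⁴Po is treated as being in secular equilibrium with ²¹⁴Bi rather than integrated explicitly, to avoid a needlessly stiff system.

```python
import numpy as np
from scipy.linalg import expm

# Decay chain 222Rn -> 218Po -> 214Pb -> 214Bi (approximate half-lives, minutes).
# 214Po (half-life ~164 microseconds) follows 214Bi essentially instantaneously,
# so its activity is taken equal to that of 214Bi.
half_lives = np.array([3.8235 * 24 * 60, 3.05, 26.8, 19.9])
lam = np.log(2) / half_lives                       # decay constants (1/min)

A = np.diag(-lam) + np.diag(lam[:-1], k=-1)        # Bateman system dN/dt = A N
N0 = np.array([1.0, 0.0, 0.0, 0.0])                # pure radon at t = 0, no daughters

times = np.arange(0, 256)                          # every minute over 0-255 min
N = np.array([expm(A * float(t)) @ N0 for t in times])
activity = N * lam                                 # activity of each nuclide
act_Rn, act_Po218, act_Pb214, act_Bi214 = activity.T
act_Po214 = act_Bi214                              # secular equilibrium assumption

total_alpha = act_Rn + act_Po218 + act_Po214       # alpha emitters in the chain
print("alpha activity (relative to initial Rn) at 30 min:", total_alpha[30] / act_Rn[0])
```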

  20. Rapid time-resolved magnetic resonance angiography via a multiecho radial trajectory and GraDeS reconstruction.

    Science.gov (United States)

    Lee, Gregory R; Seiberlich, Nicole; Sunshine, Jeffrey L; Carroll, Timothy J; Griswold, Mark A

    2013-02-01

    Contrast-enhanced magnetic resonance angiography is challenging due to the need for both high spatial and temporal resolution. A multishot trajectory composed of pseudo-random rotations of a single multiecho radial readout was developed. The trajectory is designed to give incoherent aliasing artifacts and a relatively uniform distribution of projections over all time scales. A field map (computed from the same data set) is used to avoid signal dropout in regions of substantial field inhomogeneity. A compressed sensing reconstruction using the GraDeS algorithm was used. Whole-brain angiograms were reconstructed at 1-mm isotropic resolution and a 1.1-s frame rate (corresponding to an acceleration factor > 100). The only parameter that must be chosen is the number of iterations of the GraDeS algorithm. A larger number of iterations improves the temporal behavior at the cost of decreased image signal-to-noise ratio. The resulting images provide a good depiction of the cerebral vasculature and have excellent arterial/venous separation. Copyright © 2012 Wiley Periodicals, Inc.
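
    The GraDeS update is essentially an iterative hard-thresholding step. The sketch below applies that update to a generic compressed-sensing toy problem, with a dense random matrix standing in for the paper's non-Cartesian radial Fourier operator and sparsifying transform; it is an illustration of the algorithm family, not the published reconstruction.

```python
import numpy as np

def grades(y, Phi, sparsity, gamma=4.0 / 3.0, n_iter=200):
    """GraDeS / iterative hard thresholding:
        x <- H_s( x + (1/gamma) * Phi^T (y - Phi x) ),
    where H_s keeps the s largest-magnitude entries of its argument."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = x + (1.0 / gamma) * Phi.T @ (y - Phi @ x)
        keep = np.argsort(np.abs(x))[-sparsity:]          # indices of s largest entries
        mask = np.zeros_like(x, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0
    return x

rng = np.random.default_rng(0)
m, n, s = 250, 500, 8
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)            # roughly RIP-friendly
y = Phi @ x_true                                          # noiseless measurements
x_hat = grades(y, Phi, sparsity=s)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```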

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  2. Television Viewing, Computer Use, Time Driving and All‐Cause Mortality: The SUN Cohort

    Science.gov (United States)

    Basterra‐Gortari, Francisco Javier; Bes‐Rastrollo, Maira; Gea, Alfredo; Núñez‐Córdoba, Jorge María; Toledo, Estefanía; Martínez‐González, Miguel Ángel

    2014-01-01

    Background Sedentary behaviors have been directly associated with all‐cause mortality. However, little is known about different types of sedentary behaviors in relation to overall mortality. Our objective was to assess the association between different sedentary behaviors and all‐cause mortality. Methods and Results In this prospective, dynamic cohort study (the SUN Project) 13 284 Spanish university graduates with a mean age of 37 years were followed up for a median of 8.2 years. Television, computer, and driving time were assessed at baseline. Poisson regression models were fitted to examine the association between each sedentary behavior and total mortality. All‐cause mortality incidence rate ratios (IRRs) per 2 hours per day were 1.40 (95% confidence interval (CI): 1.06 to 1.84) for television viewing, 0.96 (95% CI: 0.79 to 1.18) for computer use, and 1.14 (95% CI: 0.90 to 1.44) for driving, after adjustment for age, sex, smoking status, total energy intake, Mediterranean diet adherence, body mass index, and physical activity. The risk of mortality was twofold higher for participants reporting ≥3 h/day of television viewing than for those reporting less. Conclusions Television viewing was directly associated with all‐cause mortality. However, computer use and time spent driving were not significantly associated with higher mortality. Further cohort studies and trials designed to assess whether reductions in television viewing are able to reduce mortality are warranted. The lack of association between computer use or time spent driving and mortality needs further confirmation. PMID:24965030

  3. Diversification of Angraecum (Orchidaceae, Vandeae) in Madagascar: Revised Phylogeny Reveals Species Accumulation through Time Rather than Rapid Radiation.

    Science.gov (United States)

    Andriananjamanantsoa, Herinandrianina N; Engberg, Shannon; Louis, Edward E; Brouillet, Luc

    Angraecum. The macroevolutionary model-based phylogeny failed to detect shifts in diversification that could be associated directly with morphological diversification. Diversification in Angraecum resulted from gradual species accumulation through time rather than from rapid radiation, a diversification pattern often encountered in tropical rain forests.

  4. Virtual photons in imaginary time: Computing exact Casimir forces via standard numerical electromagnetism techniques

    International Nuclear Information System (INIS)

    Rodriguez, Alejandro; Ibanescu, Mihai; Joannopoulos, J. D.; Johnson, Steven G.; Iannuzzi, Davide

    2007-01-01

    We describe a numerical method to compute Casimir forces in arbitrary geometries, for arbitrary dielectric and metallic materials, with arbitrary accuracy (given sufficient computational resources). Our approach, based on well-established integration of the mean stress tensor evaluated via the fluctuation-dissipation theorem, is designed to directly exploit fast methods developed for classical computational electromagnetism, since it only involves repeated evaluation of the Green's function for imaginary frequencies (equivalently, real frequencies in imaginary time). We develop the approach by systematically examining various formulations of Casimir forces from the previous decades and evaluating them according to their suitability for numerical computation. We illustrate our approach with a simple finite-difference frequency-domain implementation, test it for known geometries such as a cylinder and a plate, and apply it to new geometries. In particular, we show that a pistonlike geometry of two squares sliding between metal walls, in both two and three dimensions with both perfect and realistic metallic materials, exhibits a surprising nonmonotonic "lateral" force from the walls.

  5. Computational derivation of quantum relativist electromagnetic systems with forward-backward space-time shifts

    International Nuclear Information System (INIS)

    Dubois, Daniel M.

    2000-01-01

    This paper is a continuation of our preceding paper dealing with computational derivation of the Klein-Gordon quantum relativist equation and the Schroedinger quantum equation with forward and backward space-time shifts. The first part introduces forward and backward derivatives for discrete and continuous systems. Generalized complex discrete and continuous derivatives are deduced. The second part deduces the Klein-Gordon equation from the space-time complex continuous derivatives. These derivatives take into account forward-backward space-time shifts related to an internal phase velocity u. The internal group velocity v is related to the speed of light by u·v = c² and to the external group and phase velocities by u·v = v_g·v_p. Without time shift, the Schroedinger equation is deduced, with a supplementary term, which could represent a reference potential. The third part deduces the Quantum Relativist Klein-Gordon equation for a particle in an electromagnetic field.

  6. Resolving time of scintillation camera-computer system and methods of correction for counting loss, 2

    International Nuclear Information System (INIS)

    Iinuma, Takeshi; Fukuhisa, Kenjiro; Matsumoto, Toru

    1975-01-01

    Following the previous work, the counting-rate performance of camera-computer systems was investigated for two modes of data acquisition. The first was the "LIST" mode, in which image data and timing signals were sequentially stored on magnetic disk or tape via a buffer memory. The second was the "HISTOGRAM" mode, in which image data were stored in a core memory as digital images and the images were then transferred to magnetic disk or tape on the frame timing signal. Firstly, the counting-rates stored in the buffer memory were measured as a function of the display event-rates of the scintillation camera for the two modes. For both modes, the stored counting-rates (M) were expressed by the following formula: M = N(1 − Nτ), where N is the display event-rate of the camera and τ is the resolving time, including the analog-to-digital conversion time and the memory cycle time. The resolving time for each mode may have been different, but it was about 10 μsec for both modes in our computer system (TOSBAC 3400 model 31). Secondly, the data transfer speed from the buffer memory to the external memory, such as magnetic disk or tape, was considered for the two modes. For the "LIST" mode, the maximum value of the stored counting-rates from the camera was expressed in terms of the size of the buffer memory and the access time and data transfer rate of the external memory. For the "HISTOGRAM" mode, the minimum frame time was determined by the size of the buffer memory and the access time and transfer rate of the external memory. In our system, the maximum value of the stored counting-rates was about 17,000 counts/sec with a buffer size of 2,000 words, and the minimum frame time was about 130 msec with a buffer size of 1,024 words. These values agree well with the calculated ones. From the present analysis, the design of camera-computer systems for quantitative dynamic imaging becomes possible, and future improvements are suggested. (author)
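
    The stored-count relation quoted above can be inverted to recover the camera event rate from the stored rate; a minimal sketch, assuming the same formula M = N(1 − Nτ) and a resolving time of about 10 μs:

```python
import math

def true_event_rate(measured_rate, tau):
    """Invert M = N*(1 - N*tau) for the display event rate N, taking the physical
    (smaller) root of the quadratic; valid while M < 1/(4*tau)."""
    disc = 1.0 - 4.0 * tau * measured_rate
    if disc < 0:
        raise ValueError("measured rate exceeds the maximum M_max = 1/(4*tau)")
    return (1.0 - math.sqrt(disc)) / (2.0 * tau)

tau = 10e-6                      # ~10 microsecond resolving time quoted in the record
for M in (5_000, 10_000, 17_000):
    N = true_event_rate(M, tau)
    print(f"stored {M:>6} counts/s  ->  camera event rate ~ {N:8.0f} counts/s")
```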

  7. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems

    Science.gov (United States)

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D.

    2016-01-01

    In modern networked control applications, confidentiality and integrity are important features to address in order to prevent against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim to enable secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems. PMID:27463718
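
    For orientation, the per-frame communication overhead (tag length) and a rough user-space estimate of the computational overhead of HMAC-SHA256 can be obtained with the Python standard library. This is only an illustration of the mechanism being evaluated, not the kernel-level Linux implementation measured in the paper; the key, payload size and iteration count are arbitrary.

```python
import hashlib
import hmac
import os
import time

key = os.urandom(32)                         # shared secret (hypothetical)
frame = os.urandom(8)                        # example 8-byte time-triggered payload

tag = hmac.new(key, frame, hashlib.sha256).digest()
print("payload:", len(frame), "bytes, tag:", len(tag), "bytes")  # communication overhead

# Rough computational-overhead estimate: average time to authenticate one frame
n = 100_000
t0 = time.perf_counter()
for _ in range(n):
    hmac.new(key, frame, hashlib.sha256).digest()
dt = (time.perf_counter() - t0) / n
print(f"~{dt * 1e6:.1f} us per HMAC-SHA256 tag on this machine")

# Receiver side: constant-time verification of the received tag
assert hmac.compare_digest(tag, hmac.new(key, frame, hashlib.sha256).digest())
```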

  8. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

    Highlights: • Application of the PSO algorithm can improve the accuracy of the baseline model. • The M&V cloud platform automatically calculates energy performance. • The M&V cloud platform can be applied to all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • The M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, the M&V period typically lasts as long as several months or up to a year; the failure to immediately detect abnormal energy performance not only degrades energy performance but also prevents timely corrections and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculations of the energy performance and access to long-term, cumulative information about the energy performance are provided via a feature that allows direct uploads of the energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested in a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real-time
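
    A minimal sketch of the M&V calculation underlying such a platform, assuming a simple temperature-driven baseline fitted by ordinary least squares on invented data; the platform itself tunes its baseline with particle swarm optimization and uses richer multivariate regressors.

```python
import numpy as np

# Hypothetical daily data: outdoor temperature and measured energy use of a
# cold-storage system, for a baseline period and a post-retrofit reporting period.
rng = np.random.default_rng(1)
t_base = rng.uniform(15, 35, 120)                      # deg C, baseline period
e_base = 800 + 22 * t_base + rng.normal(0, 25, 120)    # kWh/day

t_report = rng.uniform(15, 35, 60)                     # reporting period
e_report = 720 + 20 * t_report + rng.normal(0, 25, 60) # after the retrofit

# Baseline model E = b0 + b1*T fitted by ordinary least squares
A = np.column_stack([np.ones_like(t_base), t_base])
coef, *_ = np.linalg.lstsq(A, e_base, rcond=None)

predicted = coef[0] + coef[1] * t_report               # counterfactual energy use
savings = predicted - e_report                         # avoided energy use
print(f"estimated savings: {savings.sum():.0f} kWh over the reporting period")
```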

  9. Automated selection of brain regions for real-time fMRI brain-computer interfaces

    Science.gov (United States)

    Lührs, Michael; Sorger, Bettina; Goebel, Rainer; Esposito, Fabrizio

    2017-02-01

    Objective. Brain-computer interfaces (BCIs) implemented with real-time functional magnetic resonance imaging (rt-fMRI) use fMRI time-courses from predefined regions of interest (ROIs). To reach best performances, localizer experiments and on-site expert supervision are required for ROI definition. To automate this step, we developed two unsupervised computational techniques based on the general linear model (GLM) and independent component analysis (ICA) of rt-fMRI data, and compared their performances on a communication BCI. Approach. 3 T fMRI data of six volunteers were re-analyzed in simulated real-time. During a localizer run, participants performed three mental tasks following visual cues. During two communication runs, a letter-spelling display guided the subjects to freely encode letters by performing one of the mental tasks with a specific timing. GLM- and ICA-based procedures were used to decode each letter, respectively using compact ROIs and whole-brain distributed spatio-temporal patterns of fMRI activity, automatically defined from subject-specific or group-level maps. Main results. Letter-decoding performances were comparable to supervised methods. In combination with a similarity-based criterion, GLM- and ICA-based approaches successfully decoded more than 80% (average) of the letters. Subject-specific maps yielded optimal performances. Significance. Automated solutions for ROI selection may help accelerating the translation of rt-fMRI BCIs from research to clinical applications.
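
    A much-simplified sketch of GLM-based decoding, assuming boxcar regressors shifted by a fixed hemodynamic delay and invented ROI data; the actual pipeline uses proper HRF convolution and, in the ICA variant, whole-brain spatio-temporal patterns rather than a single ROI time course.

```python
import numpy as np

def decode_task(roi_timecourse, task_onsets, tr=1.0, hrf_delay=5.0, duration=10.0):
    """Toy GLM decoding: one delayed boxcar regressor per candidate task slot,
    least-squares fit of the ROI time course, decode as the largest beta."""
    n = len(roi_timecourse)
    X = np.zeros((n, len(task_onsets)))
    for k, onset in enumerate(task_onsets):
        start = int((onset + hrf_delay) / tr)
        stop = min(n, int((onset + hrf_delay + duration) / tr))
        X[start:stop, k] = 1.0
    X = np.column_stack([X, np.ones(n)])               # add a constant term
    betas, *_ = np.linalg.lstsq(X, roi_timecourse, rcond=None)
    return int(np.argmax(betas[:-1]))                  # index of the decoded slot

# Hypothetical run: three candidate encoding slots, the subject used slot 1
n = 120
signal = np.zeros(n)
signal[45:55] = 1.0                                    # response to the slot at 40 s
roi = signal + np.random.default_rng(2).normal(0, 0.3, n)
print("decoded slot:", decode_task(roi, task_onsets=[10.0, 40.0, 70.0]))
```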

  10. A sub-cubic time algorithm for computing the quartet distance between two general trees

    DEFF Research Database (Denmark)

    Nielsen, Jesper; Kristensen, Anders Kabell; Mailund, Thomas

    2011-01-01

    Background When inferring phylogenetic trees different algorithms may give different trees. To study such effects a measure for the distance between two trees is useful. Quartet distance is one such measure, and is the number of quartet topologies that differ between two trees. Results We have derived a new algorithm for computing the quartet distance between a pair of general trees, i.e. trees where inner nodes can have any degree ≥ 3. The time and space complexity of our algorithm is sub-cubic in the number of leaves and does not depend on the degree of the inner nodes. This makes it the fastest algorithm so far for computing the quartet distance between general trees independent of the degree of the inner nodes. Conclusions We have implemented our algorithm and two of the best competitors. Our new algorithm is significantly faster than the competition and seems to run in close...

  11. Computer vision system in real-time for color determination on flat surface food

    Directory of Open Access Journals (Sweden)

    Erick Saldaña

    2013-03-01

    Full Text Available Artificial vision systems, also known as computer vision, are potent quality inspection tools that can be applied in pattern recognition for fruit and vegetable analysis. The aim of this research was to design, implement and calibrate a new computer vision system (CVS) in real time for color measurement on flat-surface food. For this purpose, a device (software and hardware) capable of performing this task was designed and implemented, which consisted of two phases: (a) image acquisition and (b) image processing and analysis. Both the algorithm and the graphical user interface (GUI) were developed in Matlab. The CVS calibration was performed using a conventional colorimeter (CIE L*a*b* model), and the errors of the color parameters were estimated as eL* = 5.001%, ea* = 2.287% and eb* = 4.314%, which ensures an adequate and efficient automation application in industrial processes for quality control in the food industry sector.

  12. Computer vision system in real-time for color determination on flat surface food

    Directory of Open Access Journals (Sweden)

    Erick Saldaña

    2013-01-01

    Full Text Available Artificial vision systems, also known as computer vision, are potent quality inspection tools that can be applied in pattern recognition for fruit and vegetable analysis. The aim of this research was to design, implement and calibrate a new computer vision system (CVS) in real time for color measurement on flat-surface food. For this purpose, a device (software and hardware) capable of performing this task was designed and implemented, which consisted of two phases: (a) image acquisition and (b) image processing and analysis. Both the algorithm and the graphical user interface (GUI) were developed in Matlab. The CVS calibration was performed using a conventional colorimeter (CIE L*a*b* model), and the errors of the color parameters were estimated as eL* = 5.001%, ea* = 2.287% and eb* = 4.314%, which ensures an adequate and efficient automation application in industrial processes for quality control in the food industry sector.
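
    A rough sketch of the color-measurement step, assuming an uncalibrated sRGB-to-CIELAB conversion from scikit-image and hypothetical patch and colorimeter values; the published system uses its own calibrated transform against the reference colorimeter.

```python
import numpy as np
from skimage import color

def mean_lab(rgb_patch):
    """Average CIELAB values of an image patch (RGB floats in [0, 1]). The
    conversion assumes sRGB under a D65 illuminant, which will generally differ
    from a calibrated instrument-specific transform."""
    lab = color.rgb2lab(rgb_patch)
    return lab.reshape(-1, 3).mean(axis=0)

def percent_error(measured, reference):
    """Per-channel relative error against colorimeter readings (eL*, ea*, eb*)."""
    return 100.0 * np.abs(measured - reference) / np.abs(reference)

patch = np.full((50, 50, 3), [0.80, 0.45, 0.30])     # hypothetical food-surface patch
cvs_lab = mean_lab(patch)
colorimeter_lab = np.array([60.0, 30.0, 35.0])       # hypothetical reference reading
print("CVS L*a*b*:", np.round(cvs_lab, 2),
      " errors (%):", np.round(percent_error(cvs_lab, colorimeter_lab), 2))
```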

  13. A stable computational scheme for stiff time-dependent constitutive equations

    International Nuclear Information System (INIS)

    Shih, C.F.; Delorenzi, H.G.; Miller, A.K.

    1977-01-01

    Viscoplasticity and creep-type constitutive equations are increasingly being employed in finite element codes for evaluating the deformation of high-temperature structural members. These constitutive equations frequently exhibit stiff regimes, which make an analytical assessment of the structure very costly. A computational scheme for handling deformation in stiff regimes is proposed in this paper. By the finite element discretization, the governing partial differential equations in the spatial (x) and time (t) variables are reduced to a system of nonlinear ordinary differential equations in the independent variable t. The constitutive equations are expanded in a Taylor's series about selected values of t. The resulting system of differential equations is then integrated by an implicit scheme which employs a predictor technique to initiate the Newton-Raphson procedure. To examine the stability and accuracy of the computational scheme, a series of calculations was carried out for uniaxial specimens and thick-walled tubes subjected to mechanical and thermal loading. (Auth.)
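
    A minimal sketch of the implicit predictor/Newton-Raphson idea on a single stiff creep-type equation; the equation, parameters and tolerances are illustrative stand-ins, not those of the paper.

```python
import numpy as np

def backward_euler_stress(E=200e3, A=1e-12, n=5.0, strain_rate=1e-4,
                          t_end=50.0, h=1.0, sigma0=0.0):
    """Implicit (backward Euler) integration with a Newton-Raphson corrector for the
    stiff creep-type law  d(sigma)/dt = E * (strain_rate - A * sigma**n).
    Each step solves g(s) = s - sigma_k - h*E*(strain_rate - A*s**n) = 0."""
    sigma, t = sigma0, 0.0
    history = [(t, sigma)]
    while t < t_end:
        s = sigma                                   # predictor: previous value
        for _ in range(30):                         # Newton-Raphson corrector
            g = s - sigma - h * E * (strain_rate - A * max(s, 0.0) ** n)
            dg = 1.0 + h * E * A * n * max(s, 0.0) ** (n - 1)
            step = g / dg
            s -= step
            if abs(step) < 1e-10 * max(1.0, abs(s)):
                break
        sigma, t = s, t + h
        history.append((t, sigma))
    return np.array(history)

hist = backward_euler_stress()
print("final (time, stress) approaches the saturation stress:", np.round(hist[-1], 2))
```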

  14. Real-time dynamics of lattice gauge theories with a few-qubit quantum computer

    Science.gov (United States)

    Martinez, Esteban A.; Muschik, Christine A.; Schindler, Philipp; Nigg, Daniel; Erhard, Alexander; Heyl, Markus; Hauke, Philipp; Dalmonte, Marcello; Monz, Thomas; Zoller, Peter; Blatt, Rainer

    2016-06-01

    Gauge theories are fundamental to our understanding of interactions between the elementary constituents of matter as mediated by gauge bosons. However, computing the real-time dynamics in gauge theories is a notorious challenge for classical computational methods. This has recently stimulated theoretical effort, using Feynman’s idea of a quantum simulator, to devise schemes for simulating such theories on engineered quantum-mechanical devices, with the difficulty that gauge invariance and the associated local conservation laws (Gauss laws) need to be implemented. Here we report the experimental demonstration of a digital quantum simulation of a lattice gauge theory, by realizing (1 + 1)-dimensional quantum electrodynamics (the Schwinger model) on a few-qubit trapped-ion quantum computer. We are interested in the real-time evolution of the Schwinger mechanism, describing the instability of the bare vacuum due to quantum fluctuations, which manifests itself in the spontaneous creation of electron-positron pairs. To make efficient use of our quantum resources, we map the original problem to a spin model by eliminating the gauge fields in favour of exotic long-range interactions, which can be directly and efficiently implemented on an ion trap architecture. We explore the Schwinger mechanism of particle-antiparticle generation by monitoring the mass production and the vacuum persistence amplitude. Moreover, we track the real-time evolution of entanglement in the system, which illustrates how particle creation and entanglement generation are directly related. Our work represents a first step towards quantum simulation of high-energy theories using atomic physics experiments—the long-term intention is to extend this approach to real-time quantum simulations of non-Abelian lattice gauge theories.

  15. Fault tolerant distributed real time computer systems for I and C of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2014-03-15

    Highlights: • The architecture of the distributed real-time computer system (DRTCS) used in I and C of PFBR is explained. • The fault-tolerant (hot standby) architecture, fault detection and switch-over are detailed. • A scaled-down model was used to study the functional and performance requirements of the DRTCS. • Quality of service parameters for the scaled-down model were critically studied. - Abstract: The prototype fast breeder reactor (PFBR) is in the advanced stage of construction at Kalpakkam, India. A three-tier architecture is adopted for instrumentation and control (I and C) of PFBR, wherein the bottom tier consists of real-time computer (RTC) systems, the middle tier consists of process computers and the top tier consists of display stations. These RTC systems are geographically distributed and networked together with the process computers and display stations. A hot standby architecture comprising dual redundant RTC systems with a switch-over logic system is deployed in order to achieve fault tolerance. Fault-tolerant dual redundant network connectivity is provided in each RTC system, and the TCP/IP protocol is selected for network communication. In order to assess the performance of the distributed RTC systems, a scaled-down model was developed with 9 representative systems, and nearly 15% of the I and C signals of PFBR were connected and monitored. Functional and performance testing were carried out for each RTC system, and the fault-tolerant characteristics were studied by injecting various faults into the system and observing the performance. Various quality of service parameters such as connection establishment delay, priority parameter, transit delay, throughput, residual error ratio, etc., are critically studied for the network.

  16. Rapid identification of pearl powder from Hyriopsis cumingii by Tri-step infrared spectroscopy combined with computer vision technology

    Science.gov (United States)

    Liu, Siqi; Wei, Wei; Bai, Zhiyi; Wang, Xichang; Li, Xiaohong; Wang, Chuanxian; Liu, Xia; Liu, Yuan; Xu, Changhua

    2018-01-01

    Pearl powder, an important raw material in cosmetics and Chinese patent medicines, is commonly uneven in quality and frequently adulterated with low-cost shell powder in the market. The aim of this study is to establish an adequate approach based on Tri-step infrared spectroscopy with resolution enhancement combined with chemometrics for qualitative identification of pearl powder originating from three different quality grades of pearls and quantitative prediction of the proportions of shell powder adulterated in pearl powder. Additionally, computer vision technology (E-eyes) can investigate the color difference among different pearl powders and make it traceable to the pearl quality trait of visual color categories. Though the different grades of pearl powder or adulterated pearl powder have almost identical IR spectra, the SD-IR peak intensity at about 861 cm⁻¹ (v2 band) exhibited regular enhancement with increasing quality grade of pearls, while the 1082 cm⁻¹ (v1 band), 712 cm⁻¹ and 699 cm⁻¹ (v4 band) were just the reverse. In contrast, only the peak intensity at 862 cm⁻¹ was enhanced regularly with increasing concentration of shell powder. Thus, the bands in the ranges of (1550-1350 cm⁻¹, 730-680 cm⁻¹) and (830-880 cm⁻¹, 690-725 cm⁻¹) could be exclusive ranges to discriminate the three distinct pearl powders and to identify adulteration, respectively. For massive sample analysis, a qualitative classification model and a quantitative prediction model based on IR spectra were established successfully by principal component analysis (PCA) and partial least squares (PLS), respectively. The developed method demonstrated great potential for pearl powder quality control and authenticity identification in a direct, holistic manner.
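
    A schematic of the chemometric step, assuming simulated spectra and the scikit-learn implementations of PCA and PLS regression; the paper builds its models from measured Tri-step IR spectra of real pearl and shell powders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

# Simulated IR spectra: rows are samples, columns are wavenumber channels, with a
# band near 862 cm^-1 whose intensity grows with the shell-powder fraction.
rng = np.random.default_rng(3)
wavenumbers = np.linspace(4000, 400, 900)
n_samples = 60
y_adult = rng.uniform(0, 0.5, n_samples)                      # adulteration fraction
band = np.exp(-0.5 * ((wavenumbers - 862) / 8.0) ** 2)        # v2 band near 862 cm^-1
spectra = (np.outer(0.2 + y_adult, band)
           + 0.05 * rng.standard_normal((n_samples, len(wavenumbers))))

# Qualitative view: project the spectra onto their first two principal components
scores = PCA(n_components=2).fit_transform(spectra)
print("PCA score matrix shape:", scores.shape)

# Quantitative model: PLS regression predicting the shell-powder proportion
pls = PLSRegression(n_components=3).fit(spectra, y_adult)
pred = pls.predict(spectra).ravel()
print("PLS calibration RMSE:", np.sqrt(np.mean((pred - y_adult) ** 2)))
```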

  17. An Epidemic Model of Computer Worms with Time Delay and Variable Infection Rate

    Directory of Open Access Journals (Sweden)

    Yu Yao

    2018-01-01

    Full Text Available With the rapid development of the Internet, network security issues have become increasingly serious. Temporary patches are applied to infected hosts, and these patches may occasionally lose efficacy. This leads to a time delay when vaccinated hosts change back to susceptible hosts. On the other hand, worm infection is usually a nonlinear process. Considering the actual situation, a variable infection rate is introduced to describe the spread process of worms. Based on these aspects, we propose a time-delayed worm propagation model with a variable infection rate. The existence condition and the stability of the positive equilibrium are then derived. Due to the existence of the time delay, the worm propagation system may become unstable and out of control. Moreover, the threshold τ0 of the Hopf bifurcation is obtained. The worm propagation system is stable if the time delay is less than τ0; when the time delay exceeds τ0, the system becomes unstable. In addition, numerical experiments have been performed, which match the conclusions we deduce. The numerical experiments also show that there exists a threshold in the parameter a, which implies that we should choose an appropriate infection rate β(t) to constrain worm prevalence. Finally, simulation experiments are carried out to prove the validity of our conclusions.
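
    A generic sketch of how such a delayed model can be integrated numerically, using an illustrative set of equations (not the paper's) with a saturating infection rate and an explicit history buffer supplying the delayed term.

```python
import numpy as np

def simulate_delayed_worm(beta0=0.8, alpha=5.0, rho=0.2, tau=6.0,
                          t_end=120.0, h=0.01):
    """Euler integration of an illustrative delayed worm-propagation model:
    infected hosts are patched at rate rho, and a patch applied at time t loses
    efficacy exactly tau time units later, returning the host to the susceptible
    pool. The saturating rate beta(I) = beta0/(1 + alpha*I) stands in for a
    variable infection rate.
        S' = -beta(I)*S*I + rho*I(t - tau)
        I' =  beta(I)*S*I - rho*I
        V' =  rho*I - rho*I(t - tau)"""
    n_steps = int(t_end / h)
    delay_steps = int(tau / h)
    S, I, V = 0.99, 0.01, 0.0
    I_hist = np.zeros(n_steps + 1)
    I_hist[0] = I
    out = [(0.0, S, I, V)]
    for k in range(n_steps):
        I_delayed = I_hist[k - delay_steps] if k >= delay_steps else 0.0
        beta = beta0 / (1.0 + alpha * I)
        dS = -beta * S * I + rho * I_delayed
        dI = beta * S * I - rho * I
        dV = rho * I - rho * I_delayed
        S, I, V = S + h * dS, I + h * dI, V + h * dV
        I_hist[k + 1] = I
        out.append(((k + 1) * h, S, I, V))
    return np.array(out)

traj = simulate_delayed_worm()
print("final (t, S, I, V):", np.round(traj[-1], 3))
```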

  18. 21 CFR 10.20 - Submission of documents to Division of Dockets Management; computation of time; availability for...

    Science.gov (United States)

    2010-04-01

    ... Management; computation of time; availability for public disclosure. 10.20 Section 10.20 Food and Drugs FOOD... Management; computation of time; availability for public disclosure. (a) A submission to the Division of Dockets Management of a petition, comment, objection, notice, compilation of information, or any other...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  20. Computer modelling of structures with account of the construction stages and the time dependent material properties

    Directory of Open Access Journals (Sweden)

    Traykov Alexander

    2015-01-01

    Full Text Available Numerical studies are performed on computer models that take into account the stages of construction and time-dependent material properties defined in two forms. A 2D model of a three-storey, two-span frame is created. The first form deals with the material defined in the usual design-practice way, without taking into account the time-dependent properties of the concrete. In the second form, creep and shrinkage of the concrete are taken into account. Displacements and internal forces in specific elements and sections are reported. The influence of the time-dependent material properties on the displacements and the internal forces in the main structural elements is tracked. The results corresponding to the two forms of material definition are compared with each other as well as with the results obtained by the usual design calculations. Conclusions are drawn on the influence of concrete creep and shrinkage during construction on the structural behaviour.