WorldWideScience

Sample records for real-time temporal bayesian

  1. Temporal Proof Methodologies for Real-Time Systems,

    Science.gov (United States)

    1990-09-01

real-time systems that communicate either through shared variables or by message passing, and real-time issues such as time-outs, process priorities (interrupts), and process scheduling. The authors exhibit two styles for the specification of real-time systems. While the first approach uses bounded versions of temporal operators, the second approach allows explicit references to time through a special clock variable. Corresponding to the two styles of specification, the authors present and compare two fundamentally different proof

  2. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  3. Temporal logics and real time expert systems

    NARCIS (Netherlands)

    Blom, J.A.

    1996-01-01

This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first order logic or modal logic based) temporal logics are frequently not suitable. This is especially true in real time expert

  4. Temporal logics and real time expert systems.

    Science.gov (United States)

    Blom, J A

    1996-10-01

This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first order logic or modal logic based) temporal logics are frequently not suitable. This is especially true in real time expert systems, where a fixed (and usually small) response time must be guaranteed. One such expert system, Fagan's VM, is reviewed, and a delineation is given of how to formally describe and reason with time in medical protocols. It is shown that Petri net theory is a useful tool to check the correctness of formalised protocols.

  5. Spatio-Temporal Series Remote Sensing Image Prediction Based on Multi-Dictionary Bayesian Fusion

    Directory of Open Access Journals (Sweden)

    Chu He

    2017-11-01

Full Text Available Contradictions in spatial resolution and temporal coverage emerge from earth observation remote sensing images due to limitations in technology and cost. Therefore, how to combine remote sensing images with low spatial yet high temporal resolution as well as those with high spatial yet low temporal resolution to construct images with both high spatial resolution and high temporal coverage has become an important problem, called the spatio-temporal fusion problem, in both research and practice. A Multi-Dictionary Bayesian Spatio-Temporal Reflectance Fusion Model (MDBFM) has been proposed in this paper. First, multiple dictionaries from regions of different classes are trained. Second, a Bayesian framework is constructed to solve the dictionary selection problem. A pixel-dictionary likelihood function and a dictionary-dictionary prior function are constructed under the Bayesian framework. Third, remote sensing images before and after the middle moment are combined to predict images at the middle moment. Diverse shape and texture information is learned from different landscapes in multi-dictionary learning to help dictionaries capture the distinctions between regions. The Bayesian framework makes full use of the prior information while the input image is classified. The experiments with one simulated dataset and two satellite datasets validate that the MDBFM is highly effective in both subjective and objective evaluation indexes. The results of MDBFM show more precise details and have a higher similarity to real images when dealing with both type changes and phenology changes.
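
    The dictionary-selection step described in this abstract can be read as a direct application of Bayes' rule: each class-specific dictionary is scored by a pixel-dictionary likelihood (how well it reconstructs the patch) combined with a dictionary-dictionary prior (how plausible that class is given neighbouring patches). The sketch below only illustrates that idea and is not the authors' MDBFM implementation; the Gaussian reconstruction-error likelihood, the least-squares coding step, and the neighbour-frequency prior are assumptions made for the example.

    ```python
    import numpy as np

    def select_dictionary(patch, dictionaries, neighbor_labels, sigma=0.1, alpha=1.0):
        """Pick the class dictionary with the highest posterior probability.

        patch           -- flattened image patch (1-D array)
        dictionaries    -- list of (n_pixels, n_atoms) arrays, one per land-cover class
        neighbor_labels -- class labels already assigned to neighbouring patches
        """
        log_post = []
        for k, D in enumerate(dictionaries):
            # Pixel-dictionary likelihood: Gaussian reconstruction error of a
            # least-squares code (illustrative stand-in for true sparse coding).
            code, *_ = np.linalg.lstsq(D, patch, rcond=None)
            resid = patch - D @ code
            log_lik = -0.5 * np.sum(resid ** 2) / sigma ** 2
            # Dictionary-dictionary prior: smoothed frequency of class k among neighbours.
            prior_k = ((np.sum(np.asarray(neighbor_labels) == k) + alpha)
                       / (len(neighbor_labels) + alpha * len(dictionaries)))
            log_post.append(log_lik + np.log(prior_k))
        return int(np.argmax(log_post))
    ```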

  6. A Bayesian Approach to Real-Time Earthquake Phase Association

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity has been in use now for many years. One of the most significant problems that has emerged over time with this approach is related to the extreme variations in seismic station density throughout the global seismic network. To address this problem we have developed a novel, Bayesian association algorithm, which looks at the association problem as a dynamically evolving complex system of "many to many relationships". While the end result must be an array of one-to-many relations (one earthquake, many phases), during the association process the situation is quite different. Both the evolving possible hypocenters and the relationships between phases and all nascent hypocenters are many-to-many (many earthquakes, many phases). The computational framework we are using to address this is a responsive, NoSQL graph database where the earthquake-phase associations are represented as intersecting Bayesian Learning Networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.
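
    The core idea of representing each arrival's locus of candidate hypocenters as a probability density and intersecting those densities can be illustrated with a toy grid search. The sketch below assumes a constant-velocity medium and Gaussian picking errors; it is not the authors' graph-database implementation, only an illustration of how per-phase likelihoods multiply into a joint density over source location and origin time.

    ```python
    import numpy as np
    from scipy.stats import norm

    def joint_hypocenter_density(stations, arrivals, xs, ys, t0s, v=6.0, sigma=0.5):
        """Toy grid-search association: multiply per-phase arrival-time likelihoods.

        stations -- (n, 2) array of station coordinates (km)
        arrivals -- (n,) array of observed phase arrival times (s)
        xs, ys   -- 1-D arrays of candidate epicentre coordinates (km)
        t0s      -- 1-D array of candidate origin times (s)
        Returns a (len(xs), len(ys), len(t0s)) array of log joint densities.
        """
        log_density = np.zeros((len(xs), len(ys), len(t0s)))
        for i, x in enumerate(xs):
            for j, y in enumerate(ys):
                dist = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
                travel = dist / v  # constant-velocity travel time
                for k, t0 in enumerate(t0s):
                    predicted = t0 + travel
                    log_density[i, j, k] = norm.logpdf(arrivals, predicted, sigma).sum()
        return log_density
    ```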

  7. Evolution of bayesian-related research over time: a temporal text mining task

    CSIR Research Space (South Africa)

    de Waal, A

    2006-06-01

Full Text Available 1. ... Ronald Reagan's Radio Addresses? Bayesian Analysis 2006, Volume 1, Number 2, pp. 189-383. 2. Mei Q and Zhai C, 2005. Discovering Evolutionary Theme Patterns from Text – An Exploration of Temporal Text Mining. KDD'05, August 21-24, 2005, Chicago...

  8. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to combine potentially very large sets of asynchronously sampled, concurrent streams in tedious application code. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.

  9. Bayesian median regression for temporal gene expression data

    Science.gov (United States)

    Yu, Keming; Vinciotti, Veronica; Liu, Xiaohui; 't Hoen, Peter A. C.

    2007-09-01

    Most of the existing methods for the identification of biologically interesting genes in a temporal expression profiling dataset do not fully exploit the temporal ordering in the dataset and are based on normality assumptions for the gene expression. In this paper, we introduce a Bayesian median regression model to detect genes whose temporal profile is significantly different across a number of biological conditions. The regression model is defined by a polynomial function where both time and condition effects as well as interactions between the two are included. MCMC-based inference returns the posterior distribution of the polynomial coefficients. From this a simple Bayes factor test is proposed to test for significance. The estimation of the median rather than the mean, and within a Bayesian framework, increases the robustness of the method compared to a Hotelling T2-test previously suggested. This is shown on simulated data and on muscular dystrophy gene expression data.
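
    Median regression replaces the squared-error loss of ordinary least squares with the check (pinball) loss at the 0.5 quantile, which is equivalent to maximum likelihood under an asymmetric Laplace error and is what makes the approach robust to non-normal expression values. A minimal, non-Bayesian sketch of that underlying estimator, with polynomial time terms and time-by-condition interactions as in the abstract, is shown below; the design matrix, function names, and optimizer choice are illustrative assumptions, not the authors' MCMC implementation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def design_matrix(time, condition, degree=2):
        """Polynomial in time plus a condition effect and time-by-condition interactions."""
        cols = [np.ones_like(time)] + [time ** d for d in range(1, degree + 1)]
        cols += [condition] + [condition * time ** d for d in range(1, degree + 1)]
        return np.column_stack(cols)

    def median_regression(y, X):
        """Fit coefficients by minimising the check loss at the 0.5 quantile."""
        def check_loss(beta):
            resid = y - X @ beta
            return 0.5 * np.sum(np.abs(resid))  # pinball loss at tau = 0.5
        beta0 = np.zeros(X.shape[1])
        return minimize(check_loss, beta0, method="Nelder-Mead").x
    ```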

  10. Assessing global vegetation activity using spatio-temporal Bayesian modelling

    Science.gov (United States)

    Mulder, Vera L.; van Eck, Christel M.; Friedlingstein, Pierre; Regnier, Pierre A. G.

    2016-04-01

This work demonstrates the potential of modelling vegetation activity using a hierarchical Bayesian spatio-temporal model. This approach allows modelling changes in vegetation and climate simultaneously in space and time. Changes of vegetation activity such as phenology are modelled as a dynamic process depending on climate variability in both space and time. Additionally, differences in observed vegetation status can be attributed to other abiotic ecosystem properties, e.g. soil and terrain properties. Although these properties do not change in time, they do change in space and may provide valuable information in addition to the climate dynamics. The spatio-temporal Bayesian models were calibrated at a regional scale because the local trends in space and time can be better captured by the model. The regional subsets were defined according to the SREX segmentation, as defined by the IPCC. Each region is considered to be relatively homogeneous in terms of large-scale climate and biomes, while still capturing small-scale (grid-cell level) variability. Modelling within these regions is hence expected to be less uncertain due to the absence of these large-scale patterns, compared to a global approach. This overall modelling approach allows the comparison of model behavior for the different regions and may provide insights on the main dynamic processes driving the interaction between vegetation and climate within different regions. The data employed in this study encompasses the global datasets for soil properties (SoilGrids), terrain properties (Global Relief Model based on SRTM DEM and ETOPO), monthly time series of satellite-derived vegetation indices (GIMMS NDVI3g) and climate variables (Princeton Meteorological Forcing Dataset). The findings proved the potential of a spatio-temporal Bayesian modelling approach for assessing vegetation dynamics at a regional scale. The observed interrelationships of the employed data and the different spatial and temporal trends support

  11. Sparse Bayesian learning machine for real-time management of reservoir releases

    Science.gov (United States)

    Khalil, Abedalrazq; McKee, Mac; Kemblowski, Mariush; Asefa, Tirusew

    2005-11-01

Water scarcity and uncertainties in forecasting future water availabilities present serious problems for basin-scale water management. These problems create a need for intelligent prediction models that learn and adapt to their environment in order to provide water managers with decision-relevant information related to the operation of river systems. This manuscript presents examples of state-of-the-art techniques for forecasting that combine excellent generalization properties and sparse representation within a Bayesian paradigm. The techniques are demonstrated as decision tools to enhance real-time water management. A relevance vector machine, which is a probabilistic model, has been used in an online fashion to provide confident forecasts given knowledge of some state and exogenous conditions. In practical applications, online algorithms should recognize changes in the input space and account for drift in system behavior. Support vector machines lend themselves particularly well to the detection of drift and hence to the initiation of adaptation in response to a recognized shift in system structure. The resulting model will normally have a structure and parameterization that suits the information content of the available data. The utility and practicality of this proposed approach have been demonstrated with an application in a real case study involving real-time operation of a reservoir in a river basin in southern Utah.

  12. Real time eye tracking using Kalman extended spatio-temporal context learning

    Science.gov (United States)

    Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu

    2017-06-01

    Real time eye tracking has numerous applications in human computer interaction such as a mouse cursor control in a computer system. It is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real time eye tracking system. Our proposed system is an extension of Spatio-Temporal context learning through Kalman Filtering. Spatio-Temporal Context Learning offers state of the art accuracy in general object tracking but its performance suffers due to object occlusion. Addition of the Kalman filter allows the proposed method to model the dynamics of the motion of the eye and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time by eye movements.
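
    The role the Kalman filter plays here is to carry a simple motion model through frames where the eye detection fails (blinks, glare), then correct it when a measurement returns. Below is a generic constant-velocity Kalman predict/update step in that spirit; the state layout, noise levels, and function name are assumptions for illustration, not the authors' exact formulation of Kalman-extended spatio-temporal context learning.

    ```python
    import numpy as np

    def kalman_step(x, P, z=None, dt=1.0, q=1e-2, r=4.0):
        """One predict/update cycle of a constant-velocity Kalman filter.

        x -- state [px, py, vx, vy]; P -- 4x4 state covariance
        z -- measured eye centre [px, py], or None when the eye is occluded (blink)
        """
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0]], dtype=float)
        Q = q * np.eye(4)  # process noise
        R = r * np.eye(2)  # measurement noise
        # Predict: propagate the motion model; this keeps tracking during occlusion.
        x = F @ x
        P = F @ P @ F.T + Q
        if z is not None:
            # Update: correct the prediction with the detected eye position.
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (np.asarray(z, dtype=float) - H @ x)
            P = (np.eye(4) - K @ H) @ P
        return x, P
    ```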

  13. Spatial and spatio-temporal bayesian models with R - INLA

    CERN Document Server

    Blangiardo, Marta

    2015-01-01

Dedication. Preface. 1 Introduction: 1.1 Why spatial and spatio-temporal statistics?; 1.2 Why do we use Bayesian methods for modelling spatial and spatio-temporal structures?; 1.3 Why INLA?; 1.4 Datasets. 2 Introduction to R: 2.1 The R language; 2.2 R objects; 2.3 Data and session management; 2.4 Packages; 2.5 Programming in R; 2.6 Basic statistical analysis with R. 3 Introduction to Bayesian Methods: 3.1 Bayesian Philosophy; 3.2 Basic Probability Elements; 3.3 Bayes Theorem; 3.4 Prior and Posterior Distributions; 3.5 Working with the Posterior Distribution; 3.6 Choosing the Prior Distr

  14. Mobile real-time EEG imaging Bayesian inference with sparse, temporally smooth source priors

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Hansen, Sofie Therese; Stahlhut, Carsten

    2013-01-01

    EEG based real-time imaging of human brain function has many potential applications including quality control, in-line experimental design, brain state decoding, and neuro-feedback. In mobile applications these possibilities are attractive as elements in systems for personal state monitoring...

  15. Improving a real-time object detector with compact temporal information

    DEFF Research Database (Denmark)

    Ahrnbom, Martin; Jensen, Morten Bornø; Åström, Kalle

    2017-01-01

Neural networks designed for real-time object detection have recently improved significantly, but in practice, looking at only a single RGB image at a time may not be ideal. For example, when detecting objects in videos, a foreground detection algorithm can be used to obtain compact temporal..., a problem this approach is well suited for. The accuracy was found to improve significantly (up to 66%), with a roughly 40% increase in computational time...

  16. Robust real-time pattern matching using bayesian sequential hypothesis testing.

    Science.gov (United States)

    Pele, Ofir; Werman, Michael

    2008-08-01

    This paper describes a method for robust real time pattern matching. We first introduce a family of image distance measures, the "Image Hamming Distance Family". Members of this family are robust to occlusion, small geometrical transforms, light changes and non-rigid deformations. We then present a novel Bayesian framework for sequential hypothesis testing on finite populations. Based on this framework, we design an optimal rejection/acceptance sampling algorithm. This algorithm quickly determines whether two images are similar with respect to a member of the Image Hamming Distance Family. We also present a fast framework that designs a near-optimal sampling algorithm. Extensive experimental results show that the sequential sampling algorithm performance is excellent. Implemented on a Pentium 4 3 GHz processor, detection of a pattern with 2197 pixels, in 640 x 480 pixel frames, where in each frame the pattern rotated and was highly occluded, proceeds at only 0.022 seconds per frame.
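
    The sequential testing idea sketched in this abstract is to sample pixels one at a time and stop as soon as the posterior over the mismatch rate is decisive, rather than comparing every pixel. The snippet below is a hedged illustration using a Beta-Binomial posterior over the per-pixel mismatch probability; the thresholds, prior, and sampling scheme are assumptions, not the paper's optimal rejection/acceptance algorithm for the Image Hamming Distance Family.

    ```python
    import numpy as np
    from scipy.stats import beta

    def sequential_match(pattern, window, thresh=0.2, p_decisive=0.99,
                         max_samples=300, rng=None):
        """Decide whether `window` matches `pattern` by sampling pixels sequentially.

        Pixel mismatches are modelled as Bernoulli draws with unknown rate; a Beta(1, 1)
        prior is updated after each sampled pixel, and the test stops once the posterior
        probability that the mismatch rate is below `thresh` is decisively high or low.
        """
        if rng is None:
            rng = np.random.default_rng()
        idx = rng.permutation(pattern.size)[:max_samples]
        mismatches, n = 0, 0
        p_similar = 0.5
        for i in idx:
            n += 1
            mismatches += int(pattern.flat[i] != window.flat[i])
            # Posterior over the mismatch rate is Beta(1 + mismatches, 1 + n - mismatches).
            p_similar = beta.cdf(thresh, 1 + mismatches, 1 + n - mismatches)
            if p_similar > p_decisive:
                return True   # accept: images are similar
            if p_similar < 1 - p_decisive:
                return False  # reject early, saving pixel comparisons
        return p_similar > 0.5  # fall back to the posterior decision after max_samples
    ```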

  17. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    Science.gov (United States)

    Gabrielian, Armen

    1991-01-01

Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.

  18. Real time bayesian estimation of the epidemic potential of emerging infectious diseases.

    Directory of Open Access Journals (Sweden)

    Luís M A Bettencourt

Full Text Available BACKGROUND: Fast changes in human demographics worldwide, coupled with increased mobility and modified land uses, make the threat of emerging infectious diseases increasingly important. Currently there is worldwide alert for H5N1 avian influenza becoming as transmissible in humans as seasonal influenza, and potentially causing a pandemic of unprecedented proportions. Here we show how epidemiological surveillance data for emerging infectious diseases can be interpreted in real time to assess changes in transmissibility with quantified uncertainty, and to perform running time predictions of new cases and guide logistics allocations. METHODOLOGY/PRINCIPAL FINDINGS: We develop an extension of standard epidemiological models, appropriate for emerging infectious diseases, that describes the probabilistic progression of case numbers due to the concurrent effects of (incipient) human transmission and multiple introductions from a reservoir. The model is cast in terms of surveillance observables and immediately suggests a simple graphical estimation procedure for the effective reproductive number R (mean number of cases generated by an infectious individual) of standard epidemics. For emerging infectious diseases, which typically show large relative case number fluctuations over time, we develop a Bayesian scheme for real time estimation of the probability distribution of the effective reproduction number and show how to use such inferences to formulate significance tests on future epidemiological observations. CONCLUSIONS/SIGNIFICANCE: Violations of these significance tests define statistical anomalies that may signal changes in the epidemiology of emerging diseases and should trigger further field investigation. We apply the methodology to case data from World Health Organization reports to place bounds on the current transmissibility of H5N1 influenza in humans and establish a statistical basis for monitoring its evolution in real time.
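
    A simple way to see how case counts translate into a probability distribution over the effective reproduction number R is a conjugate Gamma-Poisson update: daily new cases are treated as Poisson with mean R times the current infectious pressure, and a Gamma prior on R is updated sequentially. The sketch below captures only that ingredient, under assumed inputs and priors; it omits the reservoir-introduction term and other structure of the authors' model.

    ```python
    import numpy as np
    from scipy.stats import gamma

    def estimate_R(new_cases, infectious_pressure, a0=1.0, b0=1.0):
        """Sequential conjugate estimate of the effective reproduction number R.

        new_cases           -- daily counts of new cases
        infectious_pressure -- expected transmission opportunities per day (e.g. recent
                               cases weighted by the serial interval; assumed supplied)
        Model: cases_t ~ Poisson(R * pressure_t), with R ~ Gamma(a0, rate=b0).
        Returns the posterior mean and a 95% credible interval for R after each day.
        """
        a, b = a0, b0
        means, lows, highs = [], [], []
        for c, lam in zip(new_cases, infectious_pressure):
            a += c    # conjugate Gamma-Poisson update
            b += lam
            means.append(a / b)
            lows.append(gamma.ppf(0.025, a, scale=1.0 / b))
            highs.append(gamma.ppf(0.975, a, scale=1.0 / b))
        return np.array(means), np.array(lows), np.array(highs)
    ```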

  19. Analyzing Local Spatio-Temporal Patterns of Police Calls-for-Service Using Bayesian Integrated Nested Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Hui Luan

    2016-09-01

Full Text Available This research investigates spatio-temporal patterns of police calls-for-service in the Region of Waterloo, Canada, at a fine spatial and temporal resolution. Modeling was implemented via Bayesian Integrated Nested Laplace Approximation (INLA). Temporal patterns for two-hour time periods, spatial patterns at the small-area scale, and space-time interaction (i.e., unusual departures from overall spatial and temporal patterns) were estimated. Temporally, calls-for-service were found to be lowest in the early morning (02:00–03:59) and highest in the evening (20:00–21:59), while high levels of calls-for-service were spatially located in central business areas and in areas characterized by major roadways, universities, and shopping centres. Space-time interaction was observed to be geographically dispersed during daytime hours but concentrated in central business areas during evening hours. Interpreted through routine activity theory, results are discussed with respect to law enforcement resource demand and allocation, and the advantages of modeling spatio-temporal datasets with Bayesian INLA methods are highlighted.

  20. A Bayesian analysis of the unit root in real exchange rates

    NARCIS (Netherlands)

    P.C. Schotman (Peter); H.K. van Dijk (Herman)

    1991-01-01

We propose a posterior odds analysis of the hypothesis of a unit root in real exchange rates. From a Bayesian viewpoint the random walk hypothesis for real exchange rates is a posteriori as probable as a stationary AR(1) process for four out of eight time series investigated. The French

  1. Personal attitudes toward time: The relationship between temporal focus, space-time mappings and real life experiences.

    Science.gov (United States)

    Li, Heng; Cao, Yu

    2017-06-01

What influences how people implicitly associate "past" and "future" with "front" and "back"? Whereas previous research has shown that cultural attitudes toward time play a role in modulating space-time mappings in people's mental models (de la Fuente, Santiago, Román, Dumitrache & Casasanto, 2014), we investigated real life experiences as potential additional influences on these implicit associations. Participants within the same single culture, who were engaged in different intermediate-term educational experiences (Study 1), long-term living experiences (Study 2), and short-term visiting experiences (Study 3), showed distinct differences in temporal focus, thereby influencing their implicit spatializations of time. Results across samples suggest that personal attitudes toward time related to real life experiences may influence people's space-time mappings. The findings we report shed further light on the high flexibility of the human conceptualization system. While culture may exert an important influence on temporal focus, a person's conceptualization of time may be attributed to a culmination of factors. © 2017 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  2. Bayesian algorithm implementation in a real time exposure assessment model on benzene with calculation of associated cancer risks.

    Science.gov (United States)

    Sarigiannis, Dimosthenis A; Karakitsios, Spyros P; Gotti, Alberto; Papaloukas, Costas L; Kassomenos, Pavlos A; Pilidis, Georgios A

    2009-01-01

The objective of the current study was the development of a reliable modeling platform to calculate in real time the personal exposure and the associated health risk for filling station employees, evaluating current environmental parameters (traffic, meteorological conditions, and amount of fuel traded) determined by the appropriate sensor network. A set of Artificial Neural Networks (ANNs) was developed to predict the benzene exposure pattern for the filling station employees. Furthermore, a Physiology Based Pharmaco-Kinetic (PBPK) risk assessment model was developed in order to calculate the lifetime probability distribution of leukemia for the employees, fed by data obtained by the ANN model. A Bayesian algorithm was involved at crucial points of both model sub-compartments. The application was evaluated in two filling stations (one urban and one rural). Among several algorithms available for the development of the ANN exposure model, Bayesian regularization provided the best results and seemed to be a promising technique for prediction of the exposure pattern of that occupational population group. On assessing the estimated leukemia risk under the scope of providing a distribution curve based on the exposure levels and the different susceptibility of the population, the Bayesian algorithm was a prerequisite of the Monte Carlo approach, which is integrated in the PBPK-based risk model. In conclusion, the modeling system described herein is capable of exploiting the information collected by the environmental sensors in order to estimate in real time the personal exposure and the resulting health risk for employees of gasoline filling stations.

  3. Temporal abstraction and temporal Bayesian networks in clinical domains: a survey.

    Science.gov (United States)

    Orphanou, Kalia; Stassopoulou, Athena; Keravnou, Elpida

    2014-03-01

    Temporal abstraction (TA) of clinical data aims to abstract and interpret clinical data into meaningful higher-level interval concepts. Abstracted concepts are used for diagnostic, prediction and therapy planning purposes. On the other hand, temporal Bayesian networks (TBNs) are temporal extensions of the known probabilistic graphical models, Bayesian networks. TBNs can represent temporal relationships between events and their state changes, or the evolution of a process, through time. This paper offers a survey on techniques/methods from these two areas that were used independently in many clinical domains (e.g. diabetes, hepatitis, cancer) for various clinical tasks (e.g. diagnosis, prognosis). A main objective of this survey, in addition to presenting the key aspects of TA and TBNs, is to point out important benefits from a potential integration of TA and TBNs in medical domains and tasks. The motivation for integrating these two areas is their complementary function: TA provides clinicians with high level views of data while TBNs serve as a knowledge representation and reasoning tool under uncertainty, which is inherent in all clinical tasks. Key publications from these two areas of relevance to clinical systems, mainly circumscribed to the latest two decades, are reviewed and classified. TA techniques are compared on the basis of: (a) knowledge acquisition and representation for deriving TA concepts and (b) methodology for deriving basic and complex temporal abstractions. TBNs are compared on the basis of: (a) representation of time, (b) knowledge representation and acquisition, (c) inference methods and the computational demands of the network, and (d) their applications in medicine. The survey performs an extensive comparative analysis to illustrate the separate merits and limitations of various TA and TBN techniques used in clinical systems with the purpose of anticipating potential gains through an integration of the two techniques, thus leading to a

  4. Real-time digital angiocardiography using a temporal high-pass filter

    International Nuclear Information System (INIS)

    Hardin, C.W.; Kruger, R.A.; Anderson, F.L.; Bray, B.F.; Nelson, J.A.

    1984-01-01

    A temporal high-pass filtration technique for digital subtraction angiocardiography was studied, using real-time digital studies performed simultaneously with routine cineangiocardiography (cine) for qualitative image comparison. The digital studies showed increased contrast and suppression of background anatomy and also enhanced detection of wall motion abnormalities when compared with cine. The digital images are comparable with, and in some cases better than, cine images. Clinical efficacy of this digital technique is currently being evaluated

  5. Bayesian spatio-temporal modeling of particulate matter concentrations in Peninsular Malaysia

    Science.gov (United States)

    Manga, Edna; Awang, Norhashidah

    2016-06-01

This article presents an application of a Bayesian spatio-temporal Gaussian process (GP) model to particulate matter concentrations from Peninsular Malaysia. We analyze daily PM10 concentration levels from 35 monitoring sites in June and July 2011. The spatio-temporal model, set in a Bayesian hierarchical framework, allows for inclusion of informative covariates, meteorological variables and spatio-temporal interactions. Posterior density estimates of the model parameters are obtained by Markov chain Monte Carlo methods. Preliminary data analysis indicates that information on PM10 levels at sites classified as industrial locations could explain part of the space-time variations. We include the site-type indicator in our modeling efforts. Results of the parameter estimates for the fitted GP model show significant spatio-temporal structure and a positive effect of the location-type explanatory variable. We also compute some validation criteria for the out-of-sample sites that show the adequacy of the model for predicting PM10 at unmonitored sites.
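
    One common way to encode the kind of spatio-temporal dependence described here is a separable covariance: a spatial kernel over monitoring sites combined with a temporal kernel over days via a Kronecker product. The sketch below builds such a covariance matrix as a hedged illustration only; the exponential spatial kernel, AR(1)-style temporal kernel, and parameter values are assumptions, not the covariance structure used in the paper.

    ```python
    import numpy as np

    def spatial_kernel(coords, range_km=50.0, sill=1.0):
        """Exponential covariance between monitoring sites (coords in km)."""
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        return sill * np.exp(-d / range_km)

    def temporal_kernel(n_days, rho=0.8):
        """AR(1)-style correlation between daily observations."""
        lags = np.abs(np.arange(n_days)[:, None] - np.arange(n_days)[None, :])
        return rho ** lags

    def separable_space_time_cov(coords, n_days, nugget=0.1, rho=0.8,
                                 range_km=50.0, sill=1.0):
        """Separable spatio-temporal covariance: Kronecker product of time and space."""
        C = np.kron(temporal_kernel(n_days, rho),
                    spatial_kernel(coords, range_km, sill))
        return C + nugget * np.eye(C.shape[0])  # measurement-error nugget
    ```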

  6. Bayesian Algorithm Implementation in a Real Time Exposure Assessment Model on Benzene with Calculation of Associated Cancer Risks

    Directory of Open Access Journals (Sweden)

    Pavlos A. Kassomenos

    2009-02-01

Full Text Available The objective of the current study was the development of a reliable modeling platform to calculate in real time the personal exposure and the associated health risk for filling station employees, evaluating current environmental parameters (traffic, meteorological conditions, and amount of fuel traded) determined by the appropriate sensor network. A set of Artificial Neural Networks (ANNs) was developed to predict the benzene exposure pattern for the filling station employees. Furthermore, a Physiology Based Pharmaco-Kinetic (PBPK) risk assessment model was developed in order to calculate the lifetime probability distribution of leukemia for the employees, fed by data obtained by the ANN model. A Bayesian algorithm was involved at crucial points of both model sub-compartments. The application was evaluated in two filling stations (one urban and one rural). Among several algorithms available for the development of the ANN exposure model, Bayesian regularization provided the best results and seemed to be a promising technique for prediction of the exposure pattern of that occupational population group. On assessing the estimated leukemia risk under the scope of providing a distribution curve based on the exposure levels and the different susceptibility of the population, the Bayesian algorithm was a prerequisite of the Monte Carlo approach, which is integrated in the PBPK-based risk model. In conclusion, the modeling system described herein is capable of exploiting the information collected by the environmental sensors in order to estimate in real time the personal exposure and the resulting health risk for employees of gasoline filling stations.

  7. Implementation of a Flexible Bayesian Classifier for the Assessment of Patient’s Activities within a Real-time Personalized Mobile Application

    Directory of Open Access Journals (Sweden)

    V. Miskovic

    2017-02-01

Full Text Available This paper presents an implementation of a mobile application that provides a real-time personalized assessment of a patient's activities by using a Flexible Bayesian Classifier. The personalized assessment is derived from data collected from the 3-axial accelerometer sensor and the step-counting sensor, both widespread among today's mobile devices. Despite the fact that online mobile solutions with a Bayesian Classifier have been rare and insufficiently precise, we have shown that the accuracy of the proposed system within a defined data model is comparable to the accuracy of decision trees and neural networks.

  8. Real-Time Tracking of Selective Auditory Attention From M/EEG: A Bayesian Filtering Approach

    Science.gov (United States)

    Miran, Sina; Akram, Sahar; Sheikhattar, Alireza; Simon, Jonathan Z.; Zhang, Tao; Babadi, Behtash

    2018-01-01

    Humans are able to identify and track a target speaker amid a cacophony of acoustic interference, an ability which is often referred to as the cocktail party phenomenon. Results from several decades of studying this phenomenon have culminated in recent years in various promising attempts to decode the attentional state of a listener in a competing-speaker environment from non-invasive neuroimaging recordings such as magnetoencephalography (MEG) and electroencephalography (EEG). To this end, most existing approaches compute correlation-based measures by either regressing the features of each speech stream to the M/EEG channels (the decoding approach) or vice versa (the encoding approach). To produce robust results, these procedures require multiple trials for training purposes. Also, their decoding accuracy drops significantly when operating at high temporal resolutions. Thus, they are not well-suited for emerging real-time applications such as smart hearing aid devices or brain-computer interface systems, where training data might be limited and high temporal resolutions are desired. In this paper, we close this gap by developing an algorithmic pipeline for real-time decoding of the attentional state. Our proposed framework consists of three main modules: (1) Real-time and robust estimation of encoding or decoding coefficients, achieved by sparse adaptive filtering, (2) Extracting reliable markers of the attentional state, and thereby generalizing the widely-used correlation-based measures thereof, and (3) Devising a near real-time state-space estimator that translates the noisy and variable attention markers to robust and statistically interpretable estimates of the attentional state with minimal delay. Our proposed algorithms integrate various techniques including forgetting factor-based adaptive filtering, ℓ1-regularization, forward-backward splitting algorithms, fixed-lag smoothing, and Expectation Maximization. We validate the performance of our proposed
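
    The first module listed in this abstract, forgetting factor-based adaptive filtering, can be illustrated with plain exponentially weighted recursive least squares: old samples are progressively down-weighted so the decoder coefficients can track a drifting attentional state. The class below is a hedged sketch of that one component only; the ℓ1 sparsity term, forward-backward splitting, and state-space smoothing from the paper are intentionally omitted, and the names and default parameters are assumptions.

    ```python
    import numpy as np

    class ForgettingRLS:
        """Exponentially weighted recursive least squares for decoder coefficients.

        Tracks time-varying weights w such that y_t ≈ w · x_t, down-weighting old
        samples with forgetting factor lam (closer to 1 means longer memory).
        """
        def __init__(self, n_features, lam=0.98, delta=1e2):
            self.w = np.zeros(n_features)
            self.P = delta * np.eye(n_features)  # inverse correlation matrix estimate
            self.lam = lam

        def update(self, x, y):
            x = np.asarray(x, dtype=float)
            Px = self.P @ x
            k = Px / (self.lam + x @ Px)   # gain vector
            err = y - self.w @ x           # a priori prediction error
            self.w = self.w + k * err
            self.P = (self.P - np.outer(k, Px)) / self.lam
            return self.w
    ```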

  9. Real-Time Tracking of Selective Auditory Attention From M/EEG: A Bayesian Filtering Approach

    Directory of Open Access Journals (Sweden)

    Sina Miran

    2018-05-01

Full Text Available Humans are able to identify and track a target speaker amid a cacophony of acoustic interference, an ability which is often referred to as the cocktail party phenomenon. Results from several decades of studying this phenomenon have culminated in recent years in various promising attempts to decode the attentional state of a listener in a competing-speaker environment from non-invasive neuroimaging recordings such as magnetoencephalography (MEG) and electroencephalography (EEG). To this end, most existing approaches compute correlation-based measures by either regressing the features of each speech stream to the M/EEG channels (the decoding approach) or vice versa (the encoding approach). To produce robust results, these procedures require multiple trials for training purposes. Also, their decoding accuracy drops significantly when operating at high temporal resolutions. Thus, they are not well-suited for emerging real-time applications such as smart hearing aid devices or brain-computer interface systems, where training data might be limited and high temporal resolutions are desired. In this paper, we close this gap by developing an algorithmic pipeline for real-time decoding of the attentional state. Our proposed framework consists of three main modules: (1) Real-time and robust estimation of encoding or decoding coefficients, achieved by sparse adaptive filtering, (2) Extracting reliable markers of the attentional state, and thereby generalizing the widely-used correlation-based measures thereof, and (3) Devising a near real-time state-space estimator that translates the noisy and variable attention markers to robust and statistically interpretable estimates of the attentional state with minimal delay. Our proposed algorithms integrate various techniques including forgetting factor-based adaptive filtering, ℓ1-regularization, forward-backward splitting algorithms, fixed-lag smoothing, and Expectation Maximization. We validate the performance of our

  10. Towards Real-time, On-board, Hardware-Supported Sensor and Software Health Management for Unmanned Aerial Systems

    Science.gov (United States)

    Schumann, Johann; Rozier, Kristin Y.; Reinbacher, Thomas; Mengshoel, Ole J.; Mbaya, Timmy; Ippolito, Corey

    2013-01-01

Unmanned aerial systems (UASs) can only be deployed if they can effectively complete their missions and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. In this paper, we design a real-time, on-board system health management (SHM) capability to continuously monitor sensors, software, and hardware components for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and/or software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power realization using Field Programmable Gate Arrays (FPGAs) that avoids overburdening limited computing resources or costly re-certification of flight software due to instrumentation. Our implementation provides a novel approach of combining modular building blocks, integrating responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. We demonstrate this approach using actual data from the NASA Swift UAS, an experimental all-electric aircraft.

  11. Parametric spectro-temporal analyzer (PASTA) for real-time optical spectrum observation

    Science.gov (United States)

    Zhang, Chi; Xu, Jianbing; Chui, P. C.; Wong, Kenneth K. Y.

    2013-06-01

Real-time optical spectrum analysis is an essential tool in observing ultrafast phenomena, such as the dynamic monitoring of spectrum evolution. However, conventional methods such as optical spectrum analyzers disperse the spectrum in space and allocate it in time sequence by mechanical rotation of a grating, so they are incapable of operating at high speed. A more recent method all-optically stretches the spectrum in the time domain, but is limited by the allowable input condition. In view of these constraints, here we present a real-time spectrum analyzer called the parametric spectro-temporal analyzer (PASTA), which is based on the time-lens focusing mechanism. It achieves a frame rate as high as 100 MHz and accommodates various input conditions. As a proof of concept, and also for the first time, we verify its applications in observing the dynamic spectrum of a Fourier domain mode-locked laser and the spectrum evolution of a laser cavity during its stabilizing process.

  12. Bayesian calibration of simultaneity in audiovisual temporal order judgments.

    Directory of Open Access Journals (Sweden)

    Shinya Yamamoto

Full Text Available After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.

  13. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases, while estimating the dynamic changes of the temporal correlations and non-stationarity are the keys in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge for inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant and include temporal correlation structures in the covariance matrix estimations in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models with Markov chain Monte Carlo and Gibbs sampling algorithms are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to multiple-tissue (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approaches could be expanded and applied to other large scale genomic data, such as next generation sequencing (NGS) data combined with real-time and time-varying electronic health records (EHR), for more comprehensive and robust systematic and network based analysis in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.

  14. A BAYESIAN SPATIAL AND TEMPORAL MODELING APPROACH TO MAPPING GEOGRAPHIC VARIATION IN MORTALITY RATES FOR SUBNATIONAL AREAS WITH R-INLA.

    Science.gov (United States)

    Khana, Diba; Rossen, Lauren M; Hedegaard, Holly; Warner, Margaret

    2018-01-01

Hierarchical Bayes models have been used in disease mapping to examine small-scale geographic variation. State-level geographic variation for less common causes of mortality has been reported; however, county-level variation is rarely examined. Due to concerns about statistical reliability and confidentiality, county-level mortality rates based on fewer than 20 deaths are suppressed based on Division of Vital Statistics, National Center for Health Statistics (NCHS) statistical reliability criteria, precluding an examination of spatio-temporal variation in less common causes of mortality such as suicide rates (SRs) at the county level using direct estimates. Existing Bayesian spatio-temporal modeling strategies can be applied via Integrated Nested Laplace Approximation (INLA) in R to a large number of rare mortality outcomes to enable examination of spatio-temporal variation on smaller geographic scales such as counties. This method allows examination of spatio-temporal variation across the entire U.S., even where the data are sparse. We used mortality data from 2005-2015 to explore spatio-temporal variation in SRs, as one particular application of the Bayesian spatio-temporal modeling strategy in R-INLA to predict year- and county-specific SRs. Specifically, hierarchical Bayesian spatio-temporal models were implemented with spatially structured and unstructured random effects, correlated time effects, time-varying confounders and space-time interaction terms in the software R-INLA, borrowing strength across both counties and years to produce smoothed county-level SRs. Model-based estimates of SRs were mapped to explore geographic variation.

  15. A Bayesian spatio-temporal geostatistical model with an auxiliary lattice for large datasets

    KAUST Repository

    Xu, Ganggang

    2015-01-01

    When spatio-temporal datasets are large, the computational burden can lead to failures in the implementation of traditional geostatistical tools. In this paper, we propose a computationally efficient Bayesian hierarchical spatio-temporal model in which the spatial dependence is approximated by a Gaussian Markov random field (GMRF) while the temporal correlation is described using a vector autoregressive model. By introducing an auxiliary lattice on the spatial region of interest, the proposed method is not only able to handle irregularly spaced observations in the spatial domain, but it is also able to bypass the missing data problem in a spatio-temporal process. Because the computational complexity of the proposed Markov chain Monte Carlo algorithm is of the order O(n) with n the total number of observations in space and time, our method can be used to handle very large spatio-temporal datasets with reasonable CPU times. The performance of the proposed model is illustrated using simulation studies and a dataset of precipitation data from the coterminous United States.

  16. A real-time architecture for time-aware agents.

    Science.gov (United States)

    Prouskas, Konstantinos-Vassileios; Pitt, Jeremy V

    2004-06-01

This paper describes the specification and implementation of a new three-layer time-aware agent architecture. This architecture is designed for applications and environments where societies of humans and agents play equally active roles, but interact and operate in completely different time frames. The architecture consists of three layers: the April real-time run-time (ART) layer, the time aware layer (TAL), and the application agents layer (AAL). The ART layer forms the underlying real-time agent platform. An original online, real-time, dynamic priority-based scheduling algorithm is described for scheduling the computation time of agent processes, and it is shown that the algorithm's O(n) complexity and scalable performance are sufficient for application in real-time domains. The TAL layer forms an abstraction layer through which human and agent interactions are temporally unified, that is, handled in a common way irrespective of their temporal representation and scale. A novel O(n²) interaction scheduling algorithm is described for predicting and guaranteeing interactions' initiation and completion times. The time-aware predicting component of a workflow management system is also presented as an instance of the AAL layer. The described time-aware architecture addresses two key challenges in enabling agents to be effectively configured and applied in environments where humans and agents play equally active roles. It provides flexibility and adaptability in its real-time mechanisms while placing them under direct agent control, and it temporally unifies human and agent interactions.

  17. Comparison of Co-Temporal Modeling Algorithms on Sparse Experimental Time Series Data Sets.

    Science.gov (United States)

    Allen, Edward E; Norris, James L; John, David J; Thomas, Stan J; Turkett, William H; Fetrow, Jacquelyn S

    2010-01-01

    Multiple approaches for reverse-engineering biological networks from time-series data have been proposed in the computational biology literature. These approaches can be classified by their underlying mathematical algorithms, such as Bayesian or algebraic techniques, as well as by their time paradigm, which includes next-state and co-temporal modeling. The types of biological relationships, such as parent-child or siblings, discovered by these algorithms are quite varied. It is important to understand the strengths and weaknesses of the various algorithms and time paradigms on actual experimental data. We assess how well the co-temporal implementations of three algorithms, continuous Bayesian, discrete Bayesian, and computational algebraic, can 1) identify two types of entity relationships, parent and sibling, between biological entities, 2) deal with experimental sparse time course data, and 3) handle experimental noise seen in replicate data sets. These algorithms are evaluated, using the shuffle index metric, for how well the resulting models match literature models in terms of siblings and parent relationships. Results indicate that all three co-temporal algorithms perform well, at a statistically significant level, at finding sibling relationships, but perform relatively poorly in finding parent relationships.

  18. Applications of Temporal Graph Metrics to Real-World Networks

    Science.gov (United States)

    Tang, John; Leontiadis, Ilias; Scellato, Salvatore; Nicosia, Vincenzo; Mascolo, Cecilia; Musolesi, Mirco; Latora, Vito

Real world networks exhibit rich temporal information: friends are added and removed over time in online social networks; the seasons dictate the predator-prey relationship in food webs; and the propagation of a virus depends on the network of human contacts throughout the day. Recent studies have demonstrated that static network analysis is perhaps unsuitable in the study of real world networks, since static paths ignore time order, which, in turn, results in static shortest paths overestimating available links and underestimating their true corresponding lengths. Temporal extensions to centrality and efficiency metrics based on temporal shortest paths have also been proposed. Firstly, we analyse the roles of key individuals of a corporate network ranked according to temporal centrality within the context of a bankruptcy scandal; secondly, we present how such temporal metrics can be used to study the robustness of temporal networks in presence of random errors and intelligent attacks; thirdly, we study containment schemes for mobile phone malware which can spread via short range radio, similar to biological viruses; finally, we study how the temporal network structure of human interactions can be exploited to effectively immunise human populations. Through these applications we demonstrate that temporal metrics provide a more accurate and effective analysis of real-world networks compared to their static counterparts.

  19. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    Science.gov (United States)

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks-three core neurocognitive systems with central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal

  20. A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.

    Science.gov (United States)

    Houseman, E Andres; Virji, M Abbas

    2017-08-01

Direct-reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to limit-of-detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Simulation studies with the percent of measurements below the LOD ranging from 0 to 50% showed the lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates
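
    The key censoring device here, integrating over the left tail of the distribution below the LOD, amounts to letting each non-detect contribute a CDF term rather than a density term to the likelihood. The snippet below shows that single ingredient for a log-normal exposure distribution; it is a hedged, non-Bayesian illustration with assumed names, not the paper's spline-based autocorrelated model.

    ```python
    import numpy as np
    from scipy.stats import norm

    def censored_lognormal_loglik(mu, sigma, observed, n_below_lod, lod):
        """Log-likelihood of log-normal exposure data with a limit of detection.

        observed    -- concentrations measured above the LOD
        n_below_lod -- number of readings reported as '<LOD'
        Detected values contribute the usual log-normal density; non-detects contribute
        the probability mass of the left tail, i.e. the CDF evaluated at log(LOD).
        """
        logged = np.log(np.asarray(observed, dtype=float))
        ll_detects = norm.logpdf(logged, mu, sigma).sum() - logged.sum()  # Jacobian term
        ll_censored = n_below_lod * norm.logcdf(np.log(lod), mu, sigma)
        return ll_detects + ll_censored
    ```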

  1. Identifying food deserts and swamps based on relative healthy food access: a spatio-temporal Bayesian approach.

    Science.gov (United States)

    Luan, Hui; Law, Jane; Quick, Matthew

    2015-12-30

    Obesity and other adverse health outcomes are influenced by individual- and neighbourhood-scale risk factors, including the food environment. At the small-area scale, past research has analysed spatial patterns of food environments for one time period, overlooking how food environments change over time. Further, past research has infrequently analysed relative healthy food access (RHFA), a measure that is more representative of food purchasing and consumption behaviours than absolute outlet density. This research applies a Bayesian hierarchical model to analyse the spatio-temporal patterns of RHFA in the Region of Waterloo, Canada, from 2011 to 2014 at the small-area level. RHFA is calculated as the proportion of healthy food outlets (healthy outlets/(healthy + unhealthy outlets)) within 4 km of each small-area. This model measures spatial autocorrelation of RHFA, the temporal trend of RHFA for the study region, and spatio-temporal trends of RHFA for small-areas. For the study region, a significant decreasing trend in RHFA is observed (-0.024), suggesting that food swamps have become more prevalent during the study period. Significant decreasing temporal trends in RHFA were observed for all small-areas. Specific small-areas located in south Waterloo, north Kitchener, and southeast Cambridge exhibited the steepest decreasing spatio-temporal trends and are classified as spatio-temporal food swamps. This research demonstrates a Bayesian spatio-temporal modelling approach to analyse RHFA at the small-area scale. Results suggest that food swamps are more prevalent than food deserts in the Region of Waterloo. Analysing spatio-temporal trends of RHFA improves understanding of the local food environment, highlighting specific small-areas where policies should be targeted to increase RHFA and reduce risk factors of adverse health outcomes such as obesity.
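
    A hedged sketch of the RHFA measure defined above: the share of healthy outlets among all outlets within 4 km of each small-area centroid. Coordinates are treated as planar metres for simplicity; the study's actual distance handling and the Bayesian model itself are not reproduced.

```python
import numpy as np

def rhfa(area_xy, healthy_xy, unhealthy_xy, radius_m=4000.0):
    def count_within(points, centre):
        d = np.linalg.norm(points - centre, axis=1)
        return int((d <= radius_m).sum())

    scores = []
    for centre in area_xy:
        h = count_within(healthy_xy, centre)
        u = count_within(unhealthy_xy, centre)
        scores.append(h / (h + u) if (h + u) > 0 else np.nan)
    return np.array(scores)

areas = np.array([[0.0, 0.0], [5000.0, 0.0]])
healthy = np.array([[100.0, 0.0], [4500.0, 0.0]])
unhealthy = np.array([[200.0, 0.0], [300.0, 0.0], [5200.0, 0.0]])
print(rhfa(areas, healthy, unhealthy))   # -> [0.333..., 0.5]
```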

  2. Internal representations of temporal statistics and feedback calibrate motor-sensory interval timing.

    Directory of Open Access Journals (Sweden)

    Luigi Acerbi

    Full Text Available Humans have been shown to adapt to the temporal statistics of timing tasks so as to optimize the accuracy of their responses, in agreement with the predictions of Bayesian integration. This suggests that they build an internal representation of both the experimentally imposed distribution of time intervals (the prior) and of the error (the loss function). The responses of a Bayesian ideal observer depend crucially on these internal representations, which have only been previously studied for simple distributions. To study the nature of these representations we asked subjects to reproduce time intervals drawn from underlying temporal distributions of varying complexity, from uniform to highly skewed or bimodal, while also varying the error mapping that determined the performance feedback. Interval reproduction times were affected by both the distribution and feedback, in good agreement with a performance-optimizing Bayesian observer and actor model. Bayesian model comparison highlighted that subjects were integrating the provided feedback and represented the experimental distribution with a smoothed approximation. A nonparametric reconstruction of the subjective priors from the data shows that they are generally in agreement with the true distributions up to third-order moments, but with systematically heavier tails. In particular, higher-order statistical features (kurtosis, multimodality) seem much harder to acquire. Our findings suggest that humans have only minor constraints on learning lower-order statistical properties of unimodal (including peaked and skewed) distributions of time intervals under the guidance of corrective feedback, and that their behavior is well explained by Bayesian decision theory.
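
    A minimal grid sketch of the Bayesian observer idea above: the reproduced interval is the posterior mean of the true interval given a noisy measurement, combining an experimentally imposed prior with a Gaussian measurement likelihood. All numbers are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.stats import norm

t = np.linspace(0.3, 1.5, 601)                     # candidate intervals (s)
prior = norm.pdf(t, loc=0.75, scale=0.15)          # stand-in for the imposed distribution
prior /= prior.sum()

def bayes_estimate(t_measured, sensory_noise=0.08):
    likelihood = norm.pdf(t_measured, loc=t, scale=sensory_noise)
    posterior = prior * likelihood
    posterior /= posterior.sum()
    return float(np.sum(t * posterior))            # posterior mean: optimal under squared-error loss

print(bayes_estimate(1.0))   # pulled toward the prior mean, i.e. < 1.0
```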

  3. A Bayesian network based framework for real-time crash prediction on the basic freeway segments of urban expressways.

    Science.gov (United States)

    Hossain, Moinul; Muromachi, Yasunori

    2012-03-01

    The concept of measuring the crash risk for a very short time window in the near future is gaining more practicality due to the recent advancements in the fields of information systems and traffic sensor technology. Although some real-time crash prediction models have already been proposed, they are still primitive in nature and require substantial improvements to be implemented in real life. This manuscript investigates the major shortcomings of the existing models and offers solutions to overcome them with an improved framework and modeling method. It employs a random multinomial logit model to identify the most important predictors as well as the most suitable detector locations to acquire data to build such a model. Afterwards, it applies a Bayesian belief net (BBN) to build the real-time crash prediction model. The model has been constructed using high resolution detector data collected from Shibuya 3 and Shinjuku 4 expressways under the jurisdiction of Tokyo Metropolitan Expressway Company Limited, Japan. It has been specifically built for the basic freeway segments and it predicts the chance of formation of a hazardous traffic condition within the next 4-9 min for a particular 250-m-long road section. The performance evaluation results reflect that at an average threshold value the model is able to successfully classify 66% of the future crashes with a false alarm rate of less than 20%. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. The multi temporal/multi-model approach to predictive uncertainty assessment in real-time flood forecasting

    Science.gov (United States)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio

    2017-08-01

    This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating its improvement over the single-temporal approach. The study is framed in the context of probabilistic Bayesian decision-making, which is appropriate for taking rational decisions about uncertain future outcomes. As opposed to the direct use of deterministic forecasts, the probabilistic forecast identifies a predictive probability density function that represents fundamental knowledge about future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead-time and when, more likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operative deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected on a time series of six years. MCP-MT improves over the original models' forecasts: the peak overestimation and the delayed rising-limb forecast, which characterize MISDc and STAFOM-RCM respectively, are significantly mitigated, with a reduced mean error on peak stage from 45 to 5 cm and an increased coefficient of persistence from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.
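
    A hedged sketch of the coefficient of persistence quoted above (0.53 to 0.75): it compares the forecast errors against the naive "persistence" forecast that the stage stays at its last observed value over the lead time. The exact formulation used with MCP-MT may differ in detail; the data here are invented.

```python
import numpy as np

def coefficient_of_persistence(obs, forecast, lead_steps):
    obs, forecast = np.asarray(obs, float), np.asarray(forecast, float)
    err_model = np.sum((obs[lead_steps:] - forecast[lead_steps:]) ** 2)
    err_persist = np.sum((obs[lead_steps:] - obs[:-lead_steps]) ** 2)   # naive forecast
    return 1.0 - err_model / err_persist

obs = np.array([1.0, 1.2, 1.5, 1.9, 2.4, 2.8, 3.0])
fcst = np.array([1.0, 1.1, 1.4, 1.8, 2.3, 2.7, 3.0])
print(coefficient_of_persistence(obs, fcst, lead_steps=1))   # close to 1 for a skilful forecast
```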

  5. Single breath-hold real-time cine MR imaging: improved temporal resolution using generalized autocalibrating partially parallel acquisition (GRAPPA) algorithm

    International Nuclear Information System (INIS)

    Wintersperger, Bernd J.; Nikolaou, Konstantin; Dietrich, Olaf; Reiser, Maximilian F.; Schoenberg, Stefan O.; Rieber, Johannes; Nittka, Matthias

    2003-01-01

    The purpose of this study was to test parallel imaging techniques for improvement of temporal resolution in multislice single breath-hold real-time cine steady-state free precession (SSFP) in comparison with standard segmented single-slice SSFP techniques. Eighteen subjects were examined on a 1.5-T scanner with a multislice real-time cine SSFP technique using the GRAPPA algorithm. Global left ventricular parameters (EDV, ESV, SV, EF) were evaluated and results compared with a standard segmented single-slice SSFP technique. Results for EDV (r=0.93), ESV (r=0.99), SV (r=0.83), and EF (r=0.99) of real-time multislice SSFP imaging showed a high correlation with results of segmented SSFP acquisitions. Systematic differences between both techniques were statistically non-significant. Single breath-hold multislice techniques using GRAPPA allow for improvement of temporal resolution and for accurate assessment of global left ventricular functional parameters. (orig.)

  6. Improving the effectiveness of real-time flood forecasting through Predictive Uncertainty estimation: the multi-temporal approach

    Science.gov (United States)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio

    2015-04-01

    The negative effects of severe flood events are usually contrasted through structural measures that, however, do not fully eliminate flood risk. Non-structural measures, such as real-time flood forecasting and warning, are also required. Accurate stage/discharge future predictions with appropriate forecast lead-time are sought by decision-makers for implementing strategies to mitigate the adverse effects of floods. Traditionally, flood forecasting has been approached by using rainfall-runoff and/or flood routing modelling. Indeed, neither type of forecast can be considered a perfect representation of future outcomes because of the lack of complete knowledge of the processes involved (Todini, 2004). Nonetheless, although aware that model forecasts do not perfectly represent future outcomes, decision makers de facto implicitly treat the forecast of water level/discharge/volume, etc. as "deterministic" and as coinciding with what is going to occur. Recently the concept of Predictive Uncertainty (PU) was introduced in hydrology (Krzysztofowicz, 1999), and several uncertainty processors were developed (Todini, 2008). PU is defined as the probability of occurrence of the future realization of a predictand (water level/discharge/volume) conditional on: i) prior observations and knowledge, ii) the available information obtained on the future value, typically provided by one or more forecast models. Unfortunately, PU has frequently been interpreted as a measure of lack of accuracy rather than as the appropriate tool for taking the best decisions, given one or several models' forecasts. With the aim of shedding light on the benefits of appropriately using PU, a multi-temporal approach based on the MCP (Todini, 2008; Coccia and Todini, 2011) is applied here to stage forecasts at sites along the Upper Tiber River. Specifically, the STAge Forecasting-Rating Curve Model Muskingum-based (STAFOM-RCM) (Barbetta et al., 2014) along with the Rating

  7. Bayesian inference for multivariate point processes observed at sparsely distributed times

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl; Møller, Jesper; Aukema, B.H.

    We consider statistical and computational aspects of simulation-based Bayesian inference for a multivariate point process which is only observed at sparsely distributed times. For specificity we consider a particular data set which has earlier been analyzed by a discrete time model involving unknown...... normalizing constants. We discuss the advantages and disadvantages of using continuous time processes compared to discrete time processes in the setting of the present paper as well as other spatial-temporal situations. Keywords: Bark beetle, conditional intensity, forest entomology, Markov chain Monte Carlo...

  8. Real-Time Control of a Video Game Using Eye Movements and Two Temporal EEG Sensors.

    Science.gov (United States)

    Belkacem, Abdelkader Nasreddine; Saetia, Supat; Zintus-art, Kalanyu; Shin, Duk; Kambara, Hiroyuki; Yoshimura, Natsue; Berrached, Nasreddine; Koike, Yasuharu

    2015-01-01

    EEG-controlled gaming applications range widely from strictly medical to completely nonmedical applications. Games can provide not only entertainment but also strong motivation for practicing, thereby achieving better control with a rehabilitation system. In this paper we present real-time control of a video game with eye movements for an asynchronous and noninvasive communication system using two temporal EEG sensors. We used wavelets to detect the instance of eye movement and time-series characteristics to distinguish between six classes of eye movement. A control interface was developed to test the proposed algorithm in real-time experiments with opened and closed eyes. Using visual feedback, a mean classification accuracy of 77.3% was obtained for control with six commands, and a mean classification accuracy of 80.2% was obtained using auditory feedback for control with five commands. The algorithm was then applied for controlling the direction and speed of character movement in a two-dimensional video game. Results showed that the proposed algorithm had an efficient response speed and timing with a bit rate of 30 bits/min, demonstrating its efficacy and robustness in real-time control.
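
    A hedged sketch of how a "bits per minute" figure like the 30 bits/min above is commonly obtained for BCI-style interfaces, using the standard Wolpaw information-transfer-rate formula. Whether the cited study used exactly this formula, and the assumed selection rate, are assumptions of this sketch.

```python
import math

def wolpaw_itr_bits_per_min(n_classes, accuracy, selections_per_min):
    """Wolpaw ITR: log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)), scaled by selection rate."""
    p, n = accuracy, n_classes
    bits_per_selection = (
        math.log2(n)
        + p * math.log2(p)
        + (1 - p) * math.log2((1 - p) / (n - 1))
    )
    return bits_per_selection * selections_per_min

# 6 commands at 77.3% accuracy, with an assumed 20 selections per minute
print(wolpaw_itr_bits_per_min(n_classes=6, accuracy=0.773, selections_per_min=20))
```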

  9. A spatio-temporal nonparametric Bayesian variable selection model of fMRI data for clustering correlated time courses.

    Science.gov (United States)

    Zhang, Linlin; Guindani, Michele; Versace, Francesco; Vannucci, Marina

    2014-07-15

    In this paper we present a novel wavelet-based Bayesian nonparametric regression model for the analysis of functional magnetic resonance imaging (fMRI) data. Our goal is to provide a joint analytical framework that allows us to detect regions of the brain which exhibit neuronal activity in response to a stimulus and, simultaneously, to infer the association, or clustering, of spatially remote voxels that exhibit fMRI time series with similar characteristics. We start by modeling the data with a hemodynamic response function (HRF) with a voxel-dependent shape parameter. We detect regions of the brain activated in response to a given stimulus by using mixture priors with a spike at zero on the coefficients of the regression model. We account for the complex spatial correlation structure of the brain by using a Markov random field (MRF) prior on the parameters guiding the selection of the activated voxels, therefore capturing correlation among nearby voxels. In order to infer association of the voxel time courses, we assume correlated errors, in particular long memory, and exploit the whitening properties of discrete wavelet transforms. Furthermore, we achieve clustering of the voxels by imposing a Dirichlet process (DP) prior on the parameters of the long memory process. For inference, we use Markov Chain Monte Carlo (MCMC) sampling techniques that combine Metropolis-Hastings schemes employed in Bayesian variable selection with sampling algorithms for nonparametric DP models. We explore the performance of the proposed model on simulated data, with both block- and event-related design, and on real fMRI data. Copyright © 2014 Elsevier Inc. All rights reserved.
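
    An illustrative sketch of the "mixture prior with a spike at zero" mentioned above, taken out of the full spatio-temporal model: each regression coefficient is exactly zero with some prior probability and drawn from a diffuse normal "slab" otherwise. All hyperparameters here are assumptions for illustration.

```python
import numpy as np

def sample_spike_and_slab(n_coef, pi_active=0.2, slab_sd=2.0, seed=0):
    rng = np.random.default_rng(seed)
    active = rng.random(n_coef) < pi_active            # voxel "activated" or not
    beta = np.where(active, rng.normal(0.0, slab_sd, n_coef), 0.0)   # spike at zero otherwise
    return beta, active

beta, active = sample_spike_and_slab(10)
print(active.astype(int), np.round(beta, 2))
```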

  10. Real-time Fatigue and Free-Living Physical Activity in Hematopoietic Stem Cell Transplantation Cancer Survivors and Healthy Controls: A Preliminary Examination of the Temporal, Dynamic Relationship.

    Science.gov (United States)

    Hacker, Eileen Danaher; Kim, Inah; Park, Chang; Peters, Tara

    Fatigue and physical inactivity, critical problems facing cancer survivors, impact overall health and functioning. Our group designed a novel methodology to evaluate the temporal, dynamic patterns in real-world settings. Using real-time technology, the temporal, dynamic relationship between real-time fatigue and free-living physical activity is described and compared in cancer survivors who were treated with hematopoietic stem cell transplantation (n = 25) and age- and gender-matched healthy controls (n = 25). Subjects wore wrist actigraphs on their nondominant hand to assess free-living physical activity, measured in 1-minute epochs, over 7 days. Subjects entered real-time fatigue assessments directly into the subjective event marker of the actigraph 5 times per day. Running averages of mean 1-minute activity counts 30, 60, and 120 minutes before and after each real-time fatigue score were correlated with real-time fatigue using generalized estimating equations. A strong inverse relationship exists between real-time fatigue and subsequent free-living physical activity. This inverse relationship suggests that increasing real-time fatigue limits subsequent physical activity (B range = -0.002 to -0.004; P < .001). No significant differences in the dynamic patterns of real-time fatigue and free-living physical activity were found between groups. To our knowledge, this is the first study to document the temporal and potentially causal relationship between real-time fatigue and free-living physical activity in a real-world setting. These findings suggest that fatigue drives subsequent physical activity and that the relationship may not be bidirectional. Understanding the temporal, dynamic relationship may have important health implications for developing interventions to address fatigue in cancer survivors.

  11. Spike-Based Bayesian-Hebbian Learning of Temporal Sequences

    DEFF Research Database (Denmark)

    Tully, Philip J; Lindén, Henrik; Hennig, Matthias H

    2016-01-01

    Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed...... in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods...
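
    A minimal sketch of the Bayesian Confidence Propagation Neural Network (BCPNN) weight and bias form referred to above: weights are log-odds of co-activation probabilities and biases are log activation probabilities, estimated here from binary activity traces. The spike-based dynamics, trace smoothing, and intrinsic-excitability updates of the cited model are omitted; the small epsilons simply guard against log(0).

```python
import numpy as np

def bcpnn_weights(activity, eps=1e-4):
    """activity: (T, N) binary array of unit activations over time."""
    p_i = activity.mean(axis=0) + eps                     # P(x_i = 1)
    p_ij = (activity.T @ activity) / len(activity) + eps  # P(x_i = 1, x_j = 1)
    weights = np.log(p_ij / np.outer(p_i, p_i))           # w_ij = log P_ij / (P_i P_j)
    bias = np.log(p_i)                                    # beta_i = log P_i
    return weights, bias

rng = np.random.default_rng(1)
acts = (rng.random((1000, 5)) < 0.2).astype(float)
w, b = bcpnn_weights(acts)
print(w.shape, b.shape)
```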

  12. High spatial and temporal resolution retrospective cine cardiovascular magnetic resonance from shortened free breathing real-time acquisitions.

    Science.gov (United States)

    Xue, Hui; Kellman, Peter; Larocca, Gina; Arai, Andrew E; Hansen, Michael S

    2013-11-14

    Cine cardiovascular magnetic resonance (CMR) is challenging in patients who cannot perform repeated breath holds. Real-time, free-breathing acquisition is an alternative, but image quality is typically inferior. There is a clinical need for techniques that achieve similar image quality to the segmented cine using a free-breathing acquisition. Previously, high quality retrospectively gated cine images have been reconstructed from real-time acquisitions using parallel imaging and motion correction. These methods had limited clinical applicability due to lengthy acquisitions, and volumetric measurements obtained with such methods have not previously been evaluated systematically. This study introduces a new retrospective reconstruction scheme for real-time cine imaging which aims to shorten the required acquisition. A real-time acquisition of 16-20 s per acquired slice was input to a retrospective cine reconstruction algorithm, which employed non-rigid registration to remove respiratory motion and SPIRiT non-linear reconstruction with temporal regularization to fill in missing data. The algorithm was used to reconstruct cine loops with high spatial (1.3-1.8 × 1.8-2.1 mm²) and temporal resolution (retrospectively gated, 30 cardiac phases, 34.3 ± 9.1 ms). Validation was performed in 15 healthy volunteers using two different acquisition resolutions (256 × 144/192 × 128 matrix sizes). For each subject, 9 to 12 short axis and 3 long axis slices were imaged with both segmented and real-time acquisitions. The retrospectively reconstructed real-time cine images were compared to a traditional segmented breath-held acquisition in terms of image quality scores. Image quality scoring was performed by two experts using a scale between 1 and 5 (poor to good). For every subject, LAX and three SAX slices were selected and reviewed in random order. The reviewers were blinded to the reconstruction approach and acquisition protocols and

  13. Risk-based design of process systems using discrete-time Bayesian networks

    International Nuclear Information System (INIS)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2013-01-01

    Temporal Bayesian networks have gained popularity as a robust technique to model dynamic systems in which the components' sequential dependency, as well as their functional dependency, cannot be ignored. In this regard, discrete-time Bayesian networks have been proposed as a viable alternative to solve dynamic fault trees without resort to Markov chains. This approach overcomes the drawbacks of Markov chains such as the state-space explosion and the error-prone conversion procedure from dynamic fault tree. It also benefits from the inherent advantages of Bayesian networks such as probability updating. However, effective mapping of the dynamic gates of dynamic fault trees into Bayesian networks while avoiding the consequent huge multi-dimensional probability tables has always been a matter of concern. In this paper, a new general formalism has been developed to model two important elements of dynamic fault tree, i.e., cold spare gate and sequential enforcing gate, with any arbitrary probability distribution functions. Also, an innovative Neutral Dependency algorithm has been introduced to model dynamic gates such as priority-AND gate, thus reducing the dimension of conditional probability tables by an order of magnitude. The second part of the paper is devoted to the application of discrete-time Bayesian networks in the risk assessment and safety analysis of complex process systems. It has been shown how dynamic techniques can effectively be applied for optimal allocation of safety systems to obtain maximum risk reduction.

  14. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  15. Hierarchical Bayesian Spatio–Temporal Analysis of Climatic and Socio–Economic Determinants of Rocky Mountain Spotted Fever

    Science.gov (United States)

    Raghavan, Ram K.; Goodin, Douglas G.; Neises, Daniel; Anderson, Gary A.; Ganta, Roman R.

    2016-01-01

    This study aims to examine the spatio-temporal dynamics of Rocky Mountain spotted fever (RMSF) prevalence in four contiguous states of the Midwestern United States, and to determine the impact of environmental and socio–economic factors associated with this disease. Bayesian hierarchical models were used to quantify space-only and time-only trends and the spatio–temporal interaction effect in the case reports submitted to the state health departments in the region. Various socio–economic, environmental and climatic covariates screened a priori in a bivariate procedure were added to a main–effects Bayesian model in progressive steps to evaluate important drivers of RMSF space-time patterns in the region. Our results show a steady increase in RMSF incidence over the study period to newer geographic areas, and the posterior probabilities of county-specific trends indicate clustering of high risk counties in the central and southern parts of the study region. At the spatial scale of a county, the prevalence levels of RMSF are influenced by poverty status, average relative humidity, and average land surface temperature (>35°C) in the region, and the relevance of these factors in the context of climate–change impacts on tick–borne diseases is discussed. PMID:26942604

  16. Hierarchical Bayesian Spatio-Temporal Analysis of Climatic and Socio-Economic Determinants of Rocky Mountain Spotted Fever.

    Science.gov (United States)

    Raghavan, Ram K; Goodin, Douglas G; Neises, Daniel; Anderson, Gary A; Ganta, Roman R

    2016-01-01

    This study aims to examine the spatio-temporal dynamics of Rocky Mountain spotted fever (RMSF) prevalence in four contiguous states of the Midwestern United States, and to determine the impact of environmental and socio-economic factors associated with this disease. Bayesian hierarchical models were used to quantify space-only and time-only trends and the spatio-temporal interaction effect in the case reports submitted to the state health departments in the region. Various socio-economic, environmental and climatic covariates screened a priori in a bivariate procedure were added to a main-effects Bayesian model in progressive steps to evaluate important drivers of RMSF space-time patterns in the region. Our results show a steady increase in RMSF incidence over the study period to newer geographic areas, and the posterior probabilities of county-specific trends indicate clustering of high risk counties in the central and southern parts of the study region. At the spatial scale of a county, the prevalence levels of RMSF are influenced by poverty status, average relative humidity, and average land surface temperature (>35°C) in the region, and the relevance of these factors in the context of climate-change impacts on tick-borne diseases is discussed.

  17. Real-Time Control of a Video Game Using Eye Movements and Two Temporal EEG Sensors

    Directory of Open Access Journals (Sweden)

    Abdelkader Nasreddine Belkacem

    2015-01-01

    Full Text Available EEG-controlled gaming applications range widely from strictly medical to completely nonmedical applications. Games can provide not only entertainment but also strong motivation for practicing, thereby achieving better control with a rehabilitation system. In this paper we present real-time control of a video game with eye movements for an asynchronous and noninvasive communication system using two temporal EEG sensors. We used wavelets to detect the instance of eye movement and time-series characteristics to distinguish between six classes of eye movement. A control interface was developed to test the proposed algorithm in real-time experiments with opened and closed eyes. Using visual feedback, a mean classification accuracy of 77.3% was obtained for control with six commands, and a mean classification accuracy of 80.2% was obtained using auditory feedback for control with five commands. The algorithm was then applied for controlling the direction and speed of character movement in a two-dimensional video game. Results showed that the proposed algorithm had an efficient response speed and timing with a bit rate of 30 bits/min, demonstrating its efficacy and robustness in real-time control.

  18. An algorithm for learning real-time automata

    NARCIS (Netherlands)

    Verwer, S.E.; De Weerdt, M.M.; Witteveen, C.

    2007-01-01

    We describe an algorithm for learning simple timed automata, known as real-time automata. The transitions of real-time automata can have a temporal constraint on the time of occurrence of the current symbol relative to the previous symbol. The learning algorithm is similar to the red-blue fringe

  19. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Background: A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results: In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions: Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and

  20. Automated Bayesian model development for frequency detection in biological time series.

    Science.gov (United States)

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time
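
    A hedged sketch of a Bayesian frequency posterior of the kind discussed above. One textbook (Bretthorst-style) result for a single sinusoid in Gaussian noise, with amplitudes and noise level marginalised out, makes the log posterior a simple function of the Schuster periodogram. The exact model in the cited work may differ; treat this as illustrative only, and all data are synthetic.

```python
import numpy as np

def log_freq_posterior(t, d, freqs):
    """Log posterior over trial frequencies, up to a constant, for one sinusoid."""
    d = d - d.mean()
    N = len(d)
    mean_sq = np.mean(d ** 2)
    logp = []
    for f in freqs:
        c = np.sum(d * np.cos(2 * np.pi * f * t))
        s = np.sum(d * np.sin(2 * np.pi * f * t))
        periodogram = (c ** 2 + s ** 2) / N
        logp.append(((2 - N) / 2) * np.log(1 - 2 * periodogram / (N * mean_sq)))
    logp = np.array(logp)
    return logp - logp.max()

t = np.arange(0, 20, 0.1)
d = np.sin(2 * np.pi * 0.7 * t) + 0.5 * np.random.default_rng(2).standard_normal(len(t))
freqs = np.linspace(0.1, 2.0, 400)
print(freqs[np.argmax(log_freq_posterior(t, d, freqs))])   # close to 0.7
```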

  1. Proposal of a hardware and software architecture for distributed real-time systems

    OpenAIRE

    Marcelo Gotz

    2001-01-01

    A real-time system is characterized by having temporal requirements for the execution of its activities, and according to its tolerance for meeting these requirements it is classified as hard real-time or soft real-time. The present work proposes a hardware and software architecture to support low-cost embedded real-time systems, intended for research use in academic settings and usable even in hard real-time environments. The motivation for this...

  2. Hierarchical Bayesian Spatio-Temporal Analysis of Climatic and Socio-Economic Determinants of Rocky Mountain Spotted Fever.

    Directory of Open Access Journals (Sweden)

    Ram K Raghavan

    Full Text Available This study aims to examine the spatio-temporal dynamics of Rocky Mountain spotted fever (RMSF) prevalence in four contiguous states of the Midwestern United States, and to determine the impact of environmental and socio-economic factors associated with this disease. Bayesian hierarchical models were used to quantify space-only and time-only trends and the spatio-temporal interaction effect in the case reports submitted to the state health departments in the region. Various socio-economic, environmental and climatic covariates screened a priori in a bivariate procedure were added to a main-effects Bayesian model in progressive steps to evaluate important drivers of RMSF space-time patterns in the region. Our results show a steady increase in RMSF incidence over the study period to newer geographic areas, and the posterior probabilities of county-specific trends indicate clustering of high risk counties in the central and southern parts of the study region. At the spatial scale of a county, the prevalence levels of RMSF are influenced by poverty status, average relative humidity, and average land surface temperature (>35°C) in the region, and the relevance of these factors in the context of climate-change impacts on tick-borne diseases is discussed.

  3. The use of a priori information in ICA-based techniques for real-time fMRI: an evaluation of static/dynamic and spatial/temporal characteristics

    Directory of Open Access Journals (Sweden)

    Nicola eSoldati

    2013-03-01

    Full Text Available Real-time brain functional MRI (rt-fMRI) allows in-vivo non-invasive monitoring of neural networks. The use of multivariate data-driven analysis methods such as independent component analysis (ICA) offers an attractive trade-off between data interpretability and information extraction, and can be used during both task-based and rest experiments. The purpose of this study was to assess the effectiveness of different ICA-based procedures to monitor in real time a target IC defined from a functional localizer which also used ICA. Four novel methods were implemented to monitor ongoing brain activity in a sliding window approach. The methods differed in the ways in which a priori information, derived from ICA algorithms, was used to monitor a target independent component (IC). We implemented four different algorithms, all based on ICA. One Back-projection method used ICA to derive static spatial information from the functional localizer, offline, which was then back-projected dynamically during the real-time acquisition. The other three methods used real-time ICA algorithms that dynamically exploited temporal, spatial, or spatial-temporal priors during the real-time acquisition. The methods were evaluated by simulating an rt-fMRI experiment that used real fMRI data. The performance of each method was characterized by the spatial and/or temporal correlation with the monitored target IC, the computation time, and the intrinsic stochastic variability of the algorithms. In this study the Back-projection method, which could monitor more than one IC of interest, outperformed the other methods. These results are consistent with a functional task that gives stable target ICs over time. The dynamic adaptation possibilities offered by the other ICA methods proposed may offer better performance than Back-projection in conditions where the functional activation shows higher spatial and/or temporal variability.
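
    A minimal sketch of the "Back-projection" idea described above: a spatial map obtained offline from a functional-localizer ICA is projected onto each new sliding window of data to recover the time course of the target component. Here this is done with a simple least-squares fit; the exact projection used in the cited study is not reproduced, and the data are synthetic.

```python
import numpy as np

def backproject_timecourse(window_data, target_map):
    """window_data: (T, V) volumes-by-voxels; target_map: (V,) spatial IC."""
    # Least-squares weight of the target map in each volume of the window.
    return window_data @ target_map / (target_map @ target_map)

rng = np.random.default_rng(3)
V, T = 500, 20
target_map = rng.standard_normal(V)
true_tc = np.sin(np.linspace(0, 3 * np.pi, T))
window = np.outer(true_tc, target_map) + 0.1 * rng.standard_normal((T, V))
print(np.corrcoef(true_tc, backproject_timecourse(window, target_map))[0, 1])  # near 1
```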

  4. Toward a real-time system for temporal enhanced ultrasound-guided prostate biopsy.

    Science.gov (United States)

    Azizi, Shekoofeh; Van Woudenberg, Nathan; Sojoudi, Samira; Li, Ming; Xu, Sheng; Abu Anas, Emran M; Yan, Pingkun; Tahmasebi, Amir; Kwak, Jin Tae; Turkbey, Baris; Choyke, Peter; Pinto, Peter; Wood, Bradford; Mousavi, Parvin; Abolmaesumi, Purang

    2018-03-27

    We have previously proposed temporal enhanced ultrasound (TeUS) as a new paradigm for tissue characterization. TeUS is based on analyzing a sequence of ultrasound data with deep learning and has been demonstrated to be successful for detection of cancer in ultrasound-guided prostate biopsy. Our aim is to enable the dissemination of this technology to the community for large-scale clinical validation. In this paper, we present a unified software framework demonstrating near-real-time analysis of ultrasound data stream using a deep learning solution. The system integrates ultrasound imaging hardware, visualization and a deep learning back-end to build an accessible, flexible and robust platform. A client-server approach is used in order to run computationally expensive algorithms in parallel. We demonstrate the efficacy of the framework using two applications as case studies. First, we show that prostate cancer detection using near-real-time analysis of RF and B-mode TeUS data and deep learning is feasible. Second, we present real-time segmentation of ultrasound prostate data using an integrated deep learning solution. The system is evaluated for cancer detection accuracy on ultrasound data obtained from a large clinical study with 255 biopsy cores from 157 subjects. It is further assessed with an independent dataset with 21 biopsy targets from six subjects. In the first study, we achieve area under the curve, sensitivity, specificity and accuracy of 0.94, 0.77, 0.94 and 0.92, respectively, for the detection of prostate cancer. In the second study, we achieve an AUC of 0.85. Our results suggest that TeUS-guided biopsy can be potentially effective for the detection of prostate cancer.

  5. Bayesian inference and the analytic continuation of imaginary-time quantum Monte Carlo data

    International Nuclear Information System (INIS)

    Gubernatis, J.E.; Bonca, J.; Jarrell, M.

    1995-01-01

    We present a brief description of how methods of Bayesian inference are used to obtain real-frequency information by the analytic continuation of imaginary-time quantum Monte Carlo data. We describe the procedure we used, which is due to R. K. Bryan, and summarize several bottleneck issues

  6. Real-Time, Single-Shot Temporal Measurements of Short Electron Bunches, Terahertz CSR and FEL Radiation

    CERN Document Server

    Berden, G; Van der Meer, A F G

    2005-01-01

    Electro-optic detection of the Coulomb field of electron bunches is a promising technique for single-shot measurements of the bunch length and shape in the sub-picosecond time domain. This technique has been applied to the measurement of 50 MeV electron bunches in the FELIX free electron laser, showing the longitudinal profile of single bunches of around 650 fs FWHM [Phys. Rev. Lett. 93, 114802 (2004)]. The method is non-destructive and real-time, and therefore ideal for online monitoring of the longitudinal shape of single electron bunches. At FELIX we have used it for real-time optimization of sub-picosecond electron bunches. Electro-optic detection has also been used to measure the electric field profiles of far-infrared (or terahertz) optical pulses generated by the relativistic electrons. We have characterised the far-infrared output of the free electron laser, and more recently, we have measured the temporal profile of terahertz optical pulses generated at one of the bending magnets.

  7. Sparse Estimation Using Bayesian Hierarchical Prior Modeling for Real and Complex Linear Models

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Badiu, Mihai Alin

    2015-01-01

    In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex-valued models ...... error, and robustness in low and medium signal-to-noise ratio regimes....

  8. Towards real-time detection and tracking of spatio-temporal features: Blob-filaments in fusion plasma

    International Nuclear Information System (INIS)

    Wu, Lingfei; Wu, Kesheng; Sim, Alex; Churchill, Michael; Choi, Jong Youl

    2016-01-01

    A novel algorithm and implementation of real-time identification and tracking of blob-filaments in fusion reactor data are presented. Similar spatio-temporal features are important in many other applications, for example, ignition kernels in combustion and tumor cells in a medical image. This work presents an approach for extracting these features by dividing the overall task into three steps: local identification of feature cells, grouping feature cells into extended features, and tracking the movement of features through overlap in space. Through our extensive work in parallelization, we demonstrate that this approach can effectively make use of a large number of compute nodes to detect and track blob-filaments in real time in fusion plasma. Here, on a 30 GB set of fusion simulation data, we observed linear speedup on 1024 processes and completed blob detection in less than three milliseconds using Edison, a Cray XC30 system at NERSC.
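
    A hedged sketch of the three-step decomposition described above: threshold to find candidate "feature" cells, group them into connected blobs, then track blobs across frames by spatial overlap of their labelled masks. scipy's connected-component labelling stands in for the paper's parallel implementation; the thresholds, shapes, and data are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def detect_blobs(frame, threshold):
    mask = frame > threshold                    # step 1: local identification of feature cells
    labels, n = ndimage.label(mask)             # step 2: group cells into extended blobs
    return labels, n

def match_by_overlap(labels_prev, labels_curr):
    """Step 3: map previous blob id -> current blob id with maximal spatial overlap."""
    matches = {}
    for prev_id in range(1, labels_prev.max() + 1):
        overlap = labels_curr[labels_prev == prev_id]
        overlap = overlap[overlap > 0]
        matches[prev_id] = int(np.bincount(overlap).argmax()) if overlap.size else None
    return matches

rng = np.random.default_rng(4)
frame0 = rng.random((64, 64)); frame0[10:15, 10:15] += 2.0
frame1 = rng.random((64, 64)); frame1[11:16, 12:17] += 2.0
l0, _ = detect_blobs(frame0, 1.5)
l1, _ = detect_blobs(frame1, 1.5)
print(match_by_overlap(l0, l1))   # the blob is matched to itself in the next frame
```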

  9. Design of the real time systems using temporal logic specifications: a case study

    Directory of Open Access Journals (Sweden)

    A. Ursu

    1996-07-01

    Full Text Available An implementation method for real-time systems is proposed in this article. The implementation starts with the design of the functional specifications of the system's behaviour. The functional specifications are introduced as a set of rules describing the partial time ordering of the actions performed by the system. These rules are then written in terms of temporal logic formulae. The temporal logic formulae are checked using the Z. Manna-P. Wolper satisfiability analysis procedure [1]. It is known that this procedure generates a state-graph which can be regarded as a state-based automaton of the system. The state-based automaton is then used to generate the dual (inverted) automaton of the system. The dual automaton is called the action-based automaton and can be created using the procedure proposed by the authors in [4,5]. Using the action-based automaton of the system, the design method introduced in [5,6] is applied to implement the system driver in a systematic manner which can be computerised. The method proposed in this paper is an efficient complementation and generalisation of the results [4,5,6] mentioned above. The method is used for a case study. An elevator control system is designed using the proposed method. The design is carried out in a systematic manner which includes: (a) design of functional specifications, (b) design of temporal logic specifications, (c) satisfiability analysis of temporal logic specifications, (d) design of the state-based automaton of the specifications, (e) design of the action-based automaton of the system, (f) design of the transition activation conditions, (g) design of the action activation conditions, (h) design of the functional model of the elevator control system, (i) implementation of the elevator's actions, (j) design of the elevator control system driver.

  10. A distributed agent architecture for real-time knowledge-based systems: Real-time expert systems project, phase 1

    Science.gov (United States)

    Lee, S. Daniel

    1990-01-01

    We propose a distributed agent architecture (DAA) that can support a variety of paradigms based on both traditional real-time computing and artificial intelligence. DAA consists of distributed agents that are classified into two categories: reactive and cognitive. Reactive agents can be implemented directly in Ada to meet hard real-time requirements and be deployed on on-board embedded processors. A traditional real-time computing methodology under consideration is the rate monotonic theory that can guarantee schedulability based on analytical methods. AI techniques under consideration for reactive agents are approximate or anytime reasoning that can be implemented using Bayesian belief networks as in Guardian. Cognitive agents are traditional expert systems that can be implemented in ART-Ada to meet soft real-time requirements. During the initial design of cognitive agents, it is critical to consider the migration path that would allow initial deployment on ground-based workstations with eventual deployment on on-board processors. ART-Ada technology enables this migration while Lisp-based technologies make it difficult if not impossible. In addition to reactive and cognitive agents, a meta-level agent would be needed to coordinate multiple agents and to provide meta-level control.

  11. Ab initio quantum-enhanced optical phase estimation using real-time feedback control

    DEFF Research Database (Denmark)

    Berni, Adriano; Gehring, Tobias; Nielsen, Bo Melholt

    2015-01-01

    of a quantum-enhanced and fully deterministic ab initio phase estimation protocol based on real-time feedback control. Using robust squeezed states of light combined with a real-time Bayesian adaptive estimation algorithm, we demonstrate deterministic phase estimation with a precision beyond the quantum shot...... noise limit. The demonstrated protocol opens up new opportunities for quantum microscopy, quantum metrology and quantum information processing....

  12. Spatio-temporal interpolation of precipitation during monsoon periods in Pakistan

    Science.gov (United States)

    Hussain, Ijaz; Spöck, Gunter; Pilz, Jürgen; Yu, Hwa-Lung

    2010-08-01

    Spatio-temporal estimation of precipitation over a region is essential to the modeling of hydrologic processes for water resources management. The changes of magnitude and space-time heterogeneity of rainfall observations make space-time estimation of precipitation a challenging task. In this paper we propose a Box-Cox transformed hierarchical Bayesian multivariate spatio-temporal interpolation method for the skewed response variable. The proposed method is applied to estimate space-time monthly precipitation in the monsoon periods during 1974-2000, and 27-year monthly average precipitation data are obtained from 51 stations in Pakistan. The results of transformed hierarchical Bayesian multivariate spatio-temporal interpolation are compared to those of non-transformed hierarchical Bayesian interpolation by using cross-validation. The software developed by [11] is used for Bayesian non-stationary multivariate space-time interpolation. It is observed that the transformed hierarchical Bayesian method provides more accuracy than the non-transformed hierarchical Bayesian method.
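
    A minimal sketch of the Box-Cox step mentioned above: skewed precipitation totals are transformed toward normality before (hierarchical Bayesian) space-time modelling, and predictions are back-transformed afterwards. The interpolation model itself is not reproduced here, and the data are invented.

```python
import numpy as np
from scipy import stats

rain = np.array([1.0, 2.5, 0.4, 12.0, 30.5, 3.3, 7.8, 0.9])   # strictly positive totals (mm)
transformed, lam = stats.boxcox(rain)          # y = (x**lam - 1) / lam, lam chosen by ML
print("lambda:", lam)

# back-transform (inverse Box-Cox) to recover the original scale
back = (transformed * lam + 1.0) ** (1.0 / lam) if lam != 0 else np.exp(transformed)
print(np.allclose(back, rain))
```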

  13. Real-time statistical quality control and ARM

    International Nuclear Information System (INIS)

    Blough, D.K.

    1992-05-01

    An important component of the Atmospheric Radiation Measurement (ARM) Program is real-time quality control of data obtained from meteorological instruments. It is the goal of the ARM program to enhance the predictive capabilities of global circulation models by incorporating in them more detailed information on the radiative characteristics of the earth's atmosphere. To this end, a number of Cloud and Radiation Testbeds (CARTs) will be built at various locations worldwide. Each CART will consist of an array of instruments designed to collect radiative data. The large amount of data obtained from these instruments necessitates real-time processing in order to flag outliers and possible instrument malfunction. The Bayesian dynamic linear model (DLM) proves to be an effective way of monitoring the time series data which each instrument generates. It provides a flexible yet powerful approach to detecting, in real time, sudden shifts in a non-stationary multivariate time series. An application of these techniques to data arising from a remote sensing instrument to be used in the CART is provided. Using real data from a wind profiler, the ability of the DLM to detect outliers is studied. 5 refs
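
    A hedged sketch of the monitoring idea above: a local-level dynamic linear model is run as a Kalman filter, and each new observation is flagged when its standardised one-step-ahead forecast error is large. The actual ARM/CART processing is far richer and multivariate; the variances, threshold, and data here are assumptions.

```python
import numpy as np

def dlm_flag_outliers(y, obs_var=1.0, state_var=0.1, threshold=3.0):
    m, C = y[0], obs_var          # initial level estimate and its variance
    flags = []
    for obs in y[1:]:
        R = C + state_var         # prior variance of the level at this step
        Q = R + obs_var           # one-step-ahead forecast variance
        e = obs - m               # forecast error
        flags.append(abs(e) / np.sqrt(Q) > threshold)
        K = R / Q                 # Kalman gain
        m, C = m + K * e, (1 - K) * R
    return np.array(flags)

y = np.concatenate([np.random.default_rng(5).normal(10, 1, 50), [25.0],
                    np.random.default_rng(6).normal(10, 1, 20)])
print(np.where(dlm_flag_outliers(y))[0])   # the jump near index 49 is flagged
```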

  14. A near real-time satellite-based global drought climate data record

    International Nuclear Information System (INIS)

    AghaKouchak, Amir; Nakhjiri, Navid

    2012-01-01

    Reliable drought monitoring requires long-term and continuous precipitation data. High resolution satellite measurements provide valuable precipitation information on a quasi-global scale. However, their short lengths of records limit their applications in drought monitoring. In addition to this limitation, long-term low resolution satellite-based gauge-adjusted data sets such as the Global Precipitation Climatology Project (GPCP) one are not available in near real-time form for timely drought monitoring. This study bridges the gap between low resolution long-term satellite gauge-adjusted data and the emerging high resolution satellite precipitation data sets to create a long-term climate data record of droughts. To accomplish this, a Bayesian correction algorithm is used to combine GPCP data with real-time satellite precipitation data sets for drought monitoring and analysis. The results showed that the combined data sets after the Bayesian correction were a significant improvement compared to the uncorrected data. Furthermore, several recent major droughts such as the 2011 Texas, 2010 Amazon and 2010 Horn of Africa droughts were detected in the combined real-time and long-term satellite observations. This highlights the potential application of satellite precipitation data for regional to global drought monitoring. The final product is a real-time data-driven satellite-based standardized precipitation index that can be used for drought monitoring especially over remote and/or ungauged regions. (letter)
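
    A conceptual, heavily simplified stand-in for the "Bayesian correction" step described above: treat the long-term gauge-adjusted estimate as a normal prior and the real-time satellite estimate as a noisy observation, giving a precision-weighted posterior. The actual algorithm in the cited study is more involved; all variances and values here are assumptions.

```python
def bayesian_merge(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal update: posterior precision is the sum of precisions."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

gpcp_monthly = 85.0        # mm, long-term gauge-adjusted estimate (prior)
realtime_sat = 60.0        # mm, near-real-time satellite estimate (observation)
print(bayesian_merge(gpcp_monthly, 15.0 ** 2, realtime_sat, 25.0 ** 2))
```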

  15. Integrating real-time and manual monitored data to predict hillslope soil moisture dynamics with high spatio-temporal resolution using linear and non-linear models

    Science.gov (United States)

    Spatio-temporal variability of soil moisture (θ) is a challenge that remains to be better understood. A trade-off exists between spatial coverage and temporal resolution when using the manual and real-time θ monitoring methods. This has restricted the comprehensive and intensive examination of θ dynamic

  16. Optimization of Time-Partitions for Mixed-Criticality Real-Time Distributed Embedded Systems

    DEFF Research Database (Denmark)

    Tamas-Selicean, Domitian; Pop, Paul

    2011-01-01

    In this paper we are interested in mixed-criticality embedded real-time applications mapped on distributed heterogeneous architectures. The architecture provides both spatial and temporal partitioning, thus enforcing enough separation for the critical applications. With temporal partitioning, each...

  17. Case Study: A Real-Time Flood Forecasting System with Predictive Uncertainty Estimation for the Godavari River, India

    Directory of Open Access Journals (Sweden)

    Silvia Barbetta

    2016-10-01

    Full Text Available This work presents the application of the multi-temporal approach of the Model Conditional Processor (MCP-MT) for predictive uncertainty (PU) estimation in the Godavari River basin, India. MCP-MT was developed for probabilistic Bayesian decision-making. It is the most appropriate approach if the uncertainty of future outcomes is to be considered. It yields the best predictive density of future events and allows determining the probability that a critical warning threshold may be exceeded within a given forecast time. In Bayesian decision-making, the predictive density represents the best available knowledge on a future event to address a rational decision-making process. MCP-MT has already been tested for case studies selected in Italian river basins, showing evidence of improvement of the effectiveness of operative real-time flood forecasting systems. The application of MCP-MT for two river reaches selected in the Godavari River basin, India, is here presented and discussed by considering the stage forecasts provided by a deterministic model, STAFOM-RCM, and an hourly dataset based on seven monsoon seasons in the period 2001–2010. The results show that the PU estimate is useful for finding the exceedance probability for a given hydrometric threshold as a function of the forecast time up to 24 h, demonstrating the potential usefulness for supporting real-time decision-making. Moreover, the expected value provided by MCP-MT yields better results than the deterministic model predictions, with higher Nash–Sutcliffe coefficients and lower error on stage forecasts, both in terms of mean error and standard deviation and in terms of root mean square error.
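
    A hedged sketch of the Nash-Sutcliffe efficiency used above to compare the MCP-MT expected value with the deterministic forecasts: one minus the ratio of the forecast error variance to the variance of the observations. The data here are invented.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([2.1, 2.4, 3.0, 3.8, 4.5, 4.2, 3.6])   # observed stages (m)
sim = np.array([2.0, 2.5, 2.9, 3.9, 4.3, 4.1, 3.7])   # forecast stages (m)
print(nash_sutcliffe(obs, sim))   # close to 1 for a good forecast
```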

  18. FPGA-based sniffer for real-time Ethernet networks

    OpenAIRE

    Faria, João Pedro Puga

    2008-01-01

    The growing use of distributed systems in real-time applications has led to the creation of increasingly complex and sophisticated communication protocols. Although Ethernet does not provide real-time characteristics, many Ethernet-based real-time communication protocols have been developed because of its advantages. This dissertation analyses the importance of distributed architectures in real-time applications, presenting...

  19. Quantifying temporal trends in fisheries abundance using Bayesian dynamic linear models: A case study of riverine Smallmouth Bass populations

    Science.gov (United States)

    Schall, Megan K.; Blazer, Vicki S.; Lorantas, Robert M.; Smith, Geoffrey; Mullican, John E.; Keplinger, Brandon J.; Wagner, Tyler

    2018-01-01

    Detecting temporal changes in fish abundance is an essential component of fisheries management. Because of the need to understand short‐term and nonlinear changes in fish abundance, traditional linear models may not provide adequate information for management decisions. This study highlights the utility of Bayesian dynamic linear models (DLMs) as a tool for quantifying temporal dynamics in fish abundance. To achieve this goal, we quantified temporal trends of Smallmouth Bass Micropterus dolomieu catch per effort (CPE) from rivers in the mid‐Atlantic states, and we calculated annual probabilities of decline from the posterior distributions of annual rates of change in CPE. We were interested in annual declines because of recent concerns about fish health in portions of the study area. In general, periods of decline were greatest within the Susquehanna River basin, Pennsylvania. The declines in CPE began in the late 1990s—prior to observations of fish health problems—and began to stabilize toward the end of the time series (2011). In contrast, many of the other rivers investigated did not have the same magnitude or duration of decline in CPE. Bayesian DLMs provide information about annual changes in abundance that can inform management and are easily communicated with managers and stakeholders.
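
    A minimal sketch of the "annual probability of decline" quantity described above: given posterior draws of the year-over-year rate of change in CPE from a Bayesian dynamic linear model, the probability of decline is simply the fraction of draws below zero. The draws here are synthetic assumptions, not results from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
# synthetic posterior samples of the annual rate of change in CPE for five years
rate_draws = rng.normal(loc=[-0.10, -0.05, 0.00, 0.02, 0.05], scale=0.04,
                        size=(4000, 5))
prob_decline = (rate_draws < 0).mean(axis=0)   # P(annual change < 0) per year
print(np.round(prob_decline, 2))               # high early in the series, lower later
```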

  20. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  1. Bayesian spatio-temporal modelling of tobacco-related cancer mortality in Switzerland

    Directory of Open Access Journals (Sweden)

    Verena Jürgens

    2013-05-01

    Full Text Available Tobacco smoking is a main cause of disease in Switzerland; lung cancer is the leading cause of cancer mortality in men and the second most common in women. Although disease-specific mortality is decreasing in men, it is steadily increasing in women. The four language regions in this country might play a role in this context, as they are influenced in different ways by the cultural and social behaviour of neighbouring countries. Bayesian hierarchical spatio-temporal, negative binomial models were fitted on subgroup-specific death rates, indirectly standardized by national references, to explore age- and gender-specific spatio-temporal patterns of mortality due to lung cancer and other tobacco-related cancers in Switzerland for the time period 1969-2002. Differences influenced by linguistic region and life in rural or urban areas were also accounted for. Male lung cancer mortality was found to be rather homogeneous in space, whereas women were confirmed to be more affected in urban regions. Compared to the German-speaking part, female mortality was higher in the French-speaking part of the country, a result contradicting other reports of similar comparisons between France and Germany. The spatio-temporal patterns of mortality were similar for lung cancer and other tobacco-related cancers. The estimated mortality maps can support planning in health care services and the evaluation of a national tobacco control programme. A better understanding of the spatial and temporal variation of cancer of the lung and other tobacco-related cancers may help in allocating resources for more effective screening, diagnosis and therapy. The methodology can be applied to similar studies in other settings.
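
    A minimal sketch of the indirect standardization step that feeds models of this kind: expected deaths per region are obtained by applying national age-specific rates to the regional age structure, and the observed/expected ratio gives the standardized rate that such models then smooth. The rates, populations and counts below are invented for illustration only.

    import numpy as np

    national_rate = np.array([0.0002, 0.0015, 0.0060])   # deaths per person-year, by age band
    region_pop = np.array([[50000, 30000, 12000],        # region A population by age band
                           [80000, 45000, 20000]])       # region B
    observed = np.array([70, 160])                       # observed deaths per region

    expected = region_pop @ national_rate                # indirectly standardized expectation
    print("expected:", expected.round(1), " SMR:", (observed / expected).round(2))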

  2. Real-Time Video Stylization Using Object Flows.

    Science.gov (United States)

    Lu, Cewu; Xiao, Yao; Tang, Chi-Keung

    2017-05-05

    We present a real-time video stylization system and demonstrate a variety of painterly styles rendered on real video inputs. The key technical contribution lies in the object flow, which is robust to inaccurate optical flow, unknown object transformations, and partial occlusion. Since object flows relate regions of the same object across frames, the shower-door effect can be effectively reduced when painterly strokes and textures are rendered on video objects. The construction of object flows is performed in real time and automatically after applying metric learning. To reduce temporal flickering, we extend bilateral filtering into motion bilateral filtering. We propose quantitative metrics to measure the temporal coherence of the structures and textures of our stylized videos, and perform extensive experiments to compare our stylized results with baseline systems and prior works specializing in watercolor and abstraction.

  3. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for a space manipulator is presented for improving the success rate of the manipulator during the execution of a task. In this paper, a method for quantitative analysis of operational reliability is given for when the manipulator is executing a specified task; then a control model which can regulate the quantitative operational reliability is built. First, the control process is described by using a state space equation. Second, process parameters are estimated in real time using the Bayesian method. Third, the expression of the system's real-time operational reliability is deduced based on the state space equation and the process parameters estimated using the Bayesian method. Finally, a control variable regulation strategy which considers the cost of control is given based on the Theory of Statistical Process Control. It is shown via simulations that this method effectively improves the operational reliability of the space manipulator control system.
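
    The paragraph above outlines a pipeline: a state-space description of the control process, real-time Bayesian estimation of its parameters, and a reliability figure derived from the resulting posterior. The sketch below is not the paper's algorithm; it uses a scalar linear-Gaussian state-space model with a standard Kalman (recursive Bayesian) update and treats "operational reliability" as the posterior probability that the tracked joint error stays inside a tolerance band. All constants are invented.

    from scipy.stats import norm

    def kalman_step(mean, var, u, z, a=1.0, b=0.5, q=0.01, r=0.04):
        # Predict: x_k = a * x_{k-1} + b * u_k, with process noise variance q.
        mean_p, var_p = a * mean + b * u, a * a * var + q
        # Update with measurement z (measurement noise variance r).
        k = var_p / (var_p + r)
        return mean_p + k * (z - mean_p), (1.0 - k) * var_p

    mean, var, tol = 0.0, 1.0, 0.2          # initial belief and tolerance band (rad)
    for u, z in [(0.1, 0.12), (0.0, 0.05), (-0.1, -0.02)]:
        mean, var = kalman_step(mean, var, u, z)
        reliability = norm.cdf(tol, mean, var ** 0.5) - norm.cdf(-tol, mean, var ** 0.5)
        print(f"posterior mean = {mean:+.3f}, P(|error| < {tol}) = {reliability:.3f}")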

  4. Compiling graphical real-time specifications into silicon

    DEFF Research Database (Denmark)

    Fränzle, Martin; Lüth, Karsten

    1998-01-01

    The basic algorithms underlying an automatic hardware synthesis environment using fully formal graphical requirements specifications as source language are outlined. The source language is real-time symbolic timing diagrams [FeyerabendJosko97], which are a metric-time temporal logic such that hard...

  5. Temporal compression in episodic memory for real-life events.

    Science.gov (United States)

    Jeunehomme, Olivier; Folville, Adrien; Stawarczyk, David; Van der Linden, Martial; D'Argembeau, Arnaud

    2018-07-01

    Remembering an event typically takes less time than experiencing it, suggesting that episodic memory represents past experience in a temporally compressed way. Little is known, however, about how the continuous flow of real-life events is summarised in memory. Here we investigated the nature and determinants of temporal compression by directly comparing memory contents with the objective timing of events as measured by a wearable camera. We found that episodic memories consist of a succession of moments of prior experience that represent events with varying compression rates, such that the density of retrieved information is modulated by goal processing and perceptual changes. Furthermore, the results showed that temporal compression rates remain relatively stable over one week and increase after a one-month delay, particularly for goal-related events. These data shed new light on temporal compression in episodic memory and suggest that compression rates are adaptively modulated to maintain current goal-relevant information.

  6. Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.

    Directory of Open Access Journals (Sweden)

    Philip J Tully

    2016-05-01

    Full Text Available Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire model neurons (AdEx). We show that the learning and speed of sequence replay depends on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
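
    For readers unfamiliar with the BCPNN rule named above, the toy sketch below shows its core idea: pre-, post- and co-activation probabilities are tracked with exponentially decaying traces, and the synaptic weight is their log-odds. The time constants, firing rates and spike trains are arbitrary illustrative choices, not the paper's parameters.

    import numpy as np

    def bcpnn_weight(pre, post, dt=1.0, tau_z=10.0, tau_p=1000.0, eps=1e-3):
        z_i = z_j = 0.0
        p_i = p_j = p_ij = eps
        for x, y in zip(pre, post):                # binary spike trains
            z_i += dt / tau_z * (x - z_i)          # low-pass filtered spikes (Z traces)
            z_j += dt / tau_z * (y - z_j)
            p_i += dt / tau_p * (z_i - p_i)        # activation probability estimates (P traces)
            p_j += dt / tau_p * (z_j - p_j)
            p_ij += dt / tau_p * (z_i * z_j - p_ij)
        return np.log(p_ij / (p_i * p_j))          # BCPNN weight (log-odds)

    rng = np.random.default_rng(1)
    pre = (rng.random(2000) < 0.05).astype(float)
    post = np.roll(pre, 2)                          # post tends to follow pre, so weight > 0
    print("learned weight:", round(float(bcpnn_weight(pre, post)), 3))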

  7. Hierarchical Bayesian modeling of spatio-temporal patterns of lung cancer incidence risk in Georgia, USA: 2000-2007

    Science.gov (United States)

    Yin, Ping; Mu, Lan; Madden, Marguerite; Vena, John E.

    2014-10-01

    Lung cancer is the second most commonly diagnosed cancer in both men and women in Georgia, USA. However, the spatio-temporal patterns of lung cancer risk in Georgia have not been fully studied. Hierarchical Bayesian models are used here to explore the spatio-temporal patterns of lung cancer incidence risk by race and gender in Georgia for the period of 2000-2007. With the census tract level as the spatial scale and the 2-year period aggregation as the temporal scale, we compare a total of seven Bayesian spatio-temporal models including two under a separate modeling framework and five under a joint modeling framework. One joint model outperforms others based on the deviance information criterion. Results show that the northwest region of Georgia has consistently high lung cancer incidence risk for all population groups during the study period. In addition, there are inverse relationships between the socioeconomic status and the lung cancer incidence risk among all Georgian population groups, and the relationships in males are stronger than those in females. By mapping more reliable variations in lung cancer incidence risk at a relatively fine spatio-temporal scale for different Georgian population groups, our study aims to better support healthcare performance assessment, etiological hypothesis generation, and health policy making.

  8. X-real-time executive (X-RTE) an ultra-high reliable real-time executive for safety critical systems

    International Nuclear Information System (INIS)

    Suresh Babu, R.M.

    1995-01-01

    With the growing number of applications of computers in safety-critical systems of nuclear plants, there has been a need to assure high quality and reliability of the software used in these systems. One way to assure software quality is to use qualified software components. Since the safety systems and control systems are real-time systems, there is a need for real-time supervisory software to guarantee the temporal response of the system. This report describes one such software package, called X-Real-Time Executive (or X-RTE), which was developed in the Reactor Control Division, BARC. The report describes all the capabilities and unique features of X-RTE and compares it with a commercially available operating system. The features of X-RTE include pre-emptive scheduling, process synchronization, inter-process communication, multi-processor support, temporal support, debug facility, high portability, high reliability, high quality, and extensive documentation. Examples have been used very liberally to illustrate the underlying concepts. Besides, the report provides a brief description of the methods used, during the software development, to assure high quality and reliability of X-RTE. (author). refs., 11 figs., tabs

  9. A Bayesian approach to real-time 3D tumor localization via monoscopic x-ray imaging during treatment delivery

    International Nuclear Information System (INIS)

    Li, Ruijiang; Fahimian, Benjamin P.; Xing, Lei

    2011-01-01

    Purpose: Monoscopic x-ray imaging with on-board kV devices is an attractive approach for real-time image guidance in modern radiation therapy such as VMAT or IMRT, but it falls short in providing reliable information along the direction of the imaging x-ray. By effectively taking into consideration projection data at prior times and/or angles through a Bayesian formalism, the authors develop an algorithm for real-time and full 3D tumor localization with a single x-ray imager during treatment delivery. Methods: First, a prior probability density function is constructed using the 2D tumor locations on the projection images acquired during patient setup. Whenever an x-ray image is acquired during the treatment delivery, the corresponding 2D tumor location on the imager is used to update the likelihood function. The unresolved third dimension is obtained by maximizing the posterior probability distribution. The algorithm can also be used in a retrospective fashion when all the projection images during the treatment delivery are used for 3D localization purposes. The algorithm does not involve complex optimization of any model parameter and therefore can be used in a "plug-and-play" fashion. The authors validated the algorithm using (1) simulated 3D linear and elliptic motion and (2) 3D tumor motion trajectories of a lung and a pancreas patient reproduced by a physical phantom. Continuous kV images were acquired over a full gantry rotation with the Varian TrueBeam on-board imaging system. Three scenarios were considered: fluoroscopic setup, cone beam CT setup, and retrospective analysis. Results: For the simulation study, the RMS 3D localization error is 1.2 and 2.4 mm for the linear and elliptic motions, respectively. For the phantom experiments, the 3D localization error is < 1 mm on average and < 1.5 mm at the 95th percentile in the lung and pancreas cases for all three scenarios. The difference in 3D localization error between the scenarios is small and is not statistically significant.
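
    The Methods paragraph describes a prior built from setup projections, a likelihood updated with each treatment-time image, and a posterior maximized over the unresolved dimension. A much simplified stand-in (not the authors' implementation) is sketched below: the prior over the 3D tumor position is a single Gaussian, the imager is assumed to observe the two in-plane coordinates, and the MAP depth then reduces to the conditional Gaussian mean. The mean, covariance and observation are invented numbers in millimetres.

    import numpy as np

    # Hypothetical prior from setup imaging: mean and covariance of (x, y, z) in mm.
    mu = np.array([0.0, 0.0, 0.0])
    cov = np.array([[4.0, 1.0, 2.0],
                    [1.0, 9.0, 3.0],
                    [2.0, 3.0, 16.0]])

    def map_depth(observed_xy):
        """MAP estimate of the unresolved depth z given the (x, y) seen on the imager."""
        gain = cov[2, :2] @ np.linalg.inv(cov[:2, :2])   # regression of z on (x, y)
        return mu[2] + gain @ (np.asarray(observed_xy) - mu[:2])

    print("estimated depth (mm):", round(float(map_depth([1.5, -2.0])), 2))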

  10. Evaluation of spatio-temporal Bayesian models for the spread of infectious diseases in oil palm.

    Science.gov (United States)

    Denis, Marie; Cochard, Benoît; Syahputra, Indra; de Franqueville, Hubert; Tisné, Sébastien

    2018-02-01

    In the field of epidemiology, studies are often focused on mapping diseases in relation to time and space. Hierarchical modeling is a common flexible and effective tool for modeling problems related to disease spread. In the context of oil palm plantations infected by the fungal pathogen Ganoderma boninense, we propose and compare two spatio-temporal hierarchical Bayesian models addressing the lack of information on propagation modes and transmission vectors. We investigate two alternative process models to study the unobserved mechanism driving the infection process. The models help gain insight into the spatio-temporal dynamic of the infection by identifying a genetic component in the disease spread and by highlighting a spatial component acting at the end of the experiment. In this challenging context, we propose models that provide assumptions on the unobserved mechanism driving the infection process while making short-term predictions using ready-to-use software. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. A hierarchical Bayesian spatio-temporal model to forecast trapped particle fluxes over the SAA region

    Czech Academy of Sciences Publication Activity Database

    Suparta, W.; Gusrizal, G.; Kudela, Karel; Isa, Z.

    2017-01-01

    Roč. 28, č. 3 (2017), s. 357-370 ISSN 1017-0839 R&D Projects: GA MŠk EF15_003/0000481 Institutional support: RVO:61389005 Keywords : trapped particle * spatio-temporal * hierarchical Bayesian * forecasting Subject RIV: DG - Athmosphere Sciences, Meteorology OBOR OECD: Meteorology and atmospheric sciences Impact factor: 0.752, year: 2016

  12. Reliable real-time applications - and how to use tests to model and understand

    DEFF Research Database (Denmark)

    Jensen, Peter Krogsgaard

    Test and analysis of real-time applications, where temporal properties are inspected, analyzed, and verified in a model developed from timed traces originating from measured test results on a running application...

  13. Bayesian Modeling of Temporal Coherence in Videos for Entity Discovery and Summarization.

    Science.gov (United States)

    Mitra, Adway; Biswas, Soma; Bhattacharyya, Chiranjib

    2017-03-01

    A video is understood by users in terms of entities present in it. Entity Discovery is the task of building appearance model for each entity (e.g., a person), and finding all its occurrences in the video. We represent a video as a sequence of tracklets, each spanning 10-20 frames, and associated with one entity. We pose Entity Discovery as tracklet clustering, and approach it by leveraging Temporal Coherence (TC): the property that temporally neighboring tracklets are likely to be associated with the same entity. Our major contributions are the first Bayesian nonparametric models for TC at tracklet-level. We extend Chinese Restaurant Process (CRP) to TC-CRP, and further to Temporally Coherent Chinese Restaurant Franchise (TC-CRF) to jointly model entities and temporal segments using mixture components and sparse distributions. For discovering persons in TV serial videos without meta-data like scripts, these methods show considerable improvement over state-of-the-art approaches to tracklet clustering in terms of clustering accuracy, cluster purity and entity coverage. The proposed methods can perform online tracklet clustering on streaming videos unlike existing approaches, and can automatically reject false tracklets. Finally we discuss entity-driven video summarization- where temporal segments of the video are selected based on the discovered entities, to create a semantically meaningful summary.
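
    To make the temporally coherent Chinese Restaurant Process idea concrete, here is a toy assignment sampler: each tracklet joins an existing cluster with probability proportional to the cluster size, gets an extra bonus (kappa) for the cluster of the immediately preceding tracklet, or opens a new cluster with weight alpha. The parameters, and the omission of any appearance likelihood, are illustrative simplifications of the TC-CRP/TC-CRF models rather than the models themselves.

    import random

    def tc_crp(num_tracklets, alpha=1.0, kappa=3.0, seed=0):
        random.seed(seed)
        assignments, counts = [], []
        for _ in range(num_tracklets):
            weights = [c + (kappa if assignments and assignments[-1] == k else 0.0)
                       for k, c in enumerate(counts)]
            weights.append(alpha)                        # weight for opening a new cluster
            k = random.choices(range(len(weights)), weights=weights)[0]
            if k == len(counts):                         # a new entity (cluster) is created
                counts.append(0)
            counts[k] += 1
            assignments.append(k)
        return assignments

    print(tc_crp(20))   # cluster label per tracklet; runs of equal labels reflect coherence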

  14. Bayesian Inference for Functional Dynamics Exploring in fMRI Data

    Directory of Open Access Journals (Sweden)

    Xuan Guo

    2016-01-01

    Full Text Available This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will emerge and play increasingly important roles in modeling brain functions in the years to come.

  15. Timeliness and Predictability in Real-Time Database Systems

    National Research Council Canada - National Science Library

    Son, Sang H

    1998-01-01

    The confluence of computers, communications, and databases is quickly creating a globally distributed database where many applications require real time access to both temporally accurate and multimedia data...

  16. Characterization of indoor aerosol temporal variations for the real-time management of indoor air quality

    Science.gov (United States)

    Ciuzas, Darius; Prasauskas, Tadas; Krugly, Edvinas; Sidaraviciute, Ruta; Jurelionis, Andrius; Seduikyte, Lina; Kauneliene, Violeta; Wierzbicka, Aneta; Martuzevicius, Dainius

    2015-10-01

    The study presents the characterization of dynamic patterns of indoor particulate matter (PM) during various pollution episodes for real-time IAQ management. The variation of PM concentrations was assessed for 20 indoor activities, including cooking-related sources, other thermal sources, and personal care and household products. The pollution episodes were modelled in a full-scale test chamber representing a standard living room with forced ventilation of 0.5 h-1. In most of the pollution episodes, the maximum concentration of particles in exhaust air was reached within a few minutes. The most rapid increase in particle concentration occurred during thermal source episodes such as candle, cigarette and incense stick burning and cooking-related sources, while the slowest decay of concentrations was associated with sources emitting ultrafine particle precursors, such as furniture polisher spraying, floor wet mopping with detergent, etc. Placement of the particle sensors in the ventilation exhaust vs. in the centre of the ceiling yielded comparable results for both measured maximum concentrations and temporal variations, indicating that both locations were suitable for the placement of sensors for the management of IAQ. The obtained data provide information that may be utilized when considering measurements of aerosol particles as indicators for the real-time management of IAQ.

  17. Encoding dependence in Bayesian causal networks

    Science.gov (United States)

    Bayesian networks (BNs) represent complex, uncertain spatio-temporal dynamics by propagation of conditional probabilities between identifiable states with a testable causal interaction model. Typically, they assume random variables are discrete in time and space with a static network structure that ...

  18. Specifying real-time systems with interval logic

    Science.gov (United States)

    Rushby, John

    1988-01-01

    Pure temporal logic makes no reference to time. An interval temporal logic and an extension to that logic which includes real-time constraints are described. The application of this logic is demonstrated by giving a specification for the well-known lift (elevator) example. It is shown how interval logic can be extended to include a notion of process. How the specification language and verification environment of EHDM could be enhanced to support this logic is described. A specification of the alternating bit protocol in this extended version of the specification language of EHDM is given.

  19. Real-time prediction of acute cardiovascular events using hardware-implemented Bayesian networks.

    Science.gov (United States)

    Tylman, Wojciech; Waszyrowski, Tomasz; Napieralski, Andrzej; Kamiński, Marek; Trafidło, Tamara; Kulesza, Zbigniew; Kotas, Rafał; Marciniak, Paweł; Tomala, Radosław; Wenerski, Maciej

    2016-02-01

    This paper presents a decision support system that aims to estimate a patient's general condition and detect situations which pose an immediate danger to the patient's health or life. The use of this system might be especially important in places such as accident and emergency departments or admission wards, where a small medical team has to take care of many patients in various general conditions. Particular stress is laid on cardiovascular and pulmonary conditions, including those leading to sudden cardiac arrest. The proposed system is a stand-alone microprocessor-based device that works in conjunction with a standard vital signs monitor, which provides input signals such as temperature, blood pressure, pulse oximetry, ECG, and ICG. The signals are preprocessed and analysed by a set of artificial intelligence algorithms, the core of which is based on Bayesian networks. The paper focuses on the construction and evaluation of the Bayesian network, both its structure and numerical specification. Copyright © 2015 Elsevier Ltd. All rights reserved.
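
    The abstract does not disclose the network's structure or probabilities, so the snippet below is a deliberately tiny, hypothetical stand-in for the general idea: one hidden condition node with two observed vital-sign findings, queried by exact enumeration of Bayes' rule. All probabilities are invented.

    def posterior_shock(tachycardia, hypotension):
        """P(shock | two binary findings) in a toy two-child Bayesian network."""
        p_shock = 0.05
        p_tachy = {True: 0.85, False: 0.20}    # P(tachycardia | shock) / (| no shock)
        p_hypo = {True: 0.80, False: 0.10}     # P(hypotension | shock) / (| no shock)

        def likelihood(shock):
            pt = p_tachy[shock] if tachycardia else 1 - p_tachy[shock]
            ph = p_hypo[shock] if hypotension else 1 - p_hypo[shock]
            return pt * ph

        num = p_shock * likelihood(True)
        return num / (num + (1 - p_shock) * likelihood(False))

    print("P(shock | tachycardia, hypotension) =", round(posterior_shock(True, True), 3))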

  20. Coalescence measurements for evolving foams monitored by real-time projection imaging

    International Nuclear Information System (INIS)

    Myagotin, A; Helfen, L; Baumbach, T

    2009-01-01

    Real-time radiographic projection imaging together with novel spatio-temporal image analysis is presented to be a powerful technique for the quantitative analysis of coalescence processes accompanying the generation and temporal evolution of foams and emulsions. Coalescence events can be identified as discontinuities in a spatio-temporal image representing a sequence of projection images. Detection, identification of intensity and localization of the discontinuities exploit a violation criterion of the Fourier shift theorem and are based on recursive spatio-temporal image partitioning. The proposed method is suited for automated measurements of discontinuity rates (i.e., discontinuity intensity per unit time), so that large series of radiographs can be analyzed without user intervention. The application potential is demonstrated by the quantification of coalescence during the formation and decay of metal foams monitored by real-time x-ray radiography

  1. Highly-Accelerated Real-Time Cardiac Cine MRI Using k-t SPARSE-SENSE

    Science.gov (United States)

    Feng, Li; Srichai, Monvadi B.; Lim, Ruth P.; Harrison, Alexis; King, Wilson; Adluru, Ganesh; Dibella, Edward VR.; Sodickson, Daniel K.; Otazo, Ricardo; Kim, Daniel

    2012-01-01

    For patients with impaired breath-hold capacity and/or arrhythmias, real-time cine MRI may be more clinically useful than breath-hold cine MRI. However, commercially available real-time cine MRI methods using parallel imaging typically yield relatively poor spatio-temporal resolution due to their low image acquisition speed. We sought to achieve relatively high spatial resolution (~2.5mm × 2.5mm) and temporal resolution (~40ms), to produce high-quality real-time cine MR images that could be applied clinically for wall motion assessment and measurement of left ventricular (LV) function. In this work, we present an 8-fold accelerated real-time cardiac cine MRI pulse sequence using a combination of compressed sensing and parallel imaging (k-t SPARSE-SENSE). Compared with reference, breath-hold cine MRI, our 8-fold accelerated real-time cine MRI produced significantly worse qualitative grades (1–5 scale), but its image quality and temporal fidelity scores were above 3.0 (adequate) and artifacts and noise scores were below 3.0 (moderate), suggesting that acceptable diagnostic image quality can be achieved. Additionally, both 8-fold accelerated real-time cine and breath-hold cine MRI yielded comparable LV function measurements, with coefficient of variation cine MRI with k-t SPARSE-SENSE is a promising modality for rapid imaging of myocardial function. PMID:22887290

  2. Highly accelerated real-time cardiac cine MRI using k-t SPARSE-SENSE.

    Science.gov (United States)

    Feng, Li; Srichai, Monvadi B; Lim, Ruth P; Harrison, Alexis; King, Wilson; Adluru, Ganesh; Dibella, Edward V R; Sodickson, Daniel K; Otazo, Ricardo; Kim, Daniel

    2013-07-01

    For patients with impaired breath-hold capacity and/or arrhythmias, real-time cine MRI may be more clinically useful than breath-hold cine MRI. However, commercially available real-time cine MRI methods using parallel imaging typically yield relatively poor spatio-temporal resolution due to their low image acquisition speed. We sought to achieve relatively high spatial resolution (∼2.5 × 2.5 mm(2)) and temporal resolution (∼40 ms), to produce high-quality real-time cine MR images that could be applied clinically for wall motion assessment and measurement of left ventricular function. In this work, we present an eightfold accelerated real-time cardiac cine MRI pulse sequence using a combination of compressed sensing and parallel imaging (k-t SPARSE-SENSE). Compared with reference, breath-hold cine MRI, our eightfold accelerated real-time cine MRI produced significantly worse qualitative grades (1-5 scale), but its image quality and temporal fidelity scores were above 3.0 (adequate) and artifacts and noise scores were below 3.0 (moderate), suggesting that acceptable diagnostic image quality can be achieved. Additionally, both eightfold accelerated real-time cine and breath-hold cine MRI yielded comparable left ventricular function measurements, with coefficient of variation cine MRI with k-t SPARSE-SENSE is a promising modality for rapid imaging of myocardial function. Copyright © 2012 Wiley Periodicals, Inc.

  3. Bayesian spatio-temporal analysis and geospatial risk factors of human monocytic ehrlichiosis.

    Directory of Open Access Journals (Sweden)

    Ram K Raghavan

    Full Text Available Variations in spatio-temporal patterns of Human Monocytic Ehrlichiosis (HME) infection in the state of Kansas, USA were examined and the relationship between HME relative risk and various environmental, climatic and socio-economic variables was evaluated. HME data used in the study were reported to the Kansas Department of Health and Environment between years 2005-2012, and geospatial variables representing the physical environment [National Land cover/Land use, NASA Moderate Resolution Imaging Spectroradiometer (MODIS)], climate [NASA MODIS, Prediction of Worldwide Renewable Energy (POWER)], and socio-economic conditions (US Census Bureau) were derived from publicly available sources. Following univariate screening of candidate variables using logistic regressions, two Bayesian hierarchical models were fit; a partial spatio-temporal model with random effects and a spatio-temporal interaction term, and a second model that included additional covariate terms. The best fitting model revealed that spatio-temporal autocorrelation in Kansas increased steadily from 2005-2012, and identified poverty status, relative humidity, and an interactive factor, 'diurnal temperature range x mixed forest area', as significant county-level risk factors for HME. The identification of significant spatio-temporal patterns and new risk factors is important in the context of HME prevention, for future research in the areas of ecology and evolution of HME, as well as for climate change impacts on tick-borne diseases.

  4. Real-time position reconstruction with hippocampal place cells.

    Science.gov (United States)

    Guger, Christoph; Gener, Thomas; Pennartz, Cyriel M A; Brotons-Mas, Jorge R; Edlinger, Günter; Bermúdez I Badia, S; Verschure, Paul; Schaffelhofer, Stefan; Sanchez-Vives, Maria V

    2011-01-01

    Brain-computer interfaces (BCIs) use the electroencephalogram, the electrocorticogram and trains of action potentials as inputs to analyze brain activity for communication purposes and/or the control of external devices. Thus far it is not known whether a BCI system can be developed that utilizes the states of brain structures that are situated well below the cortical surface, such as the hippocampus. In order to address this question we used the activity of hippocampal place cells (PCs) to predict the position of a rodent in real time. First, spike activity was recorded from the hippocampus of rats during foraging and analyzed off-line to optimize the spike sorting and position reconstruction algorithm. Then the spike activity was recorded and analyzed in real time. The rat was running in a box of 80 cm × 80 cm and its locomotor movement was captured with a video tracking system. Data were acquired to calculate the rat's trajectories and to identify place fields. Then a Bayesian classifier was trained to predict the position of the rat given its neural activity. This information was used in subsequent trials to predict the rat's position in real time. The real-time experiments were successfully performed and yielded an error between 12.2 and 17.4% using 5-6 neurons. It must be noted here that the encoding step was done with data recorded before the real-time experiment, and comparable accuracies between off-line (mean error of 15.9% for three rats) and real-time experiments (mean error of 14.7%) were achieved. The experiment shows proof of principle that position reconstruction can be done in real time, that PCs were stable and that spike sorting was robust enough to generalize from the training run to the real-time reconstruction phase of the experiment. Real-time reconstruction may be used for a variety of purposes, including creating behavioral-neuronal feedback loops or implementing neuroprosthetic control.
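
    As background for the Bayesian classifier mentioned above, the snippet below shows the standard Poisson place-cell decoder on a toy 1D track: with tuning curves f_i(x) and spike counts n_i in a window of length tau, the posterior over position is proportional to prod_i f_i(x)^n_i * exp(-tau * f_i(x)) under a flat prior. The tuning curves, window length and spike counts are invented, not the recorded data.

    import numpy as np

    positions = np.linspace(0.0, 80.0, 81)                  # candidate positions (cm)
    centers = np.array([10.0, 30.0, 50.0, 70.0])            # place-field centres (cm)
    tau = 0.25                                               # decoding window (s)
    rates = 20.0 * np.exp(-(positions[:, None] - centers) ** 2 / (2 * 8.0 ** 2)) + 0.1

    def decode(spike_counts):
        """Posterior over position given one window of spike counts (flat prior)."""
        log_like = np.log(tau * rates) @ spike_counts - tau * rates.sum(axis=1)
        post = np.exp(log_like - log_like.max())
        return post / post.sum()

    counts = np.array([0, 1, 4, 0])                          # spikes from the four cells
    post = decode(counts)
    print("decoded position (cm):", positions[post.argmax()])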

  5. Route around real time

    International Nuclear Information System (INIS)

    Terrier, Francois

    1996-01-01

    The ever greater autonomy and complexity demanded of control and command systems has led to work on introducing techniques such as Artificial Intelligence or concurrent object programming into industrial applications. However, while the critical nature of these systems requires control over the dynamics of the proposed solutions, their complexity often demands a high degree of adaptability to a partially modelled environment. The studies presented range from low-level control and command systems to more complex applications at higher levels, such as 'supervision systems'. Techniques such as temporal reasoning and uncertainty management are proposed for the former, while the latter are tackled with programming techniques based on the real-time object paradigm. The outcomes of this itinerary crystallize in the ACCORD project, which aims to manage, over the whole life cycle of a real-time application, these two sometimes antagonistic problematics: control of the dynamics and adaptivity. (author)

  6. Registration of global cardiac function with real-time trueFISP in one respiratory cycle

    International Nuclear Information System (INIS)

    Wintersperger, B.J.; Nikolaou, K.; Huber, A.; Dietrich, O.; Reiser, M.F.; Schoenberg, S.O.; Muehling, O.; Nittka, M.; Kiefer, B.

    2004-01-01

    Real-time multislice cine techniques lead to inaccurate results for ventricular volumes because of limited temporal resolution. The purpose of the study is to evaluate a real-time cine technique with parallel imaging algorithms in comparison to standard segmented techniques. Twelve patients underwent cardiac cine MRI using real-time multislice cine trueFISP. Temporal resolution was improved using parallel acquisition techniques (iPAT) and data acquisition was performed in a single breath-hold along the patients' short axis. Evaluation of EDV, ESV, EF and myocardial mass was performed and results compared to a standard segmented single-slice cine trueFISP. Combination of real-time cine trueFISP and iPAT provided a temporal resolution of 48 ms. Results of the multislice approach showed an excellent correlation to standard single-slice trueFISP for EDV (0.94, p

  7. Bayesian networks with examples in R

    CERN Document Server

    Scutari, Marco

    2014-01-01

    Introduction. The Discrete Case: Multinomial Bayesian Networks. The Continuous Case: Gaussian Bayesian Networks. More Complex Cases. Theory and Algorithms for Bayesian Networks. Real-World Applications of Bayesian Networks. Appendices. Bibliography.

  8. Temporal and spatial variabilities of Antarctic ice mass changes inferred by GRACE in a Bayesian framework

    Science.gov (United States)

    Wang, L.; Davis, J. L.; Tamisiea, M. E.

    2017-12-01

    The Antarctic ice sheet (AIS) holds about 60% of all fresh water on the Earth, an amount equivalent to about 58 m of sea-level rise. Observation of AIS mass change is thus essential in determining and predicting its contribution to sea level. While the ice mass loss estimates for West Antarctica (WA) and the Antarctic Peninsula (AP) are in good agreement, what the mass balance over East Antarctica (EA) is, and whether or not it compensates for the mass loss is under debate. Besides the different error sources and sensitivities of different measurement types, complex spatial and temporal variabilities would be another factor complicating the accurate estimation of the AIS mass balance. Therefore, a model that allows for variabilities in both melting rate and seasonal signals would seem appropriate in the estimation of present-day AIS melting. We present a stochastic filter technique, which enables the Bayesian separation of the systematic stripe noise and mass signal in decade-length GRACE monthly gravity series, and allows the estimation of time-variable seasonal and inter-annual components in the signals. One of the primary advantages of this Bayesian method is that it yields statistically rigorous uncertainty estimates reflecting the inherent spatial resolution of the data. By applying the stochastic filter to the decade-long GRACE observations, we present the temporal variabilities of the AIS mass balance at basin scale, particularly over East Antarctica, and decipher the EA mass variations in the past decade, and their role in affecting overall AIS mass balance and sea level.

  9. Faster-Than-Real-Time Simulation of Lithium Ion Batteries with Full Spatial and Temporal Resolution

    Directory of Open Access Journals (Sweden)

    Sandip Mazumder

    2013-01-01

    Full Text Available A one-dimensional coupled electrochemical-thermal model of a lithium ion battery with full temporal and normal-to-electrode spatial resolution is presented. Only a single pair of electrodes is considered in the model. It is shown that simulation of a lithium ion battery with the inclusion of detailed transport phenomena and electrochemistry is possible with faster-than-real-time compute times. The governing conservation equations of mass, charge, and energy are discretized using the finite volume method and solved using an iterative procedure. The model is first successfully validated against experimental data for both charge and discharge processes in a LixC6-LiyMn2O4 battery. Finally, it is demonstrated for an arbitrary rapidly changing transient load typical of a hybrid electric vehicle drive cycle. The model is able to predict the cell voltage of a 15-minute drive cycle in less than 12 seconds of compute time on a laptop with a 2.33 GHz Intel Pentium 4 processor.

  10. Validation of Magnetic Reconstruction Codes for Real-Time Applications

    International Nuclear Information System (INIS)

    Mazon, D.; Murari, A.; Boulbe, C.; Faugeras, B.; Blum, J.; Svensson, J.; Quilichini, T.; Gelfusa, M.

    2010-01-01

    The real-time reconstruction of the plasma magnetic equilibrium in a tokamak is a key point to access high-performance regimes. Indeed, the shape of the plasma current density profile is a direct output of the reconstruction and has a leading effect for reaching a steady-state high-performance regime of operation. The challenge is thus to develop real-time methods and algorithms that reconstruct the magnetic equilibrium from the perspective of using these outputs for feedback control purposes. In this paper the validation of the JET real-time equilibrium reconstruction codes using both a Bayesian approach and a full equilibrium solver named Equinox will be detailed, the comparison being performed with the off-line equilibrium code EFIT (equilibrium fitting) or the real-time boundary reconstruction code XLOC (X-point local expansion). In this way a significant database, a methodology, and a strategy for the validation are presented. The validation of the results has been performed using a validated database of 130 JET discharges with a large variety of magnetic configurations. Internal measurements like polarimetry and motional Stark effect have been also used for the Equinox validation including some magnetohydrodynamic signatures for the assessment of the reconstructed safety profile and current density. (authors)

  11. Task Mapping and Partition Allocation for Mixed-Criticality Real-Time Systems

    DEFF Research Database (Denmark)

    Tamas-Selicean, Domitian; Pop, Paul

    2012-01-01

    In this paper we address the mapping of mixed-criticality hard real-time applications on distributed embedded architectures. We assume that the architecture provides both spatial and temporal partitioning, thus enforcing enough separation between applications. With temporal partitioning, each...

  12. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this end, we consider eight properties such as the number of active nodes, average degree, clustering coefficient, etc., and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
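
    The forecasting step described above (fitting an ARIMA model to a per-snapshot network property and predicting its next values) can be sketched as follows. The series is synthetic stand-in data for a property such as the number of active nodes, and the ARIMA order is an arbitrary choice rather than the one selected by the authors.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(42)
    # Synthetic per-snapshot counts of active nodes: slow trend plus noise.
    series = 200.0 + np.cumsum(rng.normal(0.5, 3.0, 120))

    model = ARIMA(series, order=(2, 1, 1)).fit()
    print(model.forecast(steps=5))   # forecast of the property for the next 5 snapshots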

  13. Near Real Time Change-Point detection in Optical and Thermal Infrared Time Series Using Bayesian Inference over the Dry Chaco Forest

    Science.gov (United States)

    Barraza Bernadas, V.; Grings, F.; Roitberg, E.; Perna, P.; Karszenbaum, H.

    2017-12-01

    The Dry Chaco region (DCF) has the highest absolute deforestation rates of all Argentinian forests. The most recent report indicates a current deforestation rate of 200,000 ha year-1. In order to better monitor this process, DCF was chosen to implement an early warning program for illegal deforestation. Although the area is intensively studied using medium resolution imagery (Landsat), the products obtained have a yearly pace and are therefore unsuited for an early warning program. In this paper, we evaluated the performance of an online Bayesian change-point detection algorithm for MODIS Enhanced Vegetation Index (EVI) and Land Surface Temperature (LST) datasets. The goal was to monitor the abrupt changes in vegetation dynamics associated with deforestation events. We tested this model by simulating 16-day EVI and 8-day LST time series with varying amounts of seasonality, noise, and length of the time series, and by adding abrupt changes with different magnitudes. This model was then tested on real satellite time series available through the Google Earth Engine, over a pilot area in DCF, where deforestation was common in the 2004-2016 period. A comparison with yearly benchmark products based on Landsat images is also presented (REDAF dataset). The results show the advantages of using an automatic model to detect change-points in the time series rather than using only visual inspection techniques. Simulating time series with varying amounts of seasonality and noise, and adding abrupt changes at different times and magnitudes, revealed that this model is robust against noise and is not influenced by changes in the amplitude of the seasonal component. Furthermore, the results compared favorably with the REDAF dataset (near 65% agreement). These results show the potential of combining LST and EVI to identify deforestation events. This work is being developed within the frame of the national Forest Law for the protection and sustainable development of Native Forest in Argentina.
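
    A minimal sketch of Bayesian online change-point detection in the spirit of the algorithm evaluated above (Adams and MacKay style), applied to a synthetic EVI-like series with one abrupt drop. Seasonality is ignored, the observation variance and hazard rate are assumed known, and all numbers are invented; a drop in the maximum a posteriori run length marks a detected change.

    import numpy as np
    from scipy.stats import norm

    def bocpd_map_runlength(x, hazard=1/50, mu0=0.4, var0=1.0, var_x=0.0025):
        """Return the MAP run length after each observation."""
        n = len(x)
        R = np.zeros((n + 1, n + 1)); R[0, 0] = 1.0          # run-length posterior
        mu, var = np.array([mu0]), np.array([var0])          # per-run-length posteriors
        for t, xt in enumerate(x):
            pred = norm.pdf(xt, mu, np.sqrt(var + var_x))    # predictive per run length
            R[t + 1, 1:t + 2] = R[t, :t + 1] * pred * (1 - hazard)   # runs grow
            R[t + 1, 0] = np.sum(R[t, :t + 1] * pred * hazard)       # a change occurs
            R[t + 1] /= R[t + 1].sum()
            var_new = 1.0 / (1.0 / var + 1.0 / var_x)        # conjugate Normal update
            mu_new = var_new * (mu / var + xt / var_x)
            mu, var = np.append(mu0, mu_new), np.append(var0, var_new)
        return R[1:].argmax(axis=1)

    rng = np.random.default_rng(0)
    evi = np.concatenate([rng.normal(0.55, 0.05, 60),        # intact forest
                          rng.normal(0.25, 0.05, 40)])       # after clearing
    map_run = bocpd_map_runlength(evi)
    print("change detected near step:", int(np.argmin(np.diff(map_run))) + 1)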

  14. How to detect the location and time of a covert chemical attack a Bayesian approach

    OpenAIRE

    See, Mei Eng Elaine.

    2009-01-01

    Approved for public release; distribution unlimited. In this thesis, we develop a Bayesian updating model that estimates the location and time of a chemical attack using inputs from chemical sensors and Atmospheric Threat and Dispersion (ATD) models. In bridging the critical gap between raw sensor data and threat evaluation and prediction, the model will help authorities perform better hazard prediction and damage control. The model is evaluated with respect to settings representing real-wo...

  15. Monitoring Murder Crime in Namibia Using Bayesian Space-Time Models

    Directory of Open Access Journals (Sweden)

    Isak Neema

    2012-01-01

    Full Text Available This paper focuses on the analysis of murder in Namibia using a Bayesian spatial smoothing approach with temporal trends. The analysis was based on the reported cases from the 13 regions of Namibia for the period 2002–2006, complemented with regional population sizes. The evaluated random effects include space-time structured heterogeneity measuring the effect of regional clustering, unstructured heterogeneity, time, space and time interaction, and population density. The model consists of carefully chosen prior and hyper-prior distributions for parameters and hyper-parameters, with inference conducted using a Gibbs sampling algorithm and a sensitivity test for model validation. The posterior mean estimates of the parameters from the model, using DIC as the model selection criterion, show that most of the variation in the relative risk of murder is due to regional clustering, while the effects of population density and time were insignificant. The sensitivity analysis indicates that both the intrinsic and Laplace CAR priors can be adopted as prior distributions for the space-time heterogeneity. In addition, the relative risk map shows a risk structure with an increasing north-south gradient, pointing to low risk in the northern regions of Namibia, while the Karas and Khomas regions experience a long-term increase in murder risk.

  16. Real-time individualized training vectors for experiential learning.

    Energy Technology Data Exchange (ETDEWEB)

    Willis, Matt; Tucker, Eilish Marie; Raybourn, Elaine Marie; Glickman, Matthew R.; Fabian, Nathan

    2011-01-01

    Military training utilizing serious games or virtual worlds potentially generate data that can be mined to better understand how trainees learn in experiential exercises. Few data mining approaches for deployed military training games exist. Opportunities exist to collect and analyze these data, as well as to construct a full-history learner model. Outcomes discussed in the present document include results from a quasi-experimental research study on military game-based experiential learning, the deployment of an online game for training evidence collection, and results from a proof-of-concept pilot study on the development of individualized training vectors. This Lab Directed Research & Development (LDRD) project leveraged products within projects, such as Titan (Network Grand Challenge), Real-Time Feedback and Evaluation System, (America's Army Adaptive Thinking and Leadership, DARWARS Ambush! NK), and Dynamic Bayesian Networks to investigate whether machine learning capabilities could perform real-time, in-game similarity vectors of learner performance, toward adaptation of content delivery, and quantitative measurement of experiential learning.

  17. A Flattened Hierarchical Scheduler for Real-Time Virtual Machines

    OpenAIRE

    Drescher, Michael Stuart

    2015-01-01

    The recent trend of migrating legacy computer systems to a virtualized, cloud-based environment has expanded to real-time systems. Unfortunately, modern hypervisors have no mechanism in place to guarantee the real-time performance of applications running on virtual machines. Past solutions to this problem rely on either spatial or temporal resource partitioning, both of which under-utilize the processing capacity of the host system. Paravirtualized solutions in which the guest communicates it...

  18. Real time urbanism

    Directory of Open Access Journals (Sweden)

    Ana Ruiz Varona

    2012-12-01

    Full Text Available Nowadays, given the technological revolution of the information society, the administrative management of cities faces a new problem related less to the design of urban space than to the capacity to control and measure the process of direct and centralized production of the city by non-homogeneous social multitudes, in a hyper-accelerated time tending towards instantaneity. Against libertarian apologies for the new “participative urbanisms”, the article puts forward a discourse that shows the losses associated with this new problem of temporal instantaneity. In this regard, we call for new processes of mediation that allow administrations and urbanists to monitor the production of the city. To that end, a prior and necessary step will be the redefinition of the role of a new real-time urbanist.

  19. A Bayesian framework to estimate diversification rates and their variation through time and space

    Directory of Open Access Journals (Sweden)

    Silvestro Daniele

    2011-10-01

    Full Text Available Background: Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results: We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions: Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling.

  20. Calibrated birth-death phylogenetic time-tree priors for bayesian inference.

    Science.gov (United States)

    Heled, Joseph; Drummond, Alexei J

    2015-05-01

    Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  1. Robust seismicity forecasting based on Bayesian parameter estimation for epidemiological spatio-temporal aftershock clustering models.

    Science.gov (United States)

    Ebrahimian, Hossein; Jalayer, Fatemeh

    2017-08-29

    In the immediate aftermath of a strong earthquake and in the presence of an ongoing aftershock sequence, scientific advisories in terms of seismicity forecasts play quite a crucial role in emergency decision-making and risk mitigation. Epidemic Type Aftershock Sequence (ETAS) models are frequently used for forecasting the spatio-temporal evolution of seismicity in the short term. We propose robust forecasting of seismicity based on the ETAS model, by exploiting the link between Bayesian inference and Markov Chain Monte Carlo simulation. The methodology considers the uncertainty not only in the model parameters, conditioned on the available catalogue of events that occurred before the forecasting interval, but also in the sequence of events that are going to happen during the forecasting interval. We demonstrate the methodology by retrospective early forecasting of the seismicity associated with the 2016 Amatrice seismic sequence in central Italy. We provide robust spatio-temporal short-term seismicity forecasts with various time intervals in the first few days elapsed after each of the three main events within the sequence, which can predict the seismicity within plus/minus two standard deviations from the mean estimate within the few hours elapsed after the main event.
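
    For readers unfamiliar with ETAS, the sketch below evaluates the purely temporal conditional intensity lambda(t) = mu + sum_{t_i < t} K * exp(alpha * (m_i - M0)) / (t - t_i + c)^p, which is the backbone of such short-term forecasts (the spatial kernel is omitted). The parameter values and the tiny catalogue are illustrative, not values fitted to the Amatrice sequence.

    import numpy as np

    def etas_intensity(t, event_times, event_mags, mu=0.2, K=0.05,
                       alpha=1.2, c=0.01, p=1.1, M0=3.0):
        """Expected events per day at time t, given the past catalogue."""
        past = event_times < t
        dt = t - event_times[past]
        return mu + np.sum(K * np.exp(alpha * (event_mags[past] - M0)) / (dt + c) ** p)

    times = np.array([0.0, 0.3, 1.2])        # days since the mainshock
    mags = np.array([6.0, 4.5, 4.0])         # magnitudes of the past events
    print("events/day at t = 2 days:", round(float(etas_intensity(2.0, times, mags)), 3))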

  2. Bayesian spatio-temporal discard model in a demersal trawl fishery

    Science.gov (United States)

    Grazia Pennino, M.; Muñoz, Facundo; Conesa, David; López-Quílez, Antonio; Bellido, José M.

    2014-07-01

    Spatial management of discards has recently been proposed as a useful tool for the protection of juveniles, by reducing discard rates and can be used as a buffer against management errors and recruitment failure. In this study Bayesian hierarchical spatial models have been used to analyze about 440 trawl fishing operations of two different metiers, sampled between 2009 and 2012, in order to improve our understanding of factors that influence the quantity of discards and to identify their spatio-temporal distribution in the study area. Our analysis showed that the relative importance of each variable was different for each metier, with a few similarities. In particular, the random vessel effect and seasonal variability were identified as main driving variables for both metiers. Predictive maps of the abundance of discards and maps of the posterior mean of the spatial component show several hot spots with high discard concentration for each metier. We argue how the seasonal/spatial effects, and the knowledge about the factors influential to discarding, could potentially be exploited as potential mitigation measures for future fisheries management strategies. However, misidentification of hotspots and uncertain predictions can culminate in inappropriate mitigation practices which can sometimes be irreversible. The proposed Bayesian spatial method overcomes these issues, since it offers a unified approach which allows the incorporation of spatial random-effect terms, spatial correlation of the variables and the uncertainty of the parameters in the modeling process, resulting in a better quantification of the uncertainty and accurate predictions.

  3. Formal Verification of User-Level Real-Time Property Patterns

    OpenAIRE

    Ge , Ning; Pantel , Marc; Dal Zilio , Silvano

    2017-01-01

    International audience; To ease the expression of real-time requirements, Dwyer, and then Konrad, studied a large collection of existing systems in order to identify a set of real-time property patterns covering most of the useful use cases. The goal was to provide a set of reusable patterns that system designers can instantiate to express requirements instead of using complex temporal logic formulas. A limitation of this approach is that the choice of patterns is more oriented towards expres...

  4. Using a Bayesian Probabilistic Forecasting Model to Analyze the Uncertainty in Real-Time Dynamic Control of the Flood Limiting Water Level for Reservoir Operation

    DEFF Research Database (Denmark)

    Liu, Dedi; Li, Xiang; Guo, Shenglian

    2015-01-01

    Dynamic control of the flood limiting water level (FLWL) is a valuable and effective way to maximize the benefits from reservoir operation without exceeding the design risk. In order to analyze the impacts of input uncertainty, a Bayesian forecasting system (BFS) is adopted. Applying quantile water inflow values and their uncertainties obtained from the BFS, the reservoir operation results from different schemes can be analyzed in terms of benefits, dam safety, and downstream impacts during the flood season. When the reservoir FLWL dynamic control operation is implemented, there are two fundamental......, also deterministic water inflow was tested. The proposed model in the paper emphasizes the importance of analyzing the uncertainties of the water inflow forecasting system for real-time dynamic control of the FLWL for reservoir operation. For the case study, the selected quantile inflow from

  5. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. Examples of such applications are software-defined radio applications. These applications typically

  6. Real-Time Audio Processing on the T-CREST Multicore Platform

    DEFF Research Database (Denmark)

    Ausin, Daniel Sanz; Pezzarossa, Luca; Schoeberl, Martin

    2017-01-01

    Multicore platforms are nowadays widely used for audio processing applications, due to the improvement of computational power that they provide. However, some of these systems are not optimized for temporally constrained environments, which often leads to an undesired increase in the latency of the audio signal. This paper presents a real-time multicore audio processing system based on the T-CREST platform. T-CREST is a time-predictable multicore processor for real-time embedded systems. Multiple audio effect tasks have been implemented, which can be connected together in different configurations forming sequential and parallel effect chains, and using a network-on-chip for intercommunication between processors. The evaluation of the system shows that real-time processing of multiple effect configurations is possible, and that the estimation and control of latency ensures real-time behavior.

  7. Real-time systems

    OpenAIRE

    Badr, Salah M.; Bruztman, Donald P.; Nelson, Michael L.; Byrnes, Ronald Benton

    1992-01-01

    This paper presents an introduction to the basic issues involved in real-time systems. Both real-time operating systems and real-time programming languages are explored. Concurrent programming and process synchronization and communication are also discussed. The real-time requirements of the Naval Postgraduate School Autonomous Underwater Vehicle (AUV) are then examined. Autonomous underwater vehicle (AUV), hard real-time system, real-time operating system, real-time programming language, real-time sy...

  8. Real-time hypothesis driven feature extraction on parallel processing architectures

    DEFF Research Database (Denmark)

    Granmo, O.-C.; Jensen, Finn Verner

    2002-01-01

    ... extraction, which selectively extract relevant features one-by-one, have in some cases achieved real-time performance on single processing element architectures. In this paper we propose a novel technique which combines the above two approaches. Features are selectively extracted in parallelizable sets ... the problem of higher-order feature-content/feature-feature correlation, causally complexly interacting features are identified through Bayesian network d-separation analysis and combined into joint features. When used on a moderately complex object-tracking case, the technique is able to select ...

  9. Real-time change detection in data streams with FPGAs

    International Nuclear Information System (INIS)

    Vega, J.; Dormido-Canto, S.; Cruz, T.; Ruiz, M.; Barrera, E.; Castro, R.; Murari, A.; Ochando, M.

    2014-01-01

    Highlights: • Automatic recognition of changes in data streams of multidimensional signals. • Detection algorithm based on testing exchangeability on-line. • Real-time and off-line applicability. • Real-time implementation in FPGAs. - Abstract: The automatic recognition of changes in data streams is useful in both real-time and off-line data analyses. This article shows several effective change-detecting algorithms (based on martingales) and describes their real-time applicability in the data acquisition systems through the use of Field Programmable Gate Arrays (FPGA). The automatic event recognition system is absolutely general and it does not depend on either the particular event to detect or the specific data representation (waveforms, images or multidimensional signals). The developed approach provides good results for change detection in both the temporal evolution of profiles and the two-dimensional spatial distribution of volume emission intensity. The average computation time in the FPGA is 210 μs per profile
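
    One family of martingale tests of exchangeability of the kind cited above builds a "power martingale" from conformal p-values and raises an alarm when it grows large; the sketch below is a simplified software illustration (the strangeness measure, epsilon and threshold are arbitrary choices), not the FPGA implementation described in the article.

```python
import numpy as np

def power_martingale(stream, epsilon=0.92, threshold=20.0, seed=None):
    """Flag changes when a conformal power martingale exceeds a threshold.

    Strangeness of each point = absolute distance to the mean of all points
    seen so far (a deliberately simple choice for illustration).
    """
    rng = np.random.default_rng(seed)
    seen, strang, M, alarms = [], [], 1.0, []
    for i, x in enumerate(stream):
        seen.append(x)
        s = abs(x - np.mean(seen))
        strang.append(s)
        # conformal p-value: fraction of strangeness values >= current one,
        # with random tie-breaking
        greater = sum(v > s for v in strang)
        equal = sum(v == s for v in strang)
        p = (greater + rng.uniform() * equal) / len(strang)
        M *= epsilon * p ** (epsilon - 1.0)   # power martingale update
        if M > threshold:
            alarms.append(i)
            M = 1.0                           # restart after an alarm
    return alarms

data = np.concatenate([np.random.normal(0, 1, 200),
                       np.random.normal(4, 1, 200)])
print(power_martingale(data))
```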

  10. Using real-time PCR and Bayesian analysis to distinguish susceptible tubificid taxa important in the transmission of Myxobolus cerebralis, the cause of salmonid whirling disease.

    Science.gov (United States)

    Fytilis, Nikolaos; Rizzo, Donna M; Lamb, Ryan D; Kerans, Billie L; Stevens, Lori

    2013-05-01

    Aquatic oligochaetes have long been appreciated for their value in assessing habitat quality because they are ubiquitous sediment-dwelling filter feeders. Many oligochaete taxa are also important in the transmission of fish diseases. Distinguishing resistant and susceptible taxa is important for managing fish disease, yet challenging in practice. Tubifex tubifex (Oligochaeta: Tubificidae) is the definitive host for the complex life-cycle parasite, Myxobolus cerebralis, the causative agent of salmonid whirling disease. We developed two hydrolysis probe-based qualitative real-time PCR (qPCR) multiplex assays that distinguish among tubificid taxa collected from the Madison River, Montana, USA. The first assay distinguishes T. tubifex from Rhyacodrilus spp.; while the second classifies T. tubifex identified by the first assay into two genetic lineages (I and III). Specificity and sensitivity were optimized for each assay; the two assays showed specificity of 94.3% and 98.6% for the target oligochaetes, respectively. DNA sequencing verified the results. The development of these assays allowed us to more fully describe tubificid community composition (the taxa and their abundance at a site) and estimate the relative abundances of host taxa. To relate tubificid relative abundance to fish disease risk, we determined M. cerebralis infection prevalence in samples identified as T. tubifex using similar molecular techniques. Given prior information (i.e., morphological identification of sexually mature worms), Bayesian analysis inferred that the first qPCR assay improved taxonomic identification. Bayesian inference of the relative abundance of T. tubifex, combined with infection assay results, identified sites with a high prevalence of infected T. tubifex. To our knowledge, this study represents both the first assessment of oligochaete community composition using a qPCR assay based on fluorescent probes and the first use of Bayesian analysis to fully characterize the dominant

  11. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionated data. Among the various social media, Twitter has gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues in different domains, such as the 'Political', 'Entertainment' and 'Business' domains. As the data are generated at scale in the cloud, this has opened doors for researchers in the field of data science and analysis. Twitter provides several APIs for developers: 1) the Search API, which focuses on older tweets; 2) the REST API, which focuses on user details and allows collection of user profiles, friends and followers; and 3) the Streaming API, which collects details such as tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. We focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 is currently the trending ongoing event. We collect this large volume of tweets and store them in a MongoDB database in JSON document format. On these documents we perform time-series analysis and term frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objectives of finding interesting moments in the temporal data of the event and of ranking players or teams by popularity, helping people understand key influencers on the social media platform.
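
    As a minimal, library-light sketch of the two analysis steps described (per-minute tweet volume for "interesting moments" and term frequency for popularity ranking), the snippet below works on already-collected (timestamp, text) pairs; the sample tweets and field layout are assumptions, and no Twitter or MongoDB API calls are made.

```python
from collections import Counter
from datetime import datetime
import re

# assumed input: tweets already fetched via the Streaming API and stored,
# reduced here to (timestamp, text) pairs
tweets = [
    ("2017-05-10 19:31:02", "What a catch! #IPL"),
    ("2017-05-10 19:31:45", "Great over by the bowler #IPL"),
    ("2017-05-10 19:32:10", "That six decided the match #IPL"),
]

per_minute = Counter()
terms = Counter()
for ts, text in tweets:
    minute = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").strftime("%H:%M")
    per_minute[minute] += 1                      # temporal volume
    for token in re.findall(r"[#@\w]+", text.lower()):
        terms[token] += 1                        # term frequency

print(per_minute.most_common())   # spikes suggest "interesting moments"
print(terms.most_common(5))       # most frequent terms/hashtags
```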

  12. Key Technologies and Applications of Satellite and Sensor Web-coupled Real-time Dynamic Web Geographic Information System

    Directory of Open Access Journals (Sweden)

    CHEN Nengcheng

    2017-10-01

    Full Text Available The geo-spatial information service has long failed to reflect live conditions on the spot or to meet the needs of integrated monitoring and real-time information. To tackle the problems of observation sharing, integrated management of space-borne, air-borne, and ground-based platforms, and efficient service of spatio-temporal information, an observation sharing model was proposed. The key technologies in real-time dynamic geographical information systems (GIS), including maximum spatio-temporal coverage-based optimal layout of the earth-observation sensor Web, task-driven and feedback-based control, real-time access to streaming observations, dynamic simulation, warning and decision support, were detailed. A real-time dynamic Web geographical information system (WebGIS) named GeoSensor and its applications in sensing and management of spatio-temporal information of the Yangtze River basin, including navigation, flood prevention, and power generation, were also introduced.

  13. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The identification of important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering important nodes than the common aggregating method.
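
    The basic building block the method iterates on, the leading eigenvector of a (layer) adjacency matrix, can be computed by power iteration; the sketch below shows only that step on a toy static network, with the supra-evolution construction itself omitted.

```python
import numpy as np

def eigenvector_centrality(A, tol=1e-10, max_iter=1000):
    """Leading-eigenvector centrality of a non-negative adjacency matrix."""
    n = A.shape[0]
    x = np.ones(n) / n
    for _ in range(max_iter):
        x_new = A @ x
        x_new /= np.linalg.norm(x_new)     # keep the iterate normalized
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# toy undirected network with one well-connected node (node 0)
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
print(eigenvector_centrality(A))
```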

  14. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  15. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Science.gov (United States)

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interface between software components are particularly important due to the large number of complex interactions, which requires the traditional modeling languages to overcome the existing shortcomings in the aspects of temporal information description and software testing input controlling. This paper presents the real-time extended interface automata (RTEIA) which adds clearer and more detailed temporal information description by the application of time words. We also establish the input interface automaton for every input in order to solve the problems of input controlling and interface covering nimbly when applied in the software testing field. Detailed definitions of the RTEIA and the testing cases generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of one real aircraft braking system. PMID:24892080

  16. A Bayesian Approach to Integrate Real-Time Data into Probabilistic Risk Analysis of Remediation Efforts in NAPL Sites

    Science.gov (United States)

    Fernandez-Garcia, D.; Sanchez-Vila, X.; Bolster, D.; Tartakovsky, D. M.

    2010-12-01

    treatment technology is then updated given the observed real-time measurements of concentrations at nearby monitoring wells. Thus, the methodology allows combining the probability of failure of a remediation effort due to multiple causes; each one associated to different potential pathways and receptors, and provides a way of integrating real-time measurements into PRA analysis.
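
    The Bayes Monte Carlo style of update sketched in the abstract (reweighting an ensemble of transport-model realizations by the likelihood of real-time concentration measurements, then recomputing the probability of failure) can be illustrated as follows; the synthetic ensemble, Gaussian error model and failure threshold are all assumptions for demonstration, not the site-specific models of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# prior ensemble: each realization predicts concentrations at a monitoring
# well (4 sampling times) and a downstream peak concentration (failure metric)
n = 5000
well_pred = rng.lognormal(mean=0.0, sigma=0.8, size=(n, 4))      # mg/L
peak_downstream = well_pred.max(axis=1) * rng.uniform(0.5, 1.5, size=n)

observed = np.array([1.2, 1.5, 1.1, 0.9])    # real-time well measurements
sigma_meas = 0.3                             # assumed measurement error (mg/L)

# Gaussian likelihood of the observations under each realization
log_like = -0.5 * ((well_pred - observed) ** 2 / sigma_meas**2).sum(axis=1)
w = np.exp(log_like - log_like.max())
w /= w.sum()

threshold = 2.0                              # assumed regulatory limit
prior_pof = np.mean(peak_downstream > threshold)
posterior_pof = np.sum(w * (peak_downstream > threshold))
print(f"P(failure) prior: {prior_pof:.3f}  posterior: {posterior_pof:.3f}")
```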

  17. Evaluation of highly accelerated real-time cardiac cine MRI in tachycardia.

    Science.gov (United States)

    Bassett, Elwin C; Kholmovski, Eugene G; Wilson, Brent D; DiBella, Edward V R; Dosdall, Derek J; Ranjan, Ravi; McGann, Christopher J; Kim, Daniel

    2014-02-01

    Electrocardiogram (ECG)-gated breath-hold cine MRI is considered to be the gold standard test for the assessment of cardiac function. However, it may fail in patients with arrhythmia, impaired breath-hold capacity and poor ECG gating. Although ungated real-time cine MRI may mitigate these problems, commercially available real-time cine MRI pulse sequences using parallel imaging typically yield relatively poor spatiotemporal resolution because of their low image acquisition efficiency. As an extension of our previous work, the purpose of this study was to evaluate the diagnostic quality and accuracy of eight-fold-accelerated real-time cine MRI with compressed sensing (CS) for the quantification of cardiac function in tachycardia, where it is challenging for real-time cine MRI to provide sufficient spatiotemporal resolution. We evaluated the performances of eight-fold-accelerated cine MRI with CS, three-fold-accelerated real-time cine MRI with temporal generalized autocalibrating partially parallel acquisitions (TGRAPPA) and ECG-gated breath-hold cine MRI in 21 large animals with tachycardia (mean heart rate, 104 beats per minute) at 3T. For each cine MRI method, two expert readers evaluated the diagnostic quality in four categories (image quality, temporal fidelity of wall motion, artifacts and apparent noise) using a Likert scale (1-5, worst to best). One reader evaluated the left ventricular functional parameters. The diagnostic quality scores were significantly different between the three cine pulse sequences, except for the artifact level between CS and TGRAPPA real-time cine MRI. Both ECG-gated breath-hold cine MRI and eight-fold accelerated real-time cine MRI yielded all four scores of ≥ 3.0 (acceptable), whereas three-fold-accelerated real-time cine MRI yielded all scores below 3.0, except for artifact (3.0). The left ventricular ejection fraction (LVEF) measurements agreed better between ECG-gated cine MRI and eight-fold-accelerated real-time cine MRI

  18. SU-C-201-04: Noise and Temporal Resolution in a Near Real-Time 3D Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Rilling, M [Department of physics, engineering physics and optics, Universite Laval, Quebec City, QC (Canada); Centre de recherche sur le cancer, Universite Laval, Quebec City, QC (Canada); Radiation oncology department, CHU de Quebec, Quebec City, QC (Canada); Center for optics, photonics and lasers, Universite Laval, Quebec City, Quebec (Canada); Goulet, M [Radiation oncology department, CHU de Quebec, Quebec City, QC (Canada); Beaulieu, L; Archambault, L [Department of physics, engineering physics and optics, Universite Laval, Quebec City, QC (Canada); Centre de recherche sur le cancer, Universite Laval, Quebec City, QC (Canada); Radiation oncology department, CHU de Quebec, Quebec City, QC (Canada); Thibault, S [Center for optics, photonics and lasers, Universite Laval, Quebec City, Quebec (Canada)

    2016-06-15

    Purpose: To characterize the performance of a real-time three-dimensional scintillation dosimeter in terms of signal-to-noise ratio (SNR) and temporal resolution of 3D dose measurements. This study quantifies its efficiency in measuring low dose levels characteristic of EBRT dynamic treatments, and in reproducing field profiles for varying multileaf collimator (MLC) speeds. Methods: The dosimeter prototype uses a plenoptic camera to acquire continuous images of the light field emitted by a 10×10×10 cm³ plastic scintillator. Using EPID acquisitions, ray tracing-based iterative tomographic algorithms allow millimeter-sized reconstruction of relative 3D dose distributions. Measurements were taken at 6MV, 400 MU/min with the scintillator centered at the isocenter, first receiving doses from 1.4 to 30.6 cGy. Dynamic measurements were then performed by closing half of the MLCs at speeds of 0.67 to 2.5 cm/s, at 0° and 90° collimator angles. A reference static half-field was obtained for measured profile comparison. Results: The SNR steadily increases as a function of dose and reaches a clinically adequate plateau of 80 at 10 cGy. Below this, the decrease in light collected and increase in pixel noise diminishes the SNR; nonetheless, the EPID acquisitions and the voxel correlation employed in the reconstruction algorithms result in suitable SNR values (>75) even at low doses. For dynamic measurements at varying MLC speeds, central relative dose profiles are characterized by gradients at %D₅₀ of 8.48 to 22.7 %/mm. These values converge towards the 32.8 %/mm-gradient measured for the static reference field profile, but are limited by the dosimeter’s current acquisition rate of 1Hz. Conclusion: This study emphasizes the efficiency of the 3D dose distribution reconstructions, while identifying limits of the current prototype’s temporal resolution in terms of dynamic EBRT parameters. This work paves the way for providing an optimized, second

  19. SU-C-201-04: Noise and Temporal Resolution in a Near Real-Time 3D Dosimeter

    International Nuclear Information System (INIS)

    Rilling, M; Goulet, M; Beaulieu, L; Archambault, L; Thibault, S

    2016-01-01

    Purpose: To characterize the performance of a real-time three-dimensional scintillation dosimeter in terms of signal-to-noise ratio (SNR) and temporal resolution of 3D dose measurements. This study quantifies its efficiency in measuring low dose levels characteristic of EBRT dynamic treatments, and in reproducing field profiles for varying multileaf collimator (MLC) speeds. Methods: The dosimeter prototype uses a plenoptic camera to acquire continuous images of the light field emitted by a 10×10×10 cm³ plastic scintillator. Using EPID acquisitions, ray tracing-based iterative tomographic algorithms allow millimeter-sized reconstruction of relative 3D dose distributions. Measurements were taken at 6MV, 400 MU/min with the scintillator centered at the isocenter, first receiving doses from 1.4 to 30.6 cGy. Dynamic measurements were then performed by closing half of the MLCs at speeds of 0.67 to 2.5 cm/s, at 0° and 90° collimator angles. A reference static half-field was obtained for measured profile comparison. Results: The SNR steadily increases as a function of dose and reaches a clinically adequate plateau of 80 at 10 cGy. Below this, the decrease in light collected and increase in pixel noise diminishes the SNR; nonetheless, the EPID acquisitions and the voxel correlation employed in the reconstruction algorithms result in suitable SNR values (>75) even at low doses. For dynamic measurements at varying MLC speeds, central relative dose profiles are characterized by gradients at %D₅₀ of 8.48 to 22.7 %/mm. These values converge towards the 32.8 %/mm-gradient measured for the static reference field profile, but are limited by the dosimeter’s current acquisition rate of 1Hz. Conclusion: This study emphasizes the efficiency of the 3D dose distribution reconstructions, while identifying limits of the current prototype’s temporal resolution in terms of dynamic EBRT parameters. This work paves the way for providing an optimized, second-generational real-time 3D

  20. Bayesian based design of real-time sensor systems for high-risk indoor contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Sreedharan, Priya [Univ. of California, Berkeley, CA (United States)

    2007-01-01

    The sudden release of toxic contaminants that reach indoor spaces can be hazardous to building occupants. To respond effectively, the contaminant release must be quickly detected and characterized to determine unobserved parameters, such as release location and strength. Characterizing the release requires solving an inverse problem. Designing a robust real-time sensor system that solves the inverse problem is challenging because the fate and transport of contaminants is complex, sensor information is limited and imperfect, and real-time estimation is computationally constrained. This dissertation uses a system-level approach, based on a Bayes Monte Carlo framework, to develop sensor-system design concepts and methods. I describe three investigations that explore complex relationships among sensors, network architecture, interpretation algorithms, and system performance. The investigations use data obtained from tracer gas experiments conducted in a real building. The influence of individual sensor characteristics on the sensor-system performance for binary-type contaminant sensors is analyzed. Performance tradeoffs among sensor accuracy, threshold level and response time are identified; these attributes could not be inferred without a system-level analysis. For example, more accurate but slower sensors are found to outperform less accurate but faster sensors. Secondly, I investigate how the sensor-system performance can be understood in terms of contaminant transport processes and the model representation that is used to solve the inverse problem. The determination of release location and mass are shown to be related to and constrained by transport and mixing time scales. These time scales explain performance differences among different sensor networks. For example, the effect of longer sensor response times is comparably less for releases with longer mixing time scales. The third investigation explores how information fusion from heterogeneous sensors may improve the sensor

  1. Robust real-time change detection in high jitter.

    Energy Technology Data Exchange (ETDEWEB)

    Simonson, Katherine Mary; Ma, Tian J.

    2009-08-01

    A new method is introduced for real-time detection of transient change in scenes observed by staring sensors that are subject to platform jitter, pixel defects, variable focus, and other real-world challenges. The approach uses flexible statistical models for the scene background and its variability, which are continually updated to track gradual drift in the sensor's performance and the scene under observation. Two separate models represent temporal and spatial variations in pixel intensity. For the temporal model, each new frame is projected into a low-dimensional subspace designed to capture the behavior of the frame data over a recent observation window. Per-pixel temporal standard deviation estimates are based on projection residuals. The second approach employs a simple representation of jitter to generate pixelwise moment estimates from a single frame. These estimates rely on spatial characteristics of the scene, and are used gauge each pixel's susceptibility to jitter. The temporal model handles pixels that are naturally variable due to sensor noise or moving scene elements, along with jitter displacements comparable to those observed in the recent past. The spatial model captures jitter-induced changes that may not have been seen previously. Change is declared in pixels whose current values are inconsistent with both models.
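
    The temporal model described above (project each frame onto a low-dimensional subspace fitted to a recent window, then threshold per-pixel residuals against their recent standard deviation) can be sketched with an SVD; the window length, subspace rank and threshold factor below are assumptions, and the companion spatial jitter model is omitted.

```python
import numpy as np

def detect_change(window, new_frame, rank=3, k_sigma=5.0):
    """Flag pixels whose projection residual exceeds k_sigma * residual std.

    window    : (n_frames, n_pixels) recent frames, flattened
    new_frame : (n_pixels,) current frame, flattened
    """
    mean = window.mean(axis=0)
    X = window - mean
    # low-dimensional subspace capturing recent behaviour (incl. jitter seen so far)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:rank]
    # per-pixel residual standard deviation over the window
    resid_win = X - (X @ basis.T) @ basis
    sigma = resid_win.std(axis=0) + 1e-6
    d = new_frame - mean
    resid_new = d - basis.T @ (basis @ d)
    return np.abs(resid_new) > k_sigma * sigma

rng = np.random.default_rng(1)
window = rng.normal(100, 2, size=(30, 64 * 64))
frame = rng.normal(100, 2, size=64 * 64)
frame[2000] += 40                      # transient change in one pixel
print(np.flatnonzero(detect_change(window, frame)))
```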

  2. Detecting changes in real-time data: a user's guide to optimal detection.

    Science.gov (United States)

    Johnson, P; Moriarty, J; Peskir, G

    2017-08-13

    The real-time detection of changes in a noisily observed signal is an important problem in applied science and engineering. The study of parametric optimal detection theory began in the 1930s, motivated by applications in production and defence. Today this theory, which aims to minimize a given measure of detection delay under accuracy constraints, finds applications in domains including radar, sonar, seismic activity, global positioning, psychological testing, quality control, communications and power systems engineering. This paper reviews developments in optimal detection theory and sequential analysis, including sequential hypothesis testing and change-point detection, in both Bayesian and classical (non-Bayesian) settings. For clarity of exposition, we work in discrete time and provide a brief discussion of the continuous time setting, including recent developments using stochastic calculus. Different measures of detection delay are presented, together with the corresponding optimal solutions. We emphasize the important role of the signal-to-noise ratio and discuss both the underlying assumptions and some typical applications for each formulation.This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
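
    In the discrete-time Bayesian formulation surveyed here, the posterior probability that a change has already occurred admits a simple recursion (the Shiryaev statistic); the sketch below assumes a geometric prior on the change time and Gaussian pre-/post-change densities, all illustrative choices rather than any particular application from the review.

```python
import numpy as np
from scipy.stats import norm

def shiryaev(xs, rho=0.01, mu0=0.0, mu1=1.0, sigma=1.0, threshold=0.95):
    """Return the first time the posterior probability of a change >= threshold."""
    pi = 0.0
    for n, x in enumerate(xs):
        f0 = norm.pdf(x, mu0, sigma)       # pre-change density
        f1 = norm.pdf(x, mu1, sigma)       # post-change density
        num = (pi + (1 - pi) * rho) * f1
        den = num + (1 - pi) * (1 - rho) * f0
        pi = num / den                     # Shiryaev posterior update
        if pi >= threshold:
            return n, pi
    return None, pi

rng = np.random.default_rng(2)
xs = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])
print(shiryaev(xs))
```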

  3. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Directory of Open Access Journals (Sweden)

    Shunkun Yang

    2014-01-01

    Full Text Available Testing and verification of the interface between software components are particularly important due to the large number of complex interactions, which requires the traditional modeling languages to overcome the existing shortcomings in the aspects of temporal information description and software testing input controlling. This paper presents the real-time extended interface automata (RTEIA) which adds clearer and more detailed temporal information description by the application of time words. We also establish the input interface automaton for every input in order to solve the problems of input controlling and interface covering nimbly when applied in the software testing field. Detailed definitions of the RTEIA and the testing cases generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of one real aircraft braking system.

  4. 4D Near Real-Time Environmental Monitoring Using Highly Temporal LiDAR

    Science.gov (United States)

    Höfle, Bernhard; Canli, Ekrem; Schmitz, Evelyn; Crommelinck, Sophie; Hoffmeister, Dirk; Glade, Thomas

    2016-04-01

    The last decade has witnessed extensive applications of 3D environmental monitoring with the LiDAR technology, also referred to as laser scanning. Although several automatic methods were developed to extract environmental parameters from LiDAR point clouds, only little research has focused on highly multitemporal near real-time LiDAR (4D-LiDAR) for environmental monitoring. Large potential of applying 4D-LiDAR is given for landscape objects with high and varying rates of change (e.g. plant growth) and also for phenomena with sudden unpredictable changes (e.g. geomorphological processes). In this presentation we will report on the most recent findings of the research projects 4DEMON (http://uni-heidelberg.de/4demon) and NoeSLIDE (https://geomorph.univie.ac.at/forschung/projekte/aktuell/noeslide/). The method development in both projects is based on two real-world use cases: i) Surface parameter derivation of agricultural crops (e.g. crop height) and ii) change detection of landslides. Both projects exploit the "full history" contained in the LiDAR point cloud time series. One crucial initial step of 4D-LiDAR analysis is the co-registration over time, 3D-georeferencing and time-dependent quality assessment of the LiDAR point cloud time series. Due to the high amount of datasets (e.g. one full LiDAR scan per day), the procedure needs to be performed fully automatically. Furthermore, the online near real-time 4D monitoring system requires to set triggers that can detect removal or moving of tie reflectors (used for co-registration) or the scanner itself. This guarantees long-term data acquisition with high quality. We will present results from a georeferencing experiment for 4D-LiDAR monitoring, which performs benchmarking of co-registration, 3D-georeferencing and also fully automatic detection of events (e.g. removal/moving of reflectors or scanner). Secondly, we will show our empirical findings of an ongoing permanent LiDAR observation of a landslide (Gresten

  5. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks [Scargle 1998]-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
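
    For binned count data, the optimal block segmentation can be found with the O(N^2) dynamic programme the paper describes; the sketch below is a compact, independent re-implementation with a fixed per-block prior penalty (an assumption; the paper derives calibrated priors), not the authors' released code.

```python
import numpy as np

def bayesian_blocks_binned(counts, widths, prior=4.0):
    """Optimal change points for binned count data (piecewise-constant rate).

    counts : counts per bin; widths : bin widths; prior : per-block penalty.
    Returns the indices of bins that start a new block.
    """
    n = len(counts)
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        # candidate blocks ending at bin r, starting at bin j = 0..r
        ck = np.cumsum(counts[:r + 1][::-1])[::-1]      # counts in block j..r
        wk = np.cumsum(widths[:r + 1][::-1])[::-1]      # widths of block j..r
        with np.errstate(divide="ignore", invalid="ignore"):
            fit = np.where(ck > 0, ck * np.log(ck / wk), 0.0)
        total = fit - prior + np.concatenate(([0.0], best[:r]))
        last[r] = np.argmax(total)
        best[r] = total[last[r]]
    # backtrack the optimal partition
    change_points, r = [], n - 1
    while r >= 0:
        change_points.append(last[r])
        r = last[r] - 1
    return sorted(change_points)

rng = np.random.default_rng(3)
counts = np.concatenate([rng.poisson(2, 50), rng.poisson(10, 30), rng.poisson(2, 50)])
widths = np.ones_like(counts, dtype=float)
print(bayesian_blocks_binned(counts, widths))
```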

  6. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  7. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    International Nuclear Information System (INIS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  8. Self-consistent Bayesian analysis of space-time symmetry studies

    International Nuclear Information System (INIS)

    Davis, E.D.

    1996-01-01

    We introduce a Bayesian method for the analysis of epithermal neutron transmission data on space-time symmetries in which unique assignment of the prior is achieved by maximisation of the cross entropy and the imposition of a self-consistency criterion. Unlike the maximum likelihood method used in previous analyses of parity-violation data, our method is freed of an ad hoc cutoff parameter. Monte Carlo studies indicate that our self-consistent Bayesian analysis is superior to the maximum likelihood method when applied to the small data samples typical of symmetry studies. (orig.)

  9. Spatio Temporal EEG Source Imaging with the Hierarchical Bayesian Elastic Net and Elitist Lasso Models.

    Science.gov (United States)

    Paz-Linares, Deirel; Vega-Hernández, Mayrim; Rojas-López, Pedro A; Valdés-Hernández, Pedro A; Martínez-Montes, Eduardo; Valdés-Sosa, Pedro A

    2017-01-01

    The estimation of EEG generating sources constitutes an Inverse Problem (IP) in Neuroscience. This is an ill-posed problem due to the non-uniqueness of the solution, and regularization or prior information is needed to undertake Electrophysiology Source Imaging. Structured Sparsity priors can be attained through combinations of (L1 norm-based) and (L2 norm-based) constraints such as the Elastic Net (ENET) and Elitist Lasso (ELASSO) models. The former model is used to find solutions with a small number of smooth nonzero patches, while the latter imposes different degrees of sparsity simultaneously along different dimensions of the spatio-temporal matrix solutions. Both models have been addressed within the penalized regression approach, where the regularization parameters are selected heuristically, usually leading to non-optimal and computationally expensive solutions. The existing Bayesian formulation of ENET allows hyperparameter learning, but uses the computationally intensive Monte Carlo/Expectation Maximization methods, which makes its application to the EEG IP impractical. The ELASSO, in turn, has not previously been considered in a Bayesian context. In this work, we attempt to solve the EEG IP using a Bayesian framework for the ENET and ELASSO models. We propose a Structured Sparse Bayesian Learning algorithm based on combining the Empirical Bayes and the iterative coordinate descent procedures to estimate both the parameters and hyperparameters. Using realistic simulations and avoiding the inverse crime, we illustrate that our methods are able to recover complicated source setups more accurately, and with a more robust estimation of the hyperparameters and behavior under different sparsity scenarios, than classical LORETA, ENET and LASSO Fusion solutions. We also solve the EEG IP using data from a visual attention experiment, finding more interpretable neurophysiological patterns with our methods. The Matlab codes used in this work, including Simulations, Methods

  10. Bayesian Analysis for Dynamic Generalized Linear Latent Model with Application to Tree Survival Rate

    Directory of Open Access Journals (Sweden)

    Yu-sheng Cheng

    2014-01-01

    Full Text Available The logistic regression model is the most popular regression technique for modeling categorical data, especially dichotomous variables. The classic logistic regression model is typically used to interpret the relationship between response variables and explanatory variables. However, in real applications, most data sets are collected in follow-up studies, which leads to temporal correlation among the data. In order to characterize the correlations between the different variables, a new method based on latent variables is introduced in this study. At the same time, latent variables following an AR(1) model are used to depict time dependence. In the framework of Bayesian analysis, parameter estimates and statistical inferences are carried out via a Gibbs sampler with a Metropolis-Hastings (MH) algorithm. Model comparison, based on the Bayes factor, and forecasting/smoothing of the survival rate of the tree are established. A simulation study is conducted to assess the performance of the proposed method, and a pika data set is analyzed to illustrate the real application. Since Bayes factor approaches vary significantly, efficiency tests have been performed in order to decide which solution provides a better tool for the analysis of real relational data sets.
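
    A generative sketch of the model class described, a logistic (survival) regression whose linear predictor carries an AR(1) latent process for temporal dependence, is given below; the coefficients, autocorrelation and noise scale are illustrative values, and the Gibbs/Metropolis-Hastings estimation step is not shown.

```python
import numpy as np

rng = np.random.default_rng(7)
T = 100
beta = np.array([0.5, -1.0])        # illustrative regression coefficients
rho, tau = 0.8, 0.3                 # AR(1) autocorrelation and innovation scale

X = np.column_stack([np.ones(T), rng.normal(size=T)])   # intercept + covariate
z = np.zeros(T)                                          # AR(1) latent process
for t in range(1, T):
    z[t] = rho * z[t - 1] + rng.normal(scale=tau)

p = 1.0 / (1.0 + np.exp(-(X @ beta + z)))                # survival probability
y = rng.binomial(1, p)                                   # observed outcomes
print(p[:5], y[:5])
```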

  11. Bayesian Analysis of Bubbles in Asset Prices

    Directory of Open Access Journals (Sweden)

    Andras Fulop

    2017-10-01

    Full Text Available We develop a new model where the dynamic structure of the asset price, after the fundamental value is removed, is subject to two different regimes. One regime reflects the normal period where the asset price divided by the dividend is assumed to follow a mean-reverting process around a stochastic long run mean. The second regime reflects the bubble period with explosive behavior. Stochastic switches between two regimes and non-constant probabilities of exit from the bubble regime are both allowed. A Bayesian learning approach is employed to jointly estimate the latent states and the model parameters in real time. An important feature of our Bayesian method is that we are able to deal with parameter uncertainty and at the same time, to learn about the states and the parameters sequentially, allowing for real time model analysis. This feature is particularly useful for market surveillance. Analysis using simulated data reveals that our method has good power properties for detecting bubbles. Empirical analysis using price-dividend ratios of S&P500 highlights the advantages of our method.

  12. Perceptual Real-Time 2D-to-3D Conversion Using Cue Fusion.

    Science.gov (United States)

    Leimkuhler, Thomas; Kellnhofer, Petr; Ritschel, Tobias; Myszkowski, Karol; Seidel, Hans-Peter

    2018-06-01

    We propose a system to infer binocular disparity from a monocular video stream in real time. Different from classic reconstruction of physical depth in computer vision, we compute perceptually plausible disparity, which is numerically inaccurate but results in a very similar overall depth impression with plausible overall layout, sharp edges, fine details and agreement between luminance and disparity. We use several simple monocular cues to estimate disparity maps and confidence maps of low spatial and temporal resolution in real time. These are complemented by spatially-varying, appearance-dependent and class-specific disparity prior maps, learned from example stereo images. Scene classification selects this prior at runtime. Fusion of prior and cues is done by means of robust MAP inference on a dense spatio-temporal conditional random field with high spatial and temporal resolution. Using normal distributions allows this in constant-time, parallel per-pixel work. We compare our approach to previous 2D-to-3D conversion systems in terms of different metrics, as well as a user study, and validate our notion of perceptually plausible disparity.
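
    When the cue estimates and the prior are carried as per-pixel normal distributions, fusing them reduces to precision weighting, which is what makes constant-time per-pixel work possible; the sketch below illustrates only that step (the dense CRF and scene classification are omitted), with hypothetical cue names and toy values.

```python
import numpy as np

def fuse_gaussian_cues(means, sigmas):
    """Precision-weighted fusion of per-pixel Gaussian disparity estimates.

    means, sigmas : arrays of shape (n_cues, H, W)
    Returns the fused mean and fused standard deviation per pixel.
    """
    precision = 1.0 / sigmas ** 2
    fused_precision = precision.sum(axis=0)
    fused_mean = (precision * means).sum(axis=0) / fused_precision
    return fused_mean, 1.0 / np.sqrt(fused_precision)

# three hypothetical cues (e.g. motion, defocus, learned prior) on a 2x2 image
means = np.array([[[1.0, 2.0], [3.0, 4.0]],
                  [[1.2, 1.8], [2.5, 4.5]],
                  [[0.8, 2.2], [3.5, 3.8]]])
sigmas = np.array([[[0.5, 0.5], [0.5, 0.5]],
                   [[1.0, 1.0], [0.2, 1.0]],
                   [[2.0, 0.3], [1.0, 2.0]]])
mu, sd = fuse_gaussian_cues(means, sigmas)
print(mu, sd, sep="\n")
```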

  13. Timing Analysis of Mixed-Criticality Hard Real-Time Applications Implemented on Distributed Partitioned Architectures

    DEFF Research Database (Denmark)

    Marinescu, Sorin Ovidiu; Tamas-Selicean, Domitian; Acretoaie, Vlad

    In this paper we are interested in the timing analysis of mixed-criticality embedded real-time applications mapped on distributed heterogeneous architectures. Mixed-criticality tasks can be integrated onto the same architecture only if there is enough spatial and temporal separation among them. We consider that the separation is provided by partitioning, such that applications run in separate partitions, and each partition is allocated several time slots on a processor. Each partition can have its own scheduling policy. We are interested in determining the worst-case response times of tasks scheduled in partitions using fixed-priority preemptive scheduling. We have extended the state-of-the-art algorithms for schedulability analysis to take into account the partitions. The proposed algorithm has been evaluated using several synthetic and real-life benchmarks.

  14. Arrow-bot: A Teaching Tool for Real-Time Embedded System Course

    Directory of Open Access Journals (Sweden)

    Zakaria Mohamad Fauzi

    2017-01-01

    Full Text Available This paper presents the design of a line-following Arduino-based mobile robot for the Real-Time Embedded System course at Universiti Tun Hussein Onn Malaysia. The real-time system (RTS) concept implemented is based on rate monotonic scheduling (RMS) on an ATmega328P microcontroller. Three infrared line sensors were used as input for controlling two direct current (DC) motors. The RTS software was programmed in the Arduino IDE and relies on the ChibiOS/RT real-time operating system (RTOS) library. Three independent software tasks were created to test real-time scheduling capability, and the resulting temporal scope data were collected. The microcontroller succeeded in handling multiple tasks without missing their deadlines. This implementation of an RTOS in an embedded system for mobile robotics is hoped to increase students' understanding and learning capability.

  15. Bayesian spatio-temporal modeling of Schistosoma japonicum prevalence data in the absence of a diagnostic 'gold' standard.

    Science.gov (United States)

    Wang, Xian-Hong; Zhou, Xiao-Nong; Vounatsou, Penelope; Chen, Zhao; Utzinger, Jürg; Yang, Kun; Steinmann, Peter; Wu, Xiao-Hua

    2008-06-11

    Spatial modeling is increasingly utilized to elucidate relationships between demographic, environmental, and socioeconomic factors, and infectious disease prevalence data. However, there is a paucity of studies focusing on spatio-temporal modeling that take into account the uncertainty of diagnostic techniques. We obtained Schistosoma japonicum prevalence data, based on a standardized indirect hemagglutination assay (IHA), from annual reports from 114 schistosome-endemic villages in Dangtu County, southeastern part of the People's Republic of China, for the period 1995 to 2004. Environmental data were extracted from satellite images. Socioeconomic data were available from village registries. We used Bayesian spatio-temporal models, accounting for the sensitivity and specificity of the IHA test via an equation derived from the law of total probability, to relate the observed with the 'true' prevalence. The risk of S. japonicum was positively associated with the mean land surface temperature, and negatively correlated with the mean normalized difference vegetation index and distance to the nearest water body. There was no significant association between S. japonicum and socioeconomic status of the villages surveyed. The spatial correlation structures of the observed S. japonicum seroprevalence and the estimated infection prevalence differed from one year to another. Variance estimates based on a model adjusted for the diagnostic error were larger than unadjusted models. The generated prediction map for 2005 showed that most of the former and current infections occur in close proximity to the Yangtze River. Bayesian spatial-temporal modeling incorporating diagnostic uncertainty is a suitable approach for risk mapping S. japonicum prevalence data. The Yangtze River and its tributaries govern schistosomiasis transmission in Dangtu County, but spatial correlation needs to be taken into consideration when making risk prediction at small scales.
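
    The law-of-total-probability relation used to link observed and 'true' prevalence, together with its inversion (the Rogan-Gladen estimator), can be written in a few lines; the sensitivity and specificity values below are placeholders, not the IHA characteristics estimated by the study.

```python
def observed_prevalence(true_prev, sensitivity, specificity):
    """P(test+) = se * p + (1 - sp) * (1 - p), by the law of total probability."""
    return sensitivity * true_prev + (1.0 - specificity) * (1.0 - true_prev)

def true_prevalence(obs_prev, sensitivity, specificity):
    """Rogan-Gladen inversion of the relation above (clamped to [0, 1])."""
    p = (obs_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(p, 0.0), 1.0)

se, sp = 0.90, 0.95          # placeholder test characteristics, not fitted values
print(observed_prevalence(0.10, se, sp))   # apparent seroprevalence
print(true_prevalence(0.135, se, sp))      # back-calculated infection prevalence
```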

  16. Runtime verification of embedded real-time systems.

    Science.gov (United States)

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.

  17. Retrospective Reconstruction of High Temporal Resolution Cine Images from Real-Time MRI using Iterative Motion Correction

    DEFF Research Database (Denmark)

    Hansen, Michael Schacht; Sørensen, Thomas Sangild; Arai, Andrew

    2012-01-01

    ... and motion correction based on nonrigid registration and can be applied to arbitrary k-space trajectories. The method is demonstrated with real-time Cartesian imaging and Golden Angle radial acquisitions, and the motion-corrected acquisitions are compared with raw real-time images and breath-hold cine acquisitions in 10 (N = 10) subjects. Acceptable image quality was obtained in all motion-corrected reconstructions, and the resulting mean image quality score was (a) Cartesian real-time: 2.48, (b) Golden Angle real-time: 1.90 (1.00–2.50), (c) Cartesian motion correction: 3.92, (d) Radial motion correction: 4...

  18. Bayesian modelling to estimate the test characteristics of coprology, coproantigen ELISA and a novel real-time PCR for the diagnosis of taeniasis.

    Science.gov (United States)

    Praet, Nicolas; Verweij, Jaco J; Mwape, Kabemba E; Phiri, Isaac K; Muma, John B; Zulu, Gideon; van Lieshout, Lisette; Rodriguez-Hidalgo, Richar; Benitez-Ortiz, Washington; Dorny, Pierre; Gabriël, Sarah

    2013-05-01

    To estimate and compare the performances of coprology, copro-Ag ELISA and a real-time polymerase chain reaction assay (copro-PCR) for detection of Taenia solium tapeworm carriers. The three diagnostic tests were applied to 817 stool samples collected in two Zambian communities where taeniasis is endemic. A Bayesian approach was used to allow estimation of the test characteristics. Two (0.2%; 95% Confidence Interval (CI): 0-0.8), 67 (8.2%; 95% CI: 6.4-10.3) and 10 (1.2%; 95% CI: 0.5-2.2) samples were positive using coprology, copro-Ag ELISA and copro-PCR, respectively. Specificities of 99.9%, 92.0% and 99.0% were determined for coprology, copro-Ag ELISA and copro-PCR, respectively. Sensitivities of 52.5%, 84.5% and 82.7% were determined for coprology, copro-Ag ELISA and copro-PCR, respectively. We call for additional studies exploring possible cross-reactions of the copro-Ag ELISA and for the use of more sensitive tests, such as copro-PCR, for the detection of tapeworm carriers, which is a key factor in controlling the parasite in endemic areas. © 2013 Blackwell Publishing Ltd.

  19. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model...... for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities...... consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled....

  20. Evidence on a Real Business Cycle Model with Neutral and Investment-Specific Technology Shocks using Bayesian Model Averaging

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2010-01-01

    textabstractThe empirical support for a real business cycle model with two technology shocks is evaluated using a Bayesian model averaging procedure. This procedure makes use of a finite mixture of many models within the class of vector autoregressive (VAR) processes. The linear VAR model is

  1. Real-Time EEG-Based Happiness Detection System

    Directory of Open Access Journals (Sweden)

    Noppadon Jatupaiboon

    2013-01-01

    Full Text Available We propose to use real-time EEG signals to classify happy and unhappy emotions elicited by pictures and classical music. We use PSD as a feature and SVM as a classifier. The average accuracies of the subject-dependent model and the subject-independent model are approximately 75.62% and 65.12%, respectively. Considering each pair of channels, the temporal pair of channels (T7 and T8) gives better results than the other areas. Considering different frequency bands, the high-frequency bands (Beta and Gamma) give better results than the low-frequency bands. Considering different time durations for emotion elicitation, the result from 30 seconds does not differ significantly from the result from 60 seconds. Based on all of these results, we implement a real-time EEG-based happiness detection system using only one pair of channels. Furthermore, we develop games based on the happiness detection system to help users recognize and control their happiness.
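
    The PSD-plus-SVM pipeline can be sketched with scipy and scikit-learn on synthetic two-channel epochs standing in for T7/T8; the sampling rate, band limits and signal model below are assumptions for demonstration, and the snippet is not the authors' system.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

fs = 128                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(4)

def band_power_features(epoch, bands=((13, 30), (30, 45))):
    """Mean PSD in Beta and Gamma bands for a (n_channels, n_samples) epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    return np.hstack([psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
                      for lo, hi in bands])

# synthetic stand-in for T7/T8 epochs of two emotion classes (5 s each)
X = np.array([band_power_features(
                  rng.normal(scale=1.0 + 0.3 * label, size=(2, fs * 5)))
              for label in (0, 1) for _ in range(40)])
y = np.repeat([0, 1], 40)

clf = SVC(kernel="rbf", C=1.0)
print(cross_val_score(clf, X, y, cv=5).mean())
```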

  2. Application of SCM with Bayesian B-Spline to Spatio-Temporal Analysis of Hypertension in China.

    Science.gov (United States)

    Ye, Zirong; Xu, Li; Zhou, Zi; Wu, Yafei; Fang, Ya

    2018-01-02

    Most previous research on the disparities of hypertension risk has neither simultaneously explored the spatio-temporal disparities nor considered the spatial information contained in the samples, thus the estimated results may be unreliable. Our study was based on the China Health and Nutrition Survey (CHNS), including residents over 12 years old in seven provinces from 1991 to 2011. Bayesian B-spline was used in the extended shared component model (SCM) for fitting temporal-related variation to explore spatio-temporal distribution in the odds ratio (OR) of hypertension, reveal gender variation, and explore latent risk factors. Our results revealed that the prevalence of hypertension increased from 14.09% in 1991 to 32.37% in 2011, with men experiencing a more obvious change than women. From a spatial perspective, a standardized prevalence ratio (SPR) remaining at a high level was found in Henan and Shandong for both men and women. Meanwhile, before 1997, the temporal distribution of hypertension risk for both men and women remained low. After that, notably since 2004, the OR of hypertension in each province increased to a relatively high level, especially in Northern China. Notably, the OR of hypertension in Shandong and Jiangsu, which was over 1.2, continuously stood out after 2004 for males, while that in Shandong and Guangxi was relatively high for females. The findings suggested that obvious spatial-temporal patterns for hypertension exist in the regions under research and this pattern was quite different between men and women.

  3. Real Time Revisited

    Science.gov (United States)

    Allen, Phillip G.

    1985-12-01

    The call for abolishing photo reconnaissance in favor of real time is once more being heard. Ten years ago the same cries were being heard with the introduction of the Charge Coupled Device (CCD). The real time system problems that existed then and stopped real time proliferation have not been solved. The lack of an organized program by either DoD or industry has hampered any efforts to solve the problems, and as such, very little has happened in real time in the last ten years. Real time is not a replacement for photo, just as photo is not a replacement for infra-red or radar. Operational real time sensors can be designed only after their role has been defined and improvements made to the weak links in the system. Plodding ahead on a real time reconnaissance suite without benefit of evaluation of utility will allow this same paper to be used ten years from now.

  4. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with

  5. Bayesian Uncertainty Quantification for Subsurface Inversion Using a Multiscale Hierarchical Model

    KAUST Repository

    Mondal, Anirban

    2014-07-03

    We consider a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field (spatial or temporal). The Bayesian approach contains a natural mechanism for regularization in the form of prior information, can incorporate information from heterogeneous sources and provide a quantitative assessment of uncertainty in the inverse solution. The Bayesian setting casts the inverse solution as a posterior probability distribution over the model parameters. The Karhunen-Loeve expansion is used for dimension reduction of the random field. Furthermore, we use a hierarchical Bayes model to inject multiscale data into the modeling framework. In this Bayesian framework, we show that this inverse problem is well-posed by proving that the posterior measure is Lipschitz continuous with respect to the data in the total variation norm. Computational challenges in this construction arise from the need for repeated evaluations of the forward model (e.g., in the context of MCMC) and are compounded by the high dimensionality of the posterior. We develop a two-stage reversible jump MCMC that has the ability to screen out bad proposals in the first, inexpensive stage. Numerical results are presented by analyzing simulated as well as real data from a hydrocarbon reservoir. This article has supplementary material available online. © 2014 American Statistical Association and the American Society for Quality.
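
    The Karhunen-Loeve dimension reduction mentioned above can be illustrated on a one-dimensional grid. The exponential covariance, grid, and truncation level below are assumptions for the sketch, not choices from the paper.

```python
# Sketch: truncated Karhunen-Loeve expansion of a 1-D Gaussian random field.
# Exponential covariance and truncation level are illustrative assumptions.
import numpy as np

n, corr_len, n_terms = 200, 0.2, 10
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # covariance matrix

eigval, eigvec = np.linalg.eigh(C)                         # ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]             # sort descending

# A field realization is approximated by the leading n_terms modes:
xi = np.random.default_rng(1).standard_normal(n_terms)     # KL coefficients
field = eigvec[:, :n_terms] @ (np.sqrt(eigval[:n_terms]) * xi)

print("energy captured by the truncation:", eigval[:n_terms].sum() / eigval.sum())
```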

  6. Real-Time Corrected Traffic Correlation Model for Traffic Flow Forecasting

    Directory of Open Access Journals (Sweden)

    Hua-pu Lu

    2015-01-01

    Full Text Available This paper focuses on the problem of short-term traffic flow forecasting. The main goal is to put forward a traffic correlation model and a real-time correction algorithm for traffic flow forecasting. The traffic correlation model is established based on the temporal-spatial-historical correlation characteristics of traffic big data. In order to simplify the traffic correlation model, this paper presents a correction coefficient optimization algorithm. Considering the multistate characteristic of traffic big data, a dynamic part is added to the traffic correlation model. A real-time correction algorithm based on a Fuzzy Neural Network is presented to overcome the nonlinear mapping problem. A case study based on a real-world road network in Beijing, China, is implemented to test the efficiency and applicability of the proposed modeling methods.

  7. How about a Bayesian M/EEG imaging method correcting for incomplete spatio-temporal priors

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Attias, Hagai T.; Sekihara, Kensuke

    2013-01-01

    previous spatio-temporal inverse M/EEG models, the proposed model benefits from consisting of two source terms, namely, a spatio-temporal pattern term limiting the source configuration to a spatio-temporal subspace and a source correcting term to pick up source activity not covered by the spatio-temporal prior belief. We have tested the model on both artificial data and real EEG data in order to demonstrate the efficacy of the model. The model was tested at different SNRs (-10.0,-5.2, -3.0, -1.0, 0, 0.8, 3.0 dB) using white noise. At all SNRs the sAquavit performs best in AUC measure, e.g. at SNR=0d...

  8. A Markov Chain Monte Carlo version of the genetic algorithm Differential Evolution: easy Bayesian computing for real parameter spaces

    NARCIS (Netherlands)

    Braak, ter C.J.F.

    2006-01-01

    Differential Evolution (DE) is a simple genetic algorithm for numerical optimization in real parameter spaces. In a statistical context one would not just want the optimum but also its uncertainty. The uncertainty distribution can be obtained by a Bayesian analysis (after specifying prior and
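
    The core of the DE proposal in an MCMC setting — jump a chain along the difference of two randomly chosen other chains, then accept or reject by a Metropolis rule — can be sketched as follows. The toy Gaussian target, population size, and jitter are illustrative assumptions.

```python
# Sketch: Differential Evolution MCMC (DE-MC) on a toy 2-D Gaussian target.
# Target, population size, and jitter are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(2)
d, n_chains, n_iter = 2, 10, 2000
gamma = 2.38 / np.sqrt(2 * d)                 # commonly used scaling 2.38 / sqrt(2d)

def log_post(theta):                          # toy target: standard 2-D Gaussian
    return -0.5 * np.sum(theta ** 2)

pop = rng.normal(0, 5, size=(n_chains, d))    # initial population of chains
samples = []
for _ in range(n_iter):
    for i in range(n_chains):
        r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
        prop = pop[i] + gamma * (pop[r1] - pop[r2]) + rng.normal(0, 1e-4, d)
        if np.log(rng.uniform()) < log_post(prop) - log_post(pop[i]):
            pop[i] = prop                     # Metropolis acceptance
    samples.append(pop.copy())

# Discard the first half as burn-in and report the posterior mean estimate.
print(np.concatenate(samples[n_iter // 2:]).mean(axis=0))
```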

  9. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.

  10. The economic analysis of power market architectures: application to real-time market design

    International Nuclear Information System (INIS)

    Saguan, M.

    2007-04-01

    This work contributes to the economic analysis of power market architectures. A modular framework is used to separate problems of market design into different modules. The work's goal is to study real-time market design. A two-stage market equilibrium model is used to analyse the two main real-time designs: the 'market' and the 'mechanism' (with penalty). Numerical simulations show that the design applied in real time is not neutral with respect to the sequence of energy markets and the dynamics of competition. Designs using penalties (mechanisms) cause distortions and inefficiencies and can create barriers to entry. The size of the distortions depends on the temporal position of the gate closure of the forward markets. This model has also allowed us to show the key role of real-time integration between zones and the importance of good harmonization between the real-time designs of each zone. (author)

  11. Connectivity-based neurofeedback: Dynamic causal modeling for real-time fMRI☆

    Science.gov (United States)

    Koush, Yury; Rosa, Maria Joao; Robineau, Fabien; Heinen, Klaartje; W. Rieger, Sebastian; Weiskopf, Nikolaus; Vuilleumier, Patrik; Van De Ville, Dimitri; Scharnowski, Frank

    2013-01-01

    Neurofeedback based on real-time fMRI is an emerging technique that can be used to train voluntary control of brain activity. Such brain training has been shown to lead to behavioral effects that are specific to the functional role of the targeted brain area. However, real-time fMRI-based neurofeedback has so far been limited mainly to training localized brain activity within a region of interest. Here, we overcome this limitation by presenting near real-time dynamic causal modeling in order to provide feedback information based on connectivity between brain areas rather than activity within a single brain area. Using a visual–spatial attention paradigm, we show that participants can voluntarily control a feedback signal that is based on the Bayesian model comparison between two predefined model alternatives, i.e. the connectivity between left visual cortex and left parietal cortex vs. the connectivity between right visual cortex and right parietal cortex. Our new approach thus allows for training voluntary control over specific functional brain networks. Because most mental functions and most neurological disorders are associated with network activity rather than with activity in a single brain region, this novel approach is an important methodological innovation that allows functionally relevant brain networks to be targeted more directly. PMID:23668967

  12. Bayesian spatio-temporal modeling of Schistosoma japonicum prevalence data in the absence of a diagnostic 'gold' standard.

    Directory of Open Access Journals (Sweden)

    Xian-Hong Wang

    Full Text Available BACKGROUND: Spatial modeling is increasingly utilized to elucidate relationships between demographic, environmental, and socioeconomic factors, and infectious disease prevalence data. However, there is a paucity of studies focusing on spatio-temporal modeling that take into account the uncertainty of diagnostic techniques. METHODOLOGY/PRINCIPAL FINDINGS: We obtained Schistosoma japonicum prevalence data, based on a standardized indirect hemagglutination assay (IHA, from annual reports from 114 schistosome-endemic villages in Dangtu County, southeastern part of the People's Republic of China, for the period 1995 to 2004. Environmental data were extracted from satellite images. Socioeconomic data were available from village registries. We used Bayesian spatio-temporal models, accounting for the sensitivity and specificity of the IHA test via an equation derived from the law of total probability, to relate the observed with the 'true' prevalence. The risk of S. japonicum was positively associated with the mean land surface temperature, and negatively correlated with the mean normalized difference vegetation index and distance to the nearest water body. There was no significant association between S. japonicum and socioeconomic status of the villages surveyed. The spatial correlation structures of the observed S. japonicum seroprevalence and the estimated infection prevalence differed from one year to another. Variance estimates based on a model adjusted for the diagnostic error were larger than unadjusted models. The generated prediction map for 2005 showed that most of the former and current infections occur in close proximity to the Yangtze River. CONCLUSION/SIGNIFICANCE: Bayesian spatial-temporal modeling incorporating diagnostic uncertainty is a suitable approach for risk mapping S. japonicum prevalence data. The Yangtze River and its tributaries govern schistosomiasis transmission in Dangtu County, but spatial correlation needs to be taken
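
    The law-of-total-probability relation used above to link observed and 'true' prevalence through test sensitivity and specificity can be written as a small helper; the numerical values below are placeholders, not the study's estimates of the IHA test characteristics.

```python
# Sketch: relating observed prevalence to 'true' prevalence via sensitivity/specificity.
# p_obs = se * p_true + (1 - sp) * (1 - p_true); values are placeholders.
def observed_prevalence(p_true, se, sp):
    return se * p_true + (1.0 - sp) * (1.0 - p_true)

def true_prevalence(p_obs, se, sp):
    # Inversion of the relation above (valid when se + sp > 1).
    return (p_obs + sp - 1.0) / (se + sp - 1.0)

p_obs = observed_prevalence(p_true=0.10, se=0.90, sp=0.95)
print(p_obs, true_prevalence(p_obs, se=0.90, sp=0.95))   # recovers 0.10
```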

  13. Formal Specification and Verification of Real-Time Multi-Agent Systems using Timed-Arc Petri Nets

    Directory of Open Access Journals (Sweden)

    QASIM, A.

    2015-08-01

    Full Text Available In this study we have formally specified and verified the actions of communicating real-time software agents (RTAgents). Software agents are expected to work autonomously and deal with unfamiliar situations astutely. Achieving one hundred percent test-case coverage for these agents has always been a problem due to limited resources. Also, a high degree of dependability and predictability is expected from real-time software agents. In this research we have used Timed-Arc Petri Nets for formal specification and verification. Formal specification of e-agents has been done in the past using Linear Temporal Logic (LTL), but we believe that Timed-Arc Petri Nets, being more visually expressive, provide a richer framework for such formalism. A case study of a Stock Market System (SMS) based on the Real Time Multi Agent System framework (RTMAS) using Timed-Arc Petri Nets is taken to illustrate the proposed modeling approach. The model was verified using the AF, AG, EG, and EF fragments of Timed Computational Tree Logic (TCTL) via translations to timed automata.

  14. Iterative Bayesian Estimation of Travel Times on Urban Arterials: Fusing Loop Detector and Probe Vehicle Data.

    Science.gov (United States)

    Liu, Kai; Cui, Meng-Ying; Cao, Peng; Wang, Jiang-Bo

    2016-01-01

    On urban arterials, travel time estimation is challenging, especially when various data sources are involved. In particular, fusing loop detector data and probe vehicle data to estimate travel time is troublesome because the data can be uncertain, imprecise, and even conflicting. In this paper, we propose an improved data fusion methodology for link travel time estimation. Link travel times are first pre-estimated separately from loop detector data and probe vehicle data, and Bayesian fusion is then applied to fuse the estimated travel times. Next, iterative Bayesian estimation is proposed to improve the Bayesian fusion by incorporating two strategies: 1) a substitution strategy, which replaces the less accurate travel time estimate from one sensor with the current fused travel time; and 2) specially designed convergence conditions, which restrict the estimated travel time to a reasonable range. The estimation results show that the proposed method outperforms the probe vehicle data based method, the loop detector based method, and single Bayesian fusion, and the mean absolute percentage error is reduced to 4.8%. Additionally, iterative Bayesian estimation performs better for lighter traffic flows, when the variability of travel time is practically higher than in other periods.
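
    Treating the two pre-estimates as Gaussian, the Bayesian fusion step reduces to a precision-weighted average. The sketch below, with assumed means and variances, also illustrates the substitution and range-restriction ideas in simplified form; it is not the paper's calibrated procedure.

```python
# Sketch: precision-weighted Bayesian fusion of two link travel-time pre-estimates,
# plus simplified substitution and range-check steps. All values are assumed.
import numpy as np

def fuse(mean_a, var_a, mean_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    var = 1.0 / (w_a + w_b)
    return var * (w_a * mean_a + w_b * mean_b), var

t_loop, var_loop = 62.0, 9.0       # loop-detector pre-estimate (s) and variance
t_probe, var_probe = 55.0, 4.0     # probe-vehicle pre-estimate (s) and variance

t_fused, var_fused = fuse(t_loop, var_loop, t_probe, var_probe)

# Substitution strategy: the less accurate source is replaced by the fused value
# before the next fusion round; the result is kept within a plausible range.
if var_loop > var_probe:
    t_loop, var_loop = t_fused, var_fused
else:
    t_probe, var_probe = t_fused, var_fused
t_fused, _ = fuse(t_loop, var_loop, t_probe, var_probe)
t_fused = float(np.clip(t_fused, 30.0, 120.0))
print(round(t_fused, 1))
```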

  15. Comparison of different strategies for using fossil calibrations to generate the time prior in Bayesian molecular clock dating.

    Science.gov (United States)

    Barba-Montoya, Jose; Dos Reis, Mario; Yang, Ziheng

    2017-09-01

    Fossil calibrations are the utmost source of information for resolving the distances between molecular sequences into estimates of absolute times and absolute rates in molecular clock dating analysis. The quality of calibrations is thus expected to have a major impact on divergence time estimates even if a huge amount of molecular data is available. In Bayesian molecular clock dating, fossil calibration information is incorporated in the analysis through the prior on divergence times (the time prior). Here, we evaluate three strategies for converting fossil calibrations (in the form of minimum- and maximum-age bounds) into the prior on times, which differ according to whether they borrow information from the maximum age of ancestral nodes and minimum age of descendent nodes to form constraints for any given node on the phylogeny. We study a simple example that is analytically tractable, and analyze two real datasets (one of 10 primate species and another of 48 seed plant species) using three Bayesian dating programs: MCMCTree, MrBayes and BEAST2. We examine how different calibration strategies, the birth-death process, and automatic truncation (to enforce the constraint that ancestral nodes are older than descendent nodes) interact to determine the time prior. In general, truncation has a great impact on calibrations so that the effective priors on the calibration node ages after the truncation can be very different from the user-specified calibration densities. The different strategies for generating the effective prior also had considerable impact, leading to very different marginal effective priors. Arbitrary parameters used to implement minimum-bound calibrations were found to have a strong impact upon the prior and posterior of the divergence times. Our results highlight the importance of inspecting the joint time prior used by the dating program before any Bayesian dating analysis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Recognition of Action as a Bayesian Parameter Estimation Problem over Time

    DEFF Research Database (Denmark)

    Krüger, Volker

    2007-01-01

    In this paper we will discuss two problems related to action recognition: The first problem is the one of identifying in a surveillance scenario whether a person is walking or running and in what rough direction. The second problem is concerned with the recovery of action primitives from observed complex actions. Both problems will be discussed within a statistical framework. Bayesian propagation over time offers a framework to treat likelihood observations at each time step and the dynamics between the time steps in a unified manner. The first problem will be approached as a pattern recognition...... of the Bayesian framework for action recognition and round up our discussion....

  17. Improved operative efficiency using a real-time MRI-guided stereotactic platform for laser amygdalohippocampotomy.

    Science.gov (United States)

    Ho, Allen L; Sussman, Eric S; Pendharkar, Arjun V; Le, Scheherazade; Mantovani, Alessandra; Keebaugh, Alaine C; Drover, David R; Grant, Gerald A; Wintermark, Max; Halpern, Casey H

    2018-04-01

    OBJECTIVE MR-guided laser interstitial thermal therapy (MRgLITT) is a minimally invasive method for thermal destruction of benign or malignant tissue that has been used for selective amygdalohippocampal ablation for the treatment of temporal lobe epilepsy. The authors report their initial experience adopting a real-time MRI-guided stereotactic platform that allows for completion of the entire procedure in the MRI suite. METHODS Between October 2014 and May 2016, 17 patients with mesial temporal sclerosis were selected by a multidisciplinary epilepsy board to undergo a selective amygdalohippocampal ablation for temporal lobe epilepsy using MRgLITT. The first 9 patients underwent standard laser ablation in 2 phases (operating room [OR] and MRI suite), whereas the next 8 patients underwent laser ablation entirely in the MRI suite with the ClearPoint platform. A checklist specific to the real-time MRI-guided laser amygdalohippocampal ablation was developed and used for each case. For both cohorts, clinical and operative information, including average case times and accuracy data, was collected and analyzed. RESULTS There was a learning curve associated with using this real-time MRI-guided system. However, operative times decreased in a linear fashion, as did total anesthesia time. In fact, the total mean patient procedure time was less in the MRI cohort (362.8 ± 86.6 minutes) than in the OR cohort (456.9 ± 80.7 minutes). The mean anesthesia time was significantly shorter in the MRI cohort (327.2 ± 79.9 minutes) than in the OR cohort (435.8 ± 78.4 minutes, p = 0.02). CONCLUSIONS The real-time MRI platform for MRgLITT can be adopted in an expedient manner. Completion of MRgLITT entirely in the MRI suite may lead to significant advantages in procedural times.

  18. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    Science.gov (United States)

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. According to the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulas, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.

  19. Visualization of swallowing using real-time TrueFISP MR fluoroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Barkhausen, Joerg; Goyen, Mathias; Lauenstein, Thomas; Debatin, Joerg F. [Department of Diagnostic Radiology, University Hospital Essen (Germany); Winterfeld, F. von; Arweiler-Harbeck, Diana [Department of Otorhinolaryngology, University Hospital Essen (Germany)

    2002-01-01

    The aim of this study was to evaluate different real-time true fast imaging with steady precession (TrueFISP) sequences regarding their ability to depict the swallowing process and delineate oropharyngeal pathologies in patients with dysphagia. Real-time TrueFISP visualization of swallowing was performed in 8 volunteers and 6 patients with dysphagia using a 1.5 T scanner (Magnetom Sonata, Siemens, Erlangen, Germany) equipped with high-performance gradients (amplitude 40 mT/m). Image quality of four different real-time TrueFISP sequences (TR 2.2-3.0 ms, TE 1.1-1.5 ms, matrix 63 x 128-135 x 256, field of view 250 mm², acquisition time per image 139-405 ms) was evaluated. Water, yoghurt, and semolina pudding were assessed as oral contrast agents. Functional exploration of the oropharyngeal apparatus was best possible using the fastest real-time TrueFISP sequence (TR 2.2 ms, TE 1.1 ms, matrix 63 x 128). Increased acquisition time resulted in blurring of anatomical structures. As the image contrast of TrueFISP sequences depends on T2/T1 properties, all tested foodstuffs were well suited as oral contrast agents, but image quality was best using semolina pudding. Real-time visualization of swallowing is possible using real-time TrueFISP sequences in conjunction with oral contrast agents. For the functional exploration of swallowing, high temporal resolution is more crucial than spatial resolution. (orig.)

  20. RealCalc : a real time Java calculation tool. Application to HVSR estimation

    Science.gov (United States)

    Hloupis, G.; Vallianatos, F.

    2009-04-01

    The Java computing platform is not a newcomer in the seismology field. It is mainly used for applications for collecting, requesting, distributing and visualizing seismological data, because it is productive, safe and has low maintenance costs. Although it has very attractive characteristics for engineers, Java has not been used frequently in real-time applications, where prediction and reliability are required as a reaction to real-world events. The main reasons for this are the absence of priority support (such as priority ceiling or priority inversion) and the use of automated memory management (the garbage collector). To overcome these problems a number of extensions have been proposed, with the Real Time Specification for Java (RTSJ) being the most promising and widely used one. In the current study we used the RTSJ to build an application that receives data continuously and provides estimations in real time. The application consists of four main modules: incoming data, preprocessing, estimation and publication. As an application example we present real-time HVSR estimation. Microtremor recordings are collected continuously by the incoming data module. The preprocessing module consists of a wavelet-based window selector tool which is applied to the incoming data stream in order to derive its most stationary parts. The estimation module provides all the necessary calculations according to user specifications. Finally, the publication module, besides presenting the results, also calculates attributes and relevant statistics for each site (temporal variations, HVSR stability). Acknowledgements This work is partially supported by the Greek General Secretariat of Research and Technology in the frame of Crete Regional Project 2000- 2006 (M1.2): "TALOS: An integrated system of seismic hazard monitoring and management in the front of the Hellenic Arc", CRETE PEP7 (KP_7).

  1. Bayesian Travel Time Inversion adopting Gaussian Process Regression

    Science.gov (United States)

    Mauerberger, S.; Holschneider, M.

    2017-12-01

    A major application in seismology is the determination of seismic velocity models. Travel time measurements put an integral constraint on the velocity between source and receiver. We provide insight into travel time inversion from a correlation-based Bayesian point of view. To that end, the concept of Gaussian process regression is adopted to estimate a velocity model. The non-linear travel time integral is approximated by a 1st order Taylor expansion. A heuristic covariance describes correlations amongst observations and the a priori model. That approach enables us to assess a proxy of the Bayesian posterior distribution at ordinary computational cost. No multi-dimensional numerical integration nor excessive sampling is necessary. Instead of stacking the data, we suggest progressively building up the posterior distribution. Incorporating only a single piece of evidence at a time accounts for the deficit of linearization. As a result, the most probable model is given by the posterior mean, whereas uncertainties are described by the posterior covariance. As a proof of concept, a purely synthetic 1d model is addressed. A single source accompanied by multiple receivers is considered on top of a model comprising a discontinuity. We consider travel times of both phases - direct and reflected wave - corrupted by noise. Left and right of the interface are assumed independent, where the squared exponential kernel serves as covariance.
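
    The conditioning step — a Gaussian process prior on slowness, travel-time observations that are (linearized) integrals along the ray path, and evidence assimilated one observation at a time — might be sketched as below. The grid, squared exponential kernel, noise level, and ray geometry are illustrative assumptions, not the paper's setup.

```python
# Sketch: sequential Gaussian-process conditioning for travel-time inversion.
# Observations are linear functionals (ray-path sums); parameters are assumed.
import numpy as np

n = 50                                     # 1-D grid of cells along the profile
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

m = np.full(n, 0.5)                        # prior mean slowness
K = 0.01 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.1) ** 2)  # SE kernel
sigma2 = 1e-4                              # travel-time noise variance

true_s = 0.5 + 0.1 * (x > 0.6)             # synthetic model with a discontinuity
rng = np.random.default_rng(3)

# Each "ray" simply spans the cells from the source at x=0 to a receiver position.
receivers = np.linspace(0.2, 1.0, 8)
for r in receivers:
    a = (x <= r).astype(float) * dx        # linear travel-time operator row
    y = a @ true_s + rng.normal(0, np.sqrt(sigma2))
    v = a @ K @ a + sigma2                 # predictive variance of this observation
    gain = K @ a / v
    m = m + gain * (y - a @ m)             # progressively build up the posterior
    K = K - np.outer(gain, a @ K)

print("posterior mean slowness near the discontinuity:", m[28:33].round(3))
```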

  2. Bayesian phylogenetic estimation of fossil ages.

    Science.gov (United States)

    Drummond, Alexei J; Stadler, Tanja

    2016-07-19

    Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth-death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the 'morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses.This article is part of the themed issue 'Dating species divergences using

  3. A Novel Spatial-Temporal Voronoi Diagram-Based Heuristic Approach for Large-Scale Vehicle Routing Optimization with Time Constraints

    Directory of Open Access Journals (Sweden)

    Wei Tu

    2015-10-01

    Full Text Available Vehicle routing optimization (VRO) designs the best routes to reduce travel cost, energy consumption, and carbon emission. Due to their non-deterministic polynomial-time hard (NP-hard) complexity, many VROs involved in real-world applications require too much computing effort. Shortening computing time for VRO is a great challenge for state-of-the-art spatial optimization algorithms. From a spatial-temporal perspective, this paper presents a spatial-temporal Voronoi diagram-based heuristic approach for large-scale vehicle routing problems with time windows (VRPTW). Considering time constraints, a spatial-temporal Voronoi distance is derived from the spatial-temporal Voronoi diagram to find near neighbors in the space-time searching context. A Voronoi distance decay strategy that integrates a time warp operation is proposed to accelerate local search procedures. A spatial-temporal feature-guided search is developed to improve unpromising micro route structures. Experiments on VRPTW benchmarks and real-world instances are conducted to verify performance. The results demonstrate that the proposed approach is competitive with state-of-the-art heuristics and achieves high-quality solutions for large-scale instances of VRPTWs in a short time. This novel approach will contribute to the spatial decision support community by developing an effective vehicle routing optimization method for large transportation applications in both public and private sectors.

  4. Real-time detection and discrimination of visual perception using electrocorticographic signals

    Science.gov (United States)

    Kapeller, C.; Ogawa, H.; Schalk, G.; Kunii, N.; Coon, W. G.; Scharinger, J.; Guger, C.; Kamada, K.

    2018-06-01

    Objective. Several neuroimaging studies have demonstrated that the ventral temporal cortex contains specialized regions that process visual stimuli. This study investigated the spatial and temporal dynamics of electrocorticographic (ECoG) responses to different types and colors of visual stimulation that were presented to four human participants, and demonstrated a real-time decoder that detects and discriminates responses to untrained natural images. Approach. ECoG signals from the participants were recorded while they were shown colored and greyscale versions of seven types of visual stimuli (images of faces, objects, bodies, line drawings, digits, and kanji and hiragana characters), resulting in 14 classes for discrimination (experiment I). Additionally, a real-time system asynchronously classified ECoG responses to faces, kanji and black screens presented via a monitor (experiment II), or to natural scenes (i.e. the face of an experimenter, natural images of faces and kanji, and a mirror) (experiment III). Outcome measures in all experiments included the discrimination performance across types based on broadband γ activity. Main results. Experiment I demonstrated an offline classification accuracy of 72.9% when discriminating among the seven types (without color separation). Further discrimination of grey versus colored images reached an accuracy of 67.1%. Discriminating all colors and types (14 classes) yielded an accuracy of 52.1%. In experiment II and III, the real-time decoder correctly detected 73.7% responses to face, kanji and black computer stimuli and 74.8% responses to presented natural scenes. Significance. Seven different types and their color information (either grey or color) could be detected and discriminated using broadband γ activity. Discrimination performance maximized for combined spatial-temporal information. The discrimination of stimulus color information provided the first ECoG-based evidence for color-related population

  5. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and the prior probabilities of the network are elicited from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
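
    As a toy illustration of the first stage, a candidate's score can be read off a small Bayesian network by marginalizing over its parents. The nodes, states, and probabilities below are invented placeholders standing in for expert elicitation; they are not NASA's network.

```python
# Toy sketch: scoring an RTOS candidate with a tiny two-parent Bayesian network.
# Nodes, states and probabilities are invented placeholders, not the NASA model.
# P(acceptable) = sum over R, H of P(acceptable | R, H) * P(R) * P(H).
p_reliable = {"high": 0.7, "low": 0.3}          # prior on reliability
p_history = {"good": 0.6, "poor": 0.4}          # prior on product history
p_accept = {                                     # CPT: P(acceptable | R, H)
    ("high", "good"): 0.95, ("high", "poor"): 0.70,
    ("low", "good"): 0.40, ("low", "poor"): 0.10,
}

score = sum(p_accept[(r, h)] * p_reliable[r] * p_history[h]
            for r in p_reliable for h in p_history)
print(round(score, 3))   # marginal probability used as the candidate's score
```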

  6. Real-time cardiovascular magnetic resonance at high temporal resolution: radial FLASH with nonlinear inverse reconstruction

    Directory of Open Access Journals (Sweden)

    Merboldt Klaus-Dietmar

    2010-07-01

    Full Text Available Abstract Background Functional assessments of the heart by dynamic cardiovascular magnetic resonance (CMR) commonly rely on (i) electrocardiographic (ECG) gating yielding pseudo real-time cine representations, (ii) balanced gradient-echo sequences referred to as steady-state free precession (SSFP), and (iii) breath holding or respiratory gating. Problems may therefore be due to the need for a robust ECG signal, the occurrence of arrhythmia and beat to beat variations, technical instabilities (e.g., SSFP "banding" artefacts), and limited patient compliance and comfort. Here we describe a new approach providing true real-time CMR with image acquisition times as short as 20 to 30 ms or rates of 30 to 50 frames per second. Methods The approach relies on a previously developed real-time MR method, which combines a strongly undersampled radial FLASH CMR sequence with image reconstruction by regularized nonlinear inversion. While iterative reconstructions are currently performed offline due to limited computer speed, online monitoring during scanning is accomplished using gridding reconstructions with a sliding window at the same frame rate but with lower image quality. Results Scans of healthy young subjects were performed at 3 T without ECG gating and during free breathing. The resulting images yield T1 contrast (depending on flip angle) with an opposed-phase or in-phase condition for water and fat signals (depending on echo time). They completely avoid (i) susceptibility-induced artefacts due to the very short echo times, (ii) radiofrequency power limitations due to excitations with flip angles of 10° or less, and (iii) the risk of peripheral nerve stimulation due to the use of normal gradient switching modes. For a section thickness of 8 mm, real-time images offer a spatial resolution and total acquisition time of 1.5 mm at 30 ms and 2.0 mm at 22 ms, respectively. Conclusions Though awaiting thorough clinical evaluation, this work describes a robust and

  7. Real-time cardiovascular magnetic resonance at high temporal resolution: radial FLASH with nonlinear inverse reconstruction.

    Science.gov (United States)

    Zhang, Shuo; Uecker, Martin; Voit, Dirk; Merboldt, Klaus-Dietmar; Frahm, Jens

    2010-07-08

    Functional assessments of the heart by dynamic cardiovascular magnetic resonance (CMR) commonly rely on (i) electrocardiographic (ECG) gating yielding pseudo real-time cine representations, (ii) balanced gradient-echo sequences referred to as steady-state free precession (SSFP), and (iii) breath holding or respiratory gating. Problems may therefore be due to the need for a robust ECG signal, the occurrence of arrhythmia and beat to beat variations, technical instabilities (e.g., SSFP "banding" artefacts), and limited patient compliance and comfort. Here we describe a new approach providing true real-time CMR with image acquisition times as short as 20 to 30 ms or rates of 30 to 50 frames per second. The approach relies on a previously developed real-time MR method, which combines a strongly undersampled radial FLASH CMR sequence with image reconstruction by regularized nonlinear inversion. While iterative reconstructions are currently performed offline due to limited computer speed, online monitoring during scanning is accomplished using gridding reconstructions with a sliding window at the same frame rate but with lower image quality. Scans of healthy young subjects were performed at 3 T without ECG gating and during free breathing. The resulting images yield T1 contrast (depending on flip angle) with an opposed-phase or in-phase condition for water and fat signals (depending on echo time). They completely avoid (i) susceptibility-induced artefacts due to the very short echo times, (ii) radiofrequency power limitations due to excitations with flip angles of 10 degrees or less, and (iii) the risk of peripheral nerve stimulation due to the use of normal gradient switching modes. For a section thickness of 8 mm, real-time images offer a spatial resolution and total acquisition time of 1.5 mm at 30 ms and 2.0 mm at 22 ms, respectively. Though awaiting thorough clinical evaluation, this work describes a robust and flexible acquisition and reconstruction technique for

  8. Modelling and analysis of real-time and hybrid systems

    Energy Technology Data Exchange (ETDEWEB)

    Olivero, A

    1994-09-29

    This work deals with the modelling and analysis of real-time and hybrid systems. We first present timed-graphs as a model for real-time systems and recall the basic notions of the analysis of real-time systems. We describe the temporal properties of the timed-graphs using TCTL formulas. We consider two methods for property verification: on the one hand we study symbolic model-checking (based on backward analysis), and on the other hand we propose a verification method derived from the construction of the simulation graph (based on forward analysis). Both methods have been implemented within the KRONOS verification tool. Their application to the automatic verification of several real-time systems confirms the practical interest of our approach. In a second part we study hybrid systems, i.e. systems combining discrete components with continuous ones. As the analysis of this kind of system is not decidable in the general case, we identify two sub-classes of hybrid systems and give a construction-based method for generating a timed-graph from an element of these sub-classes. We prove that in one case the timed-graph obtained is bisimilar to the considered system and that there exists a simulation in the other case. These relationships allow the application of the described techniques to the hybrid systems in the defined sub-classes. (authors). 60 refs., 43 figs., 8 tabs., 2 annexes.

  9. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been seen as one of the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, then their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 till now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way of flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been

  10. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte Carlo (MCMC) sampling techniques. The advantage of this method is that detection of short time learning effects between repeated trials is possible since inference is based only on single trial experiments....
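
    The hidden-state inference described above can be illustrated with a two-state HMM and Gaussian emissions; the forward filter below is the analytical building block of such a model, with transition and emission parameters assumed and the MCMC machinery omitted.

```python
# Sketch: forward filtering for a two-state HMM (rest/task) with Gaussian emissions.
# Transition matrix, emission parameters and synthetic data are assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
A = np.array([[0.95, 0.05],       # state transition probabilities
              [0.05, 0.95]])
means, sigma = np.array([0.0, 1.0]), 0.5

# Synthetic BOLD-like series: 30 scans "rest", 30 scans "task", 30 scans "rest".
states = np.r_[np.zeros(30, int), np.ones(30, int), np.zeros(30, int)]
y = rng.normal(means[states], sigma)

alpha = np.array([0.5, 0.5])      # initial state distribution
filtered = []
for obs in y:
    alpha = (alpha @ A) * norm.pdf(obs, means, sigma)
    alpha /= alpha.sum()          # normalize to get P(state_t | y_1..t)
    filtered.append(alpha[1])

print(np.round(filtered[25:35], 2))  # probability of "task" around the transition
```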

  11. Spatial-temporal noise reduction method optimized for real-time implementation

    Science.gov (United States)

    Romanenko, I. V.; Edirisinghe, E. A.; Larkin, D.

    2013-02-01

    Image de-noising in the spatial-temporal domain has been a problem studied in-depth in the field of digital image processing. However, the complexity of such algorithms often leads to high hardware resource usage, or to computational and memory bandwidth issues, making their practical use impossible. In our research we attempt to solve these issues with an optimized implementation of a practical spatial-temporal de-noising algorithm. Spatial-temporal filtering was performed in Bayer RAW data space, which allowed us to benefit from predictable sensor noise characteristics and reduce memory bandwidth requirements. The proposed algorithm efficiently removes different kinds of noise in a wide range of signal to noise ratios. In our algorithm the local motion compensation is performed in Bayer RAW data space, preserving the resolution and effectively improving the signal to noise ratio of moving objects. The main challenge for the use of spatial-temporal noise reduction algorithms in video applications is the compromise between the quality of the motion prediction and the complexity of the algorithm and the required memory bandwidth. In photo and video applications it is very important that moving objects stay sharp, while the noise is efficiently removed in both the static background and the moving objects. Another important use case is when the background is non-static as well as the foreground, with objects moving in both. Taking into account the achievable improvement in PSNR (on the level of the best known noise reduction techniques, like VBM3D) and the low algorithmic complexity, which enables its practical use in commercial video applications, the results of our research can be very valuable.

  12. INLA goes extreme: Bayesian tail regression for the estimation of high spatio-temporal quantiles

    KAUST Repository

    Opitz, Thomas; Huser, Raphaë l; Bakka, Haakon; Rue, Haavard

    2018-01-01

    approach is based on a Bayesian generalized additive modeling framework that is designed to estimate complex trends in marginal extremes over space and time. First, we estimate a high non-stationary threshold using a gamma distribution for precipitation

  13. Integrative real-time geographic visualization of energy resources

    International Nuclear Information System (INIS)

    Sorokine, A.; Shankar, M.; Stovall, J.; Bhaduri, B.; King, T.; Fernandez, S.; Datar, N.; Omitaomu, O.

    2009-01-01

    'Full text:' Several models forecast that climatic changes will increase the frequency of disastrous events like droughts, hurricanes, and snow storms. Responding to these events, and also to power outages caused by system errors such as the 2003 North American blackout, requires an interconnect-wide real-time monitoring system for various energy resources. Such a system should be capable of providing situational awareness to its users in government and energy utilities by dynamically visualizing the status of the elements of the energy grid infrastructure and supply chain in geographic contexts. We demonstrate an approach that relies on Google Earth and similar standards-based platforms as client-side geographic viewers with a data-dependent server component. The users of the system can view status information in spatial and temporal contexts. These data can be integrated with a wide range of geographic sources, including all standard Google Earth layers and a large number of energy and environmental data feeds. In addition, we show a real-time spatio-temporal data sharing capability across the users of the system, novel methods for visualizing dynamic network data, and fine-grained access to very large multi-resolution geographic datasets for faster delivery of the data. The system can be extended to integrate contingency analysis results and other grid models to assess recovery and repair scenarios in the case of major disruption. (author)

  14. Near-Real-Time Monitoring of Insect Defoliation Using Landsat Time Series

    Directory of Open Access Journals (Sweden)

    Valerie J. Pasquarella

    2017-07-01

    Full Text Available Introduced insects and pathogens impact millions of acres of forested land in the United States each year, and large-scale monitoring efforts are essential for tracking the spread of outbreaks and quantifying the extent of damage. However, monitoring the impacts of defoliating insects presents a significant challenge due to the ephemeral nature of defoliation events. Using the 2016 gypsy moth (Lymantria dispar) outbreak in Southern New England as a case study, we present a new approach for near-real-time defoliation monitoring using synthetic images produced from Landsat time series. By comparing predicted and observed images, we assessed changes in vegetation condition multiple times over the course of an outbreak. Initial measures can be made as imagery becomes available, and season-integrated products provide a wall-to-wall assessment of potential defoliation at 30 m resolution. Qualitative and quantitative comparisons suggest our Landsat Time Series (LTS) products improve identification of defoliation events relative to existing products and provide a repeatable metric of change in condition. Our synthetic-image approach is an important step toward using the full temporal potential of the Landsat archive for operational monitoring of forest health over large extents, and provides an important new tool for understanding spatial and temporal dynamics of insect defoliators.

  15. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  16. Temporal motifs in time-dependent networks

    International Nuclear Information System (INIS)

    Kovanen, Lauri; Karsai, Márton; Kaski, Kimmo; Kertész, János; Saramäki, Jari

    2011-01-01

    Temporal networks are commonly used to represent systems where connections between elements are active only for restricted periods of time, such as telecommunication, neural signal processing, biochemical reaction and human social interaction networks. We introduce the framework of temporal motifs to study the mesoscale topological–temporal structure of temporal networks in which the events of nodes do not overlap in time. Temporal motifs are classes of similar event sequences, where the similarity refers not only to topology but also to the temporal order of the events. We provide a mapping from event sequences to coloured directed graphs that enables an efficient algorithm for identifying temporal motifs. We discuss some aspects of temporal motifs, including causality and null models, and present basic statistics of temporal motifs in a large mobile call network
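
    A deliberately simplified version of the idea — collect ordered pairs of events that share a node within a time window and classify each pair by its topology and temporal order — is sketched below. The event list and window are illustrative, and the paper's full maximality and Δt-connectivity conditions are not implemented.

```python
# Simplified sketch: counting two-event temporal "motifs" (ordered event pairs
# sharing a node within a window dt). The full definition in the paper also
# requires maximal, dt-connected event sets; that part is intentionally omitted.
from collections import Counter

events = [  # (source, target, time) -- illustrative toy event list
    ("a", "b", 1), ("b", "c", 3), ("a", "b", 6), ("c", "a", 7), ("b", "d", 8),
]
dt = 5

def pair_class(e1, e2):
    (u1, v1, _), (u2, v2, _) = e1, e2
    if (u1, v1) == (u2, v2):
        return "repeated edge"
    if v1 == u2:
        return "chain (relay)"
    if u1 == u2:
        return "out-star (fan-out)"
    if v1 == v2:
        return "in-star (fan-in)"
    return "other shared node"

counts = Counter()
for i, e1 in enumerate(events):
    for e2 in events[i + 1:]:
        if 0 < e2[2] - e1[2] <= dt and set(e1[:2]) & set(e2[:2]):
            counts[pair_class(e1, e2)] += 1

print(dict(counts))
```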

  17. Audio-Visual Tibetan Speech Recognition Based on a Deep Dynamic Bayesian Network for Natural Human Robot Interaction

    Directory of Open Access Journals (Sweden)

    Yue Zhao

    2012-12-01

    Full Text Available Audio-visual speech recognition is a natural and robust approach to improving human-robot interaction in noisy environments. Although multi-stream Dynamic Bayesian Networks and coupled HMMs are widely used for audio-visual speech recognition, they fail to learn the shared features between modalities and ignore the dependency of features among the frames within each discrete state. In this paper, we propose a Deep Dynamic Bayesian Network (DDBN) to perform unsupervised extraction of spatial-temporal multimodal features from Tibetan audio-visual speech data and build an accurate audio-visual speech recognition model without a frame-independency assumption. The experimental results on Tibetan speech data from real-world environments showed that the proposed DDBN outperforms the state-of-the-art methods in word recognition accuracy.

  18. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
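
    The gradual, conductance-driven charge injection that makes this neuron model expensive in event-driven software can be sketched with a simple forward-Euler loop; the time constants, conductance increments, and input spike train below are assumptions for illustration, not the parameters of the hardware implementation.

```python
# Sketch: conductance-based synaptic integration with an exponential synaptic
# time constant, integrated with forward Euler. Parameters are illustrative.
import numpy as np

dt, t_max = 0.1e-3, 0.2                      # 0.1 ms step, 200 ms of simulation
tau_m, tau_syn = 20e-3, 5e-3                 # membrane and synaptic time constants
v_rest, v_thresh, v_reset, e_syn = -70e-3, -54e-3, -70e-3, 0.0

rng = np.random.default_rng(5)
n_steps = int(t_max / dt)
input_spikes = rng.random(n_steps) < 0.02    # assumed Poisson-like input train

v, g = v_rest, 0.0
spike_times = []
for i in range(n_steps):
    if input_spikes[i]:
        g += 0.5                             # each input spike adds conductance
    g -= dt * g / tau_syn                    # synaptic decay -> gradual charge injection
    dv = (-(v - v_rest) - g * (v - e_syn)) / tau_m
    v += dt * dv
    if v >= v_thresh:                        # threshold crossing emits a spike
        spike_times.append(round(i * dt, 4))
        v = v_reset

print("output spike times (s):", spike_times)
```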

  19. A Bayesian-Based Approach to Marine Spatial Planning: Evaluating Spatial and Temporal Variance in the Provision of Ecosystem Services Before and After the Establishment of Oregon's Marine Protected Areas

    Science.gov (United States)

    Black, B.; Harte, M.; Goldfinger, C.

    2017-12-01

    Participating in a ten-year monitoring project to assess the ecological, social, and socioeconomic impacts of Oregon's Marine Protected Areas (MPAs), we have worked in partnership with the Oregon Department of Fish and Wildlife (ODFW) to develop a Bayesian geospatial method to evaluate the spatial and temporal variance in the provision of ecosystem services produced by Oregon's MPAs. Probabilistic (Bayesian) approaches to Marine Spatial Planning (MSP) show considerable potential for addressing issues such as uncertainty, cumulative effects, and the need to integrate stakeholder-held information and preferences into decision making processes. To that end, we have created a Bayesian-based geospatial approach to MSP capable of modelling the evolution of the provision of ecosystem services before and after the establishment of Oregon's MPAs. Our approach permits both planners and stakeholders to view expected impacts of differing policies, behaviors, or choices made concerning Oregon's MPAs and surrounding areas in a geospatial (map) format while simultaneously considering multiple parties' beliefs on the policies or uses in question. We quantify the influence of the MPAs as the shift in the spatial distribution of ecosystem services, both inside and outside the protected areas, over time. Once the MPAs' influence on the provision of coastal ecosystem services has been evaluated, it is possible to view these impacts through geovisualization techniques. As a specific example of model use and output, a user could investigate the effects of altering the habitat preferences of a rockfish species over a prescribed period of time (5, 10, 20 years post-harvesting restrictions, etc.) on the relative intensity of spillover from nearby reserves (please see submitted figure). Particular strengths of our Bayesian-based approach include its ability to integrate highly disparate input types (qualitative or quantitative), to accommodate data gaps, address uncertainty, and to

  20. Real-time decision support and information gathering system for financial domain

    Science.gov (United States)

    Tseng, Chiu-Che; Gmytrasiewicz, Piotr J.

    2006-05-01

    The challenge of the investment domain is that a large amount of diverse information can be potentially relevant to an investment decision, and that, frequently, the decisions have to be made in a timely manner. This presents the potential for better decision support, but poses the challenge of building a decision support agent that gathers information from different sources and incorporates it for timely decision support. These problems motivate us to investigate ways in which the investors can be equipped with a flexible real-time decision support system to be practical in time-critical situations. The flexible real-time decision support system considers a tradeoff between decision quality and computation cost. For this purpose, we propose a system that uses the object oriented Bayesian knowledge base (OOBKB) design to create a decision model at the most suitable level of detail to guide the information gathering activities, and to produce an investment recommendation within a reasonable length of time. The decision models our system uses are implemented as influence diagrams. We validate our system with experiments in a simplified investment domain. The experiments show that our system produces a quality recommendation under different urgency situations. The contribution of our system is that it provides the flexible decision recommendation for an investor under time constraints in a complex environment.

  1. Temporal analysis and scheduling of hard real-time radios running on a multi-processor

    NARCIS (Netherlands)

    Moreira, O.

    2012-01-01

    On a multi-radio baseband system, multiple independent transceivers must share the resources of a multi-processor, while each meeting its own hard real-time requirements. Not all possible combinations of transceivers are known at compile time, so a solution must be found that either allows for

  2. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    Science.gov (United States)

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
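
    The cdf-inverse-cdf coupling described above can be illustrated with a minimal sketch (Python, not the authors' code): observations are mapped to Gaussian scores through an estimated marginal CDF, AR(1) dynamics are fitted on the scores, and simulated scores are mapped back through the empirical quantiles. The empirical CDF here stands in for the paper's nonparametric Bayesian marginal prior, and all names are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    def to_gaussian_scores(x):
        """Map data to standard-normal scores via the empirical CDF (rank transform)."""
        ranks = stats.rankdata(x)
        u = ranks / (len(x) + 1.0)          # keep u strictly inside (0, 1)
        return stats.norm.ppf(u)

    def copula_ar1_simulate(x, horizon=50, seed=0):
        """Fit AR(1) dynamics on the Gaussian scores and simulate ahead,
        mapping the simulated scores back through the empirical quantiles."""
        rng = np.random.default_rng(seed)
        z = to_gaussian_scores(x)
        phi = np.corrcoef(z[:-1], z[1:])[0, 1]        # lag-1 autocorrelation of the scores
        sigma = np.sqrt(1.0 - phi ** 2)
        z_new = np.empty(horizon)
        z_prev = z[-1]
        for t in range(horizon):
            z_prev = phi * z_prev + sigma * rng.standard_normal()
            z_new[t] = z_prev
        # inverse transform: Gaussian scores -> uniforms -> empirical quantiles
        u_new = stats.norm.cdf(z_new)
        return np.quantile(x, u_new)

    # toy example with a skewed (non-Gaussian) marginal
    x = stats.gamma(a=2.0).rvs(size=500, random_state=1)
    print(copula_ar1_simulate(x, horizon=5))
    ```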

  3. Predicting forest insect flight activity: A Bayesian network approach.

    Directory of Open Access Journals (Sweden)

    Stephen M Pawson

    Full Text Available Daily flight activity patterns of forest insects are influenced by temporal and meteorological conditions. Temperature and time of day are frequently cited as key drivers of activity; however, complex interactions between multiple contributing factors have also been proposed. Here, we report individual Bayesian network models to assess the probability of flight activity of three exotic insects, Hylurgus ligniperda, Hylastes ater, and Arhopalus ferus in a managed plantation forest context. Models were built from 7,144 individual hours of insect sampling, temperature, wind speed, relative humidity, photon flux density, and temporal data. Discretized meteorological and temporal variables were used to build tree-augmented naïve Bayes networks. Calibration results suggested that the H. ater and A. ferus Bayesian network models had the best fit for low Type I and overall errors, and H. ligniperda had the best fit for low Type II errors. Maximum hourly temperature and time since sunrise had the largest influence on H. ligniperda flight activity predictions, whereas time of day and year had the greatest influence on H. ater and A. ferus activity. Type II model errors for the prediction of no flight activity can be reduced by increasing the model's predictive threshold. Improvements in model performance can be made by further sampling, increasing the sensitivity of the flight intercept traps, and replicating sampling in other regions. Predicting insect flight informs an assessment of the potential phytosanitary risks of wood exports. Quantifying this risk allows mitigation treatments to be targeted to prevent the spread of invasive species via international trade pathways.
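
    As a rough illustration of the modelling step, the sketch below fits a plain naive Bayes classifier to discretized weather and temporal predictors (the study used tree-augmented naïve Bayes networks, a richer structure); the data, bin definitions, and class boundaries are synthetic stand-ins.

    ```python
    import numpy as np
    from sklearn.naive_bayes import CategoricalNB

    rng = np.random.default_rng(0)

    # synthetic discretized predictors: temperature bin (0-3), wind-speed bin (0-2),
    # time-since-sunrise bin (0-3); response: flight activity observed (0/1)
    n = 1000
    X = np.column_stack([
        rng.integers(0, 4, n),   # max hourly temperature class
        rng.integers(0, 3, n),   # wind speed class
        rng.integers(0, 4, n),   # time since sunrise class
    ])
    # toy rule: activity more likely in warm, calm hours shortly after sunrise
    p = 0.1 + 0.2 * (X[:, 0] >= 2) + 0.2 * (X[:, 1] == 0) + 0.2 * (X[:, 2] <= 1)
    y = rng.random(n) < p

    clf = CategoricalNB().fit(X, y)
    # probability of flight for a warm, calm hour shortly after sunrise
    print(clf.predict_proba([[3, 0, 0]]))
    ```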

  4. Real-time capture of student reasoning while writing

    Science.gov (United States)

    Franklin, Scott V.; Hermsen, Lisa M.

    2014-12-01

    We present a new approach to investigating student reasoning while writing: real-time capture of the dynamics of the writing process. Key-capture or video software is used to record the entire writing episode, including all pauses, deletions, insertions, and revisions. A succinct shorthand, "S notation," is used to highlight significant moments in the episode that may be indicative of shifts in understanding and can be used in followup interviews for triangulation. The methodology allows one to test the widespread belief that writing is a valuable pedagogical technique, which currently has little directly supportive research. To demonstrate the method, we present a case study of a writing episode. The data reveal an evolution of expression and articulation, discontinuous in both time and space. Distinct shifts in the tone and topic that follow long pauses and revisions are not restricted to the most recently written text. Real-time writing analysis, with its study of the temporal breaks and revision locations, can serve as a complementary tool to more traditional research methods (e.g., speak-aloud interviews) into student reasoning during the writing process.

  5. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    textabstractSeveral lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  6. Bayesian spatiotemporal model of fMRI data using transfer functions.

    Science.gov (United States)

    Quirós, Alicia; Diez, Raquel Montes; Wilson, Simon P

    2010-09-01

    This research describes a new Bayesian spatiotemporal model to analyse BOLD fMRI studies. In the temporal dimension, we describe the shape of the hemodynamic response function (HRF) with a transfer function model. The spatial continuity and local homogeneity of the evoked responses are modelled by a Gaussian Markov random field prior on the parameter indicating activations. The proposal constitutes an extension of the spatiotemporal model presented in a previous approach [Quirós, A., Montes Diez, R. and Gamerman, D., 2010. Bayesian spatiotemporal model of fMRI data, Neuroimage, 49: 442-456], offering more flexibility in the estimation of the HRF and computational advantages in the resulting MCMC algorithm. Simulations from the model are performed in order to ascertain the performance of the sampling scheme and the ability of the posterior to estimate model parameters, as well as to check the model sensitivity to signal to noise ratio. Results are shown on synthetic data and on a real data set from a block-design fMRI experiment. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  7. Real-time shadows

    CERN Document Server

    Eisemann, Elmar; Assarsson, Ulf; Wimmer, Michael

    2011-01-01

    Important elements of games, movies, and other computer-generated content, shadows are crucial for enhancing realism and providing important visual cues. In recent years, there have been notable improvements in visual quality and speed, making high-quality realistic real-time shadows a reachable goal. Real-Time Shadows is a comprehensive guide to the theory and practice of real-time shadow techniques. It covers a large variety of effects, including hard, soft, volumetric, and semi-transparent shadows. The book explains the basics as well as many advanced aspects related to the domain

  8. Dependable Real-Time Systems

    Science.gov (United States)

    1991-09-30

    0196 or 413 545-0720. PI E-mail Address: krithi@nirvan.cs.umass.edu, stankovic@cs.umass.edu. Grant or Contract Title: Dependable Real-Time Systems. Grant or Contract Number: N00014-85-k-0398. Reporting Period: 1 Oct 87 - 30 Sep 91. Summary of Accomplishments: (1) ... in developing a sound approach to scheduling tasks in complex real-time systems, (2) developed a real-time operating system kernel, a preliminary

  9. LabVIEW Real-Time

    CERN Multimedia

    CERN. Geneva; Flockhart, Ronald Bruce; Seppey, P

    2003-01-01

    With LabVIEW Real-Time, you can choose from a variety of RT Series hardware. Add a real-time data acquisition component into a larger measurement and automation system or create a single stand-alone real-time solution with data acquisition, signal conditioning, motion control, RS-232, GPIB instrumentation, and Ethernet connectivity. With the various hardware options, you can create a system to meet your precise needs today, while the modularity of the system means you can add to the solution as your system requirements grow. If you are interested in Reliable and Deterministic systems for Measurement and Automation, you will profit from this seminar. Agenda: Real-Time Overview LabVIEW RT Hardware Platforms - Linux on PXI Programming with LabVIEW RT Real-Time Operating Systems concepts Timing Applications Data Transfer

  10. Bayesian Ising approximation for learning dictionaries of multispike timing patterns in premotor neurons

    Science.gov (United States)

    Hernandez Lahme, Damian; Sober, Samuel; Nemenman, Ilya

    Important questions in computational neuroscience are whether, how much, and how information is encoded in the precise timing of neural action potentials. We recently demonstrated that, in the premotor cortex during vocal control in songbirds, spike timing is far more informative about upcoming behavior than is spike rate (Tang et al, 2014). However, identification of complete dictionaries that relate spike timing patterns with the controlled behavior remains an elusive problem. Here we present a computational approach to deciphering such codes for individual neurons in the songbird premotor area RA, an analog of mammalian primary motor cortex. Specifically, we analyze which multispike patterns of neural activity predict features of the upcoming vocalization, and hence are important codewords. We use a recently introduced Bayesian Ising Approximation, which properly accounts for the fact that many codewords overlap and hence are not independent. Our results show which complex, temporally precise multispike combinations are used by individual neurons to control acoustic features of the produced song, and that these codewords are different across individual neurons and across different acoustic features. This work was supported, in part, by JSMF Grant 220020321, NSF Grant 1208126, NIH Grant NS084844 and NIH Grant 1 R01 EB022872.

  11. Concepts of real time and semi-real time material control

    International Nuclear Information System (INIS)

    Lovett, J.E.

    1975-01-01

    After a brief consideration of the traditional material balance accounting on an MBA basis, this paper explores the basic concepts of real time and semi-real time material control, together with some of the major problems to be solved. Three types of short-term material control are discussed: storage, batch processing, and continuous processing. (DLC)

  12. Real Time Systems

    DEFF Research Database (Denmark)

    Christensen, Knud Smed

    2000-01-01

    Describes fundamentals of parallel programming and a kernel for that. Describes methods for modelling and checking parallel problems. Real time problems.

  13. Real time expert systems

    International Nuclear Information System (INIS)

    Asami, Tohru; Hashimoto, Kazuo; Yamamoto, Seiichi

    1992-01-01

    Recently, aiming at applications such as plant control for nuclear reactors and traffic and communication control, research on and practical use of expert systems suited to real-time processing have become conspicuous. This report presents the functional requirements for controlling objects that change dynamically within a limited time, and explains the technical differences between the real-time expert systems developed to satisfy these requirements and conventional expert systems, using actual examples and from a theoretical standpoint. Conventional expert systems have their technical basis in the problem-solving machinery originating in STRIPS. Real-time expert systems are applied to fields involving surveillance and control, to which conventional expert systems are hard to apply. The report explains the requirements for real-time expert systems, gives examples of such systems, and, as techniques for realizing real-time processing, describes interrupt handling, distributed processing, and mechanisms for maintaining the consistency of knowledge. (K.I.)

  14. Time-reversal and Bayesian inversion

    Science.gov (United States)

    Debski, Wojciech

    2017-04-01

    The probabilistic inversion technique is superior to the classical optimization-based approach in all but one aspect: it requires quite exhaustive computations, which prohibits its use in very large inverse problems such as global seismic tomography or waveform inversion, to name a few. The advantages of the approach are, however, so appealing that there is a continuous ongoing effort to make such large inverse tasks manageable with the probabilistic inverse approach. One promising possibility for achieving this goal relies on exploiting the internal symmetries of the seismological modeling problems at hand - time-reversal and reciprocity invariance. These two basic properties of the elastic wave equation, when incorporated into the probabilistic inversion scheme, open new horizons for Bayesian inversion. In this presentation we discuss the time-reversal symmetry property and its mathematical aspects, and propose how to combine it with probabilistic inverse theory into a compact, fast inversion algorithm. We illustrate the proposed idea with the newly developed location algorithm TRMLOC and discuss its efficiency when applied to mining-induced seismic data.

  15. Reliable 5-min real-time MR technique for left-ventricular-wall motion analysis

    International Nuclear Information System (INIS)

    Katoh, Marcus; Spuentrup, Elmar; Guenther, Rolf W.; Buecker, Arno; Kuehl, Harald P.; Lipke, Claudia S.A.

    2007-01-01

    The aim of this study was to investigate the value of a real-time magnetic resonance imaging (MRI) approach for the assessment of left-ventricular-wall motion in patients with insufficient transthoracic echocardiography in terms of accuracy and temporal expenditure. Twenty-five consecutive patients were examined on a 1.5-Tesla whole-body MR system (ACS-NT, Philips Medical Systems, Best, NL) using a real-time and ECG-gated (the current gold standard) steady-state free-precession (SSFP) sequence. Wall motion was analyzed by three observers by consensus interpretation. In addition, the preparation, scanning, and overall examination times were measured. The assessment of the wall motion demonstrated a close agreement between the two modalities resulting in a mean κ coefficient of 0.8. At the same time, each stage of the examination was significantly shortened using the real-time MR approach. Real-time imaging allows for accurate assessment of left-ventricular-wall motion with the added benefit of decreased examination time. Therefore, it may serve as a cost-efficient alternative in patients with insufficient echocardiography. (orig.)

  16. Bayesian Model Selection under Time Constraints

    Science.gov (United States)

    Hoege, M.; Nowak, W.; Illman, W. A.

    2017-12-01

    Bayesian model selection (BMS) provides a consistent framework for rating and comparing models in multi-model inference. In cases where models of vastly different complexity compete with each other, we also face vastly different computational runtimes of such models. For instance, time series of a quantity of interest can be simulated by an autoregressive process model that takes even less than a second for one run, or by a partial differential equations-based model with runtimes up to several hours or even days. The classical BMS is based on a quantity called Bayesian model evidence (BME). It determines the model weights in the selection process and represents a trade-off between the bias of a model and its complexity. However, in practice, the runtime of models is another relevant weighting factor in model selection. Hence, we believe that it should be included, leading to an overall trade-off problem between bias, variance and computing effort. We approach this triple trade-off from the viewpoint of our ability to generate realizations of the models under a given computational budget. One way to obtain BME values is through sampling-based integration techniques. We start from the fact that, under time constraints, more expensive models can be sampled much less than faster models (the number of samples being inversely proportional to their runtime). The computed evidence in favor of a more expensive model is statistically less significant than the evidence computed in favor of a faster model, since sampling-based strategies are always subject to statistical sampling error. We present a straightforward way to include this imbalance in the model weights that are the basis for model selection. Our approach follows directly from the idea of insufficient significance. It is based on a computationally cheap bootstrapping error estimate of model evidence and is easy to implement. The approach is illustrated in a small synthetic modeling study.
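
    A minimal sketch of the idea, under the assumption that each model's sample count is its share of a common time budget and that the evidence estimate is penalized by a bootstrap estimate of its sampling error; the likelihood draws, runtimes, and the one-standard-error penalty are illustrative choices, not the paper's exact scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def evidence_with_bootstrap_se(loglik_samples, n_boot=500):
        """Monte Carlo estimate of model evidence from prior-sample log-likelihoods,
        plus a bootstrap standard error of that estimate."""
        lik = np.exp(loglik_samples)
        bme = lik.mean()
        boot = np.array([rng.choice(lik, size=lik.size, replace=True).mean()
                         for _ in range(n_boot)])
        return bme, boot.std()

    # hypothetical per-run times (seconds) and a shared wall-clock budget
    runtimes = {"fast_AR": 0.5, "slow_PDE": 50.0}
    budget = 1000.0

    weights = {}
    for name, rt in runtimes.items():
        n_samples = max(2, int(budget / rt))          # fewer runs for slower models
        # stand-in for running the model: draw fake log-likelihoods
        loglik = rng.normal(loc=-5.0, scale=1.0, size=n_samples)
        bme, se = evidence_with_bootstrap_se(loglik)
        # penalize the evidence by its sampling uncertainty (one crude choice)
        weights[name] = max(bme - se, 0.0)

    total = sum(weights.values())
    print({k: v / total for k, v in weights.items()})
    ```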

  17. TEMPORAL QUERY PROCESSING USING SQL SERVER

    OpenAIRE

    Vali Shaik, Mastan; Sujatha, P

    2017-01-01

    Most data sources in real-life are not static but change their information in time. This evolution of data in time can give valuable insights to business analysts. Temporal data refers to data where changes over time or temporal aspects play a central role. Temporal data denotes the evaluation of object characteristics over time. One of the main unresolved problems that arise during the data mining process is treating data that contains temporal information. Temporal queries on time evolving...

  18. Accurate estimation of camera shot noise in the real-time

    Science.gov (United States)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.

    2017-10-01

    Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology and various other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and video cameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise comprises the random component, while spatial noise comprises the pattern component. Temporal noise can be further divided into signal-dependent shot noise and signal-independent dark temporal noise. For measurement of camera noise characteristics, the most widely used methods are standards (for example, EMVA Standard 1288). These allow precise shot and dark temporal noise measurement but are difficult to implement and time-consuming. Earlier we proposed a method for measurement of the temporal noise of photo- and video cameras based on the automatic segmentation of nonuniform targets (ASNT); only two frames are sufficient for noise measurement with the modified method. In this paper, we register frames and estimate the shot and dark temporal noise of cameras consistently in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12 bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12 bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10 bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8 bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time for registering and processing the frames used for temporal noise estimation was measured: using a standard computer, frames were registered and processed in a fraction of a second to several seconds only. Also the
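
    The two-frame principle can be sketched generically: for a static, nonuniform target, the per-pixel difference of two frames contains only temporal noise, so its standard deviation divided by sqrt(2), binned by signal level, gives the noise-versus-signal curve. This is a generic illustration of the idea, not the ASNT segmentation itself; the toy frames below are simulated.

    ```python
    import numpy as np

    def temporal_noise_vs_signal(frame_a, frame_b, n_bins=32):
        """Estimate temporal noise from two frames of the same static, nonuniform scene.
        The variance of the per-pixel difference equals twice the temporal noise variance,
        so sigma_t = std(difference) / sqrt(2), computed per signal-level bin."""
        mean_signal = 0.5 * (frame_a.astype(float) + frame_b.astype(float))
        diff = frame_a.astype(float) - frame_b.astype(float)
        bins = np.linspace(mean_signal.min(), mean_signal.max(), n_bins + 1)
        idx = np.digitize(mean_signal.ravel(), bins) - 1
        signal, noise = [], []
        for b in range(n_bins):
            sel = idx == b
            if sel.sum() > 100:                       # require enough pixels per bin
                signal.append(mean_signal.ravel()[sel].mean())
                noise.append(diff.ravel()[sel].std() / np.sqrt(2.0))
        return np.array(signal), np.array(noise)

    # toy frames: Poisson shot noise plus Gaussian dark noise on a gradient target
    rng = np.random.default_rng(0)
    target = np.tile(np.linspace(10, 1000, 640), (480, 1))
    f1 = rng.poisson(target) + rng.normal(0, 2, target.shape)
    f2 = rng.poisson(target) + rng.normal(0, 2, target.shape)
    s, n = temporal_noise_vs_signal(f1, f2)
    print(np.round(n[:5], 2))   # noise should grow roughly as sqrt(signal)
    ```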

  19. INLA goes extreme: Bayesian tail regression for the estimation of high spatio-temporal quantiles

    KAUST Repository

    Opitz, Thomas

    2018-05-25

    This work is motivated by the challenge organized for the 10th International Conference on Extreme-Value Analysis (EVA2017) to predict daily precipitation quantiles at the 99.8% level for each month at observed and unobserved locations. Our approach is based on a Bayesian generalized additive modeling framework that is designed to estimate complex trends in marginal extremes over space and time. First, we estimate a high non-stationary threshold using a gamma distribution for precipitation intensities that incorporates spatial and temporal random effects. Then, we use the Bernoulli and generalized Pareto (GP) distributions to model the rate and size of threshold exceedances, respectively, which we also assume to vary in space and time. The latent random effects are modeled additively using Gaussian process priors, which provide high flexibility and interpretability. We develop a penalized complexity (PC) prior specification for the tail index that shrinks the GP model towards the exponential distribution, thus preventing unrealistically heavy tails. Fast and accurate estimation of the posterior distributions is performed thanks to the integrated nested Laplace approximation (INLA). We illustrate this methodology by modeling the daily precipitation data provided by the EVA2017 challenge, which consist of observations from 40 stations in the Netherlands recorded during the period 1972–2016. Capitalizing on INLA’s fast computational capacity and powerful distributed computing resources, we conduct an extensive cross-validation study to select the model parameters that govern the smoothness of trends. Our results clearly outperform simple benchmarks and are comparable to the best-scoring approaches of the other teams.
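
    The peaks-over-threshold component can be sketched as follows, assuming a constant high threshold and ignoring the spatio-temporal random effects and the INLA machinery: fit a generalized Pareto distribution to the exceedances and invert the tail fit for the target quantile. The data and threshold choice are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    precip = rng.gamma(shape=0.8, scale=4.0, size=20000)   # toy daily precipitation

    u = np.quantile(precip, 0.95)                # high (here constant) threshold
    exceed = precip[precip > u] - u
    zeta_u = exceed.size / precip.size           # exceedance probability

    # fit a generalized Pareto distribution to the exceedances (location fixed at 0)
    xi, _, sigma = stats.genpareto.fit(exceed, floc=0)

    def high_quantile(p):
        """Return the p-quantile of the full distribution from the GP tail fit."""
        return u + (sigma / xi) * (((1.0 - p) / zeta_u) ** (-xi) - 1.0)

    print(round(high_quantile(0.998), 2))        # the 99.8% level targeted by EVA2017
    ```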

  20. Real-time classification of auditory sentences using evoked cortical activity in humans

    Science.gov (United States)

    Moses, David A.; Leonard, Matthew K.; Chang, Edward F.

    2018-06-01

    Objective. Recent research has characterized the anatomical and functional basis of speech perception in the human auditory cortex. These advances have made it possible to decode speech information from activity in brain regions like the superior temporal gyrus, but no published work has demonstrated this ability in real-time, which is necessary for neuroprosthetic brain-computer interfaces. Approach. Here, we introduce a real-time neural speech recognition (rtNSR) software package, which was used to classify spoken input from high-resolution electrocorticography signals in real-time. We tested the system with two human subjects implanted with electrode arrays over the lateral brain surface. Subjects listened to multiple repetitions of ten sentences, and rtNSR classified what was heard in real-time from neural activity patterns using direct sentence-level and HMM-based phoneme-level classification schemes. Main results. We observed single-trial sentence classification accuracies of 90% or higher for each subject with less than 7 minutes of training data, demonstrating the ability of rtNSR to use cortical recordings to perform accurate real-time speech decoding in a limited vocabulary setting. Significance. Further development and testing of the package with different speech paradigms could influence the design of future speech neuroprosthetic applications.

  1. Real-time MRI of the temporomandibular joint at 15 frames per second—A feasibility study

    International Nuclear Information System (INIS)

    Krohn, Sebastian; Gersdorff, Nikolaus; Wassmann, Torsten; Merboldt, Klaus-Dietmar; Joseph, Arun A.; Buergers, Ralf; Frahm, Jens

    2016-01-01

    The purpose of this study was to develop and evaluate a novel method for real-time MRI of TMJ function at high temporal resolution and with two different contrasts. Real-time MRI was based on undersampled radial fast low angle shot (FLASH) acquisitions with iterative image reconstruction by regularized nonlinear inversion. Real-time MRI movies with T1 contrast were obtained with use of a radiofrequency-spoiled FLASH sequence, while movies with T2/T1 contrast employed a gradient-refocused FLASH version. TMJ function was characterized in 40 randomly selected volunteers by sequential 20 s acquisitions of both the right and left joint during voluntary opening and closing of the mouth (in a medial, central and lateral oblique sagittal section perpendicular to the long axis of the condylar head). All studies were performed on a commercial MRI system at 3 T using the standard head coil, while online reconstruction was achieved with a bypass computer fully integrated into the MRI system. As a first result, real-time MRI studies of the right and left TMJ were successfully performed in all 40 subjects (80 joints) within a total examination time per subject of only 15 min. Secondly, at an in-plane resolution of 0.75 mm and 5 mm section thickness, the achieved temporal resolution was 66.7 ms per image or 15 frames per second. Thirdly, both T1-weighted and T2/T1-weighted real-time MRI movies provided information about TMJ function such as disc position, condyle mobility and disc-condyle relationship. While T1 contrast offers a better delineation of structures during rapid jaw movements, T2/T1 contrast was rated superior for characterizing the articular disc. In conclusion, the proposed real-time MRI method may become a robust and efficient tool for the clinical assessment of TMJ function.

  2. Real-time MRI of the temporomandibular joint at 15 frames per second—A feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Krohn, Sebastian; Gersdorff, Nikolaus; Wassmann, Torsten [Department of Prosthodontics, University Medical Center, Göttingen (Germany); Merboldt, Klaus-Dietmar [Biomedizinische NMR Forschungs GmbH am Max-Planck-Institut für Biophysikalische Chemie, Göttingen (Germany); Joseph, Arun A., E-mail: ajoseph@mpibpc.mpg.de [Biomedizinische NMR Forschungs GmbH am Max-Planck-Institut für Biophysikalische Chemie, Göttingen (Germany); Buergers, Ralf [Department of Prosthodontics, University Medical Center, Göttingen (Germany); Frahm, Jens [Biomedizinische NMR Forschungs GmbH am Max-Planck-Institut für Biophysikalische Chemie, Göttingen (Germany)

    2016-12-15

    The purpose of this study was to develop and evaluate a novel method for real-time MRI of TMJ function at high temporal resolution and with two different contrasts. Real-time MRI was based on undersampled radial fast low angle shot (FLASH) acquisitions with iterative image reconstruction by regularized nonlinear inversion. Real-time MRI movies with T1 contrast were obtained with use of a radiofrequency-spoiled FLASH sequence, while movies with T2/T1 contrast employed a gradient-refocused FLASH version. TMJ function was characterized in 40 randomly selected volunteers by sequential 20 s acquisitions of both the right and left joint during voluntary opening and closing of the mouth (in a medial, central and lateral oblique sagittal section perpendicular to the long axis of the condylar head). All studies were performed on a commercial MRI system at 3 T using the standard head coil, while online reconstruction was achieved with a bypass computer fully integrated into the MRI system. As a first result, real-time MRI studies of the right and left TMJ were successfully performed in all 40 subjects (80 joints) within a total examination time per subject of only 15 min. Secondly, at an in-plane resolution of 0.75 mm and 5 mm section thickness, the achieved temporal resolution was 66.7 ms per image or 15 frames per second. Thirdly, both T1-weighted and T2/T1-weighted real-time MRI movies provided information about TMJ function such as disc position, condyle mobility and disc-condyle relationship. While T1 contrast offers a better delineation of structures during rapid jaw movements, T2/T1 contrast was rated superior for characterizing the articular disc. In conclusion, the proposed real-time MRI method may become a robust and efficient tool for the clinical assessment of TMJ function.

  3. Process algebra with timing : real time and discrete time

    NARCIS (Netherlands)

    Baeten, J.C.M.; Middelburg, C.A.; Bergstra, J.A.; Ponse, A.J.; Smolka, S.A.

    2001-01-01

    We present real time and discrete time versions of ACP with absolute timing and relative timing. The starting-point is a new real time version with absolute timing, called ACPsat, featuring urgent actions and a delay operator. The discrete time versions are conservative extensions of the discrete

  4. Process algebra with timing: Real time and discrete time

    NARCIS (Netherlands)

    Baeten, J.C.M.; Middelburg, C.A.

    1999-01-01

    We present real time and discrete time versions of ACP with absolute timing and relative timing. The starting-point is a new real time version with absolute timing, called ACPsat, featuring urgent actions and a delay operator. The discrete time versions are conservative extensions of the discrete

  5. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems: Summary of Research

    Science.gov (United States)

    Manna, Zohar

    1998-01-01

    This research is directed towards the implementation of a comprehensive deductive-algorithmic environment (toolkit) for the development and verification of high assurance reactive systems, especially concurrent, real-time, and hybrid systems. For this, we have designed and implemented the STeP (Stanford Temporal Prover) verification system. Reactive systems have an ongoing interaction with their environment, and their computations are infinite sequences of states. A large number of systems can be seen as reactive systems, including hardware, concurrent programs, network protocols, and embedded systems. Temporal logic provides a convenient language for expressing properties of reactive systems. A temporal verification methodology provides procedures for proving that a given system satisfies a given temporal property. The research covered necessary theoretical foundations as well as implementation and application issues.

  6. A hierarchical Bayesian spatio-temporal model for extreme precipitation events

    KAUST Repository

    Ghosh, Souparno; Mallick, Bani K.

    2011-01-01

    We propose a new approach to model a sequence of spatially distributed time series of extreme values. Unlike common practice, we incorporate spatial dependence directly in the likelihood and allow the temporal component to be captured at the second level of hierarchy. Inferences about the parameters and spatio-temporal predictions are obtained via MCMC technique. The model is fitted to a gridded precipitation data set collected over 99 years across the continental U.S. © 2010 John Wiley & Sons, Ltd..

  7. A hierarchical Bayesian spatio-temporal model for extreme precipitation events

    KAUST Repository

    Ghosh, Souparno

    2011-03-01

    We propose a new approach to model a sequence of spatially distributed time series of extreme values. Unlike common practice, we incorporate spatial dependence directly in the likelihood and allow the temporal component to be captured at the second level of hierarchy. Inferences about the parameters and spatio-temporal predictions are obtained via MCMC technique. The model is fitted to a gridded precipitation data set collected over 99 years across the continental U.S. © 2010 John Wiley & Sons, Ltd..

  8. Estimation of the order of an autoregressive time series: a Bayesian approach

    International Nuclear Information System (INIS)

    Robb, L.J.

    1980-01-01

    Finite-order autoregressive models for time series are often used for prediction and other inferences. Given the order of the model, the parameters of the model can be estimated by the least-squares, maximum-likelihood, or Yule-Walker method. The basic problem is estimating the order of the model. The problem of autoregressive order estimation is placed in a Bayesian framework. This approach illustrates how the Bayesian method brings the numerous aspects of the problem together into a coherent structure. A joint prior probability density is proposed for the order, the partial autocorrelation coefficients, and the variance; and the marginal posterior probability distribution for the order, given the data, is obtained. It is noted that the value with maximum posterior probability is the Bayes estimate of the order with respect to a particular loss function. The asymptotic posterior distribution of the order is also given. In conclusion, Wolfer's sunspot data as well as simulated data corresponding to several autoregressive models are analyzed according to Akaike's method and the Bayesian method. Both methods are observed to perform quite well, although the Bayesian method was clearly superior in most cases
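
    A rough sketch of placing posterior probabilities on candidate AR orders, using a BIC approximation to the marginal likelihood with a uniform prior over orders instead of the paper's exact joint prior; all settings are illustrative.

    ```python
    import numpy as np

    def ar_order_posterior(x, max_order=8):
        """Approximate posterior over AR order p with a uniform prior,
        using BIC as a stand-in for the log marginal likelihood."""
        x = np.asarray(x, dtype=float)
        n_eff = len(x) - max_order                 # common sample size for all fits
        log_ml = []
        for p in range(1, max_order + 1):
            # design matrix of p lagged values, aligned so all orders use n_eff rows
            X = np.column_stack([x[max_order - k: len(x) - k] for k in range(1, p + 1)])
            A = np.column_stack([np.ones(n_eff), X])
            y = x[max_order:]
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            sigma2 = resid @ resid / n_eff
            bic = n_eff * np.log(sigma2) + (p + 2) * np.log(n_eff)
            log_ml.append(-0.5 * bic)
        log_ml = np.asarray(log_ml)
        w = np.exp(log_ml - log_ml.max())
        return w / w.sum()

    # toy AR(2) series; the posterior mass should concentrate near p = 2
    rng = np.random.default_rng(0)
    x = np.zeros(2000)
    for t in range(2, 2000):
        x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
    print(np.round(ar_order_posterior(x), 3))
    ```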

  9. Real-time radiography

    International Nuclear Information System (INIS)

    Bossi, R.H.; Oien, C.T.

    1981-01-01

    Real-time radiography is used for imaging both dynamic events and static objects. Fluorescent screens play an important role in converting radiation to light, which is then observed directly or intensified and detected. The radiographic parameters for real-time radiography are similar to conventional film radiography with special emphasis on statistics and magnification. Direct-viewing fluoroscopy uses the human eye as a detector of fluorescent screen light or the light from an intensifier. Remote-viewing systems replace the human observer with a television camera. The remote-viewing systems have many advantages over the direct-viewing conditions such as safety, image enhancement, and the capability to produce permanent records. This report reviews real-time imaging system parameters and components

  10. Real-time capture of student reasoning while writing

    Directory of Open Access Journals (Sweden)

    Scott V. Franklin

    2014-09-01

    Full Text Available We present a new approach to investigating student reasoning while writing: real-time capture of the dynamics of the writing process. Key-capture or video software is used to record the entire writing episode, including all pauses, deletions, insertions, and revisions. A succinct shorthand, “S notation,” is used to highlight significant moments in the episode that may be indicative of shifts in understanding and can be used in followup interviews for triangulation. The methodology allows one to test the widespread belief that writing is a valuable pedagogical technique, which currently has little directly supportive research. To demonstrate the method, we present a case study of a writing episode. The data reveal an evolution of expression and articulation, discontinuous in both time and space. Distinct shifts in the tone and topic that follow long pauses and revisions are not restricted to the most recently written text. Real-time writing analysis, with its study of the temporal breaks and revision locations, can serve as a complementary tool to more traditional research methods (e.g., speak-aloud interviews into student reasoning during the writing process.

  11. Real-time vision systems

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R.; Hernandez, J.E.; Lu, Shin-yee [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Many industrial and defence applications require an ability to make instantaneous decisions based on sensor input of a time varying process. Such systems are referred to as 'real-time systems' because they process and act on data as it occurs in time. When a vision sensor is used in a real-time system, the processing demands can be quite substantial, with typical data rates of 10-20 million samples per second. A real-time Machine Vision Laboratory (MVL) was established in FY94 to extend our years of experience in developing computer vision algorithms to include the development and implementation of real-time vision systems. The laboratory is equipped with a variety of hardware components, including Datacube image acquisition and processing boards, a Sun workstation, and several different types of CCD cameras, including monochrome and color area cameras and analog and digital line-scan cameras. The equipment is reconfigurable for prototyping different applications. This facility has been used to support several programs at LLNL, including O Division's Peacemaker and Deadeye Projects as well as the CRADA with the U.S. Textile Industry, CAFE (Computer Aided Fabric Inspection). To date, we have successfully demonstrated several real-time applications: bullet tracking, stereo tracking and ranging, and web inspection. This work has been documented in the ongoing development of a real-time software library.

  12. Automatic real-time detection of endoscopic procedures using temporal features.

    Science.gov (United States)

    Stanek, Sean R; Tavanapong, Wallapak; Wong, Johnny; Oh, Jung Hwan; de Groen, Piet C

    2012-11-01

    Endoscopy is used for inspection of the inner surface of organs such as the colon. During endoscopic inspection of the colon, or colonoscopy, a tiny video camera generates a video signal, which is displayed on a monitor for interpretation in real time by physicians. In practice, these images are not typically captured, which may be attributed to the lack of fully automated tools for capturing, analyzing important content, and quickly and easily retrieving that content. This paper presents the description and evaluation results of our novel software that uses new metrics based on image color and motion over time to automatically record all images of an individual endoscopic procedure into a single digitized video file. The software automatically discards video frames captured outside the patient between different endoscopic procedures. We validated our software system on 2464 h of live video (over 265 million frames) from endoscopy units where colonoscopy and upper endoscopy were performed. Our previous classification method achieved a frame-based sensitivity of 100.00%, but only a specificity of 89.22%. Our new method achieved a frame-based sensitivity and specificity of 99.90% and 99.97%, a significant improvement. Our system is robust for day-to-day use in medical practice. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
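
    Purely as an illustration of color- and motion-based frame screening (not the authors' metrics or thresholds), a toy per-frame test might look like the following; in practice the thresholds would be learned from labelled procedure video rather than fixed by hand.

    ```python
    import numpy as np

    def is_inside_patient(frame, prev_frame, red_thresh=0.45, motion_thresh=2.0):
        """Toy per-frame test: frames captured inside the colon tend to be
        dominated by reddish hues and to show continuous motion.
        `frame` and `prev_frame` are HxWx3 RGB arrays with values in [0, 255]."""
        rgb = frame.astype(float)
        red_fraction = rgb[..., 0].sum() / max(rgb.sum(), 1.0)
        motion = np.abs(frame.astype(float) - prev_frame.astype(float)).mean()
        return red_fraction > red_thresh and motion > motion_thresh

    # toy usage on random frames (real use would read consecutive video frames)
    rng = np.random.default_rng(0)
    f_prev = rng.integers(0, 256, size=(480, 640, 3))
    f_curr = rng.integers(0, 256, size=(480, 640, 3))
    print(is_inside_patient(f_curr, f_prev))
    ```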

  13. Multi-level Bayesian analyses for single- and multi-vehicle freeway crashes.

    Science.gov (United States)

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-09-01

    This study presents multi-level analyses for single- and multi-vehicle crashes on a mountainous freeway. Data from a 15-mile mountainous freeway section on I-70 were investigated. Both aggregate and disaggregate models for the two crash conditions were developed. Five years of crash data were used in the aggregate investigation, while the disaggregate models utilized one year of crash data along with real-time traffic and weather data. For the aggregate analyses, safety performance functions were developed for the purpose of revealing the contributing factors for each crash type. Two methodologies, a Bayesian bivariate Poisson-lognormal model and a Bayesian hierarchical Poisson model with correlated random effects, were estimated to simultaneously analyze the two crash conditions with consideration of possible correlations. Except for the factors related to geometric characteristics, two exposure parameters (annual average daily traffic and segment length) were included. Two different sets of significant explanatory and exposure variables were identified for the single-vehicle (SV) and multi-vehicle (MV) crashes. It was found that the Bayesian bivariate Poisson-lognormal model is superior to the Bayesian hierarchical Poisson model, the former with a substantially lower DIC and more significant variables. In addition to the aggregate analyses, microscopic real-time crash risk evaluation models were developed for the two crash conditions. Multi-level Bayesian logistic regression models were estimated with the random parameters accounting for seasonal variations, crash-unit-level diversity and segment-level random effects capturing unobserved heterogeneity caused by the geometric characteristics. The model results indicate that the effects of the selected variables on crash occurrence vary across seasons and crash units; and that geometric characteristic variables contribute to the segment variations: the more unobserved heterogeneity is accounted for, the better

  14. FRB microstructure revealed by the real-time detection of FRB170827

    OpenAIRE

    Farah, W.; Flynn, C.; Bailes, M.; Jameson, A.; Bannister, K. W.; Barr, E. D.; Bateman, T.; Bhandari, S.; Caleb, M.; Campbell-Wilson, D.; Chang, S. -W.; Deller, A.; Green, A. J.; Hunstead, R.; Jankowski, F.

    2018-01-01

    We report a new Fast Radio Burst (FRB) discovered in real-time as part of the UTMOST project at the Molonglo Observatory Synthesis Radio Telescope (MOST). FRB170827 is the first detected with our low-latency ($< 24$ s), machine-learning-based FRB detection system. The FRB discovery was accompanied by the capture of voltage data at the native time and frequency resolution of the observing system, enabling coherent dedispersion and detailed off-line analysis, which have unveiled fine temporal a...

  15. A Bayesian Combined Model for Time-Dependent Turning Movement Proportions Estimation at Intersections

    Directory of Open Access Journals (Sweden)

    Pengpeng Jiao

    2014-01-01

    Full Text Available Time-dependent turning movement flows are very important input data for intelligent transportation systems but cannot be detected directly through current traffic surveillance systems. Existing estimation models have proved to be insufficiently accurate and reliable across all intervals. An improved way to address this problem is to develop a combined model framework that can integrate multiple submodels running simultaneously. This paper first presents a back-propagation neural network model to estimate dynamic turning movements, together with a self-adaptive learning rate approach and the gradient descent with momentum method for solving it. Second, this paper develops an efficient Kalman filtering model and designs a revised sequential Kalman filtering algorithm. Based on the Bayesian method, using both historical data and currently estimated results for error calibration, this paper further integrates the above two submodels into a Bayesian combined model framework and proposes a corresponding algorithm. A field survey was implemented at an intersection in Beijing to collect both time series of link counts and actual time-dependent turning movement flows, including historical and present data. The reported estimation results show that the Bayesian combined model is much more accurate and stable than the other models.
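
    The combination step can be sketched by weighting the two submodel estimates by the inverse of their recent squared errors; the numbers and the precision-weighting rule below are illustrative, not the paper's exact Bayesian calibration.

    ```python
    import numpy as np

    def combine_estimates(est_nn, est_kf, err_hist_nn, err_hist_kf):
        """Fuse two turning-movement estimates with precision weights
        derived from their historical squared errors."""
        var_nn = np.mean(np.square(err_hist_nn)) + 1e-9
        var_kf = np.mean(np.square(err_hist_kf)) + 1e-9
        w_nn = (1.0 / var_nn) / (1.0 / var_nn + 1.0 / var_kf)
        return w_nn * est_nn + (1.0 - w_nn) * est_kf

    # hypothetical turning-movement proportions for one interval (left, through, right)
    nn_estimate = np.array([0.22, 0.63, 0.15])   # neural-network submodel
    kf_estimate = np.array([0.18, 0.70, 0.12])   # Kalman-filter submodel
    nn_errors = np.array([0.04, 0.05, 0.06])     # recent estimation errors of each submodel
    kf_errors = np.array([0.02, 0.03, 0.02])
    print(combine_estimates(nn_estimate, kf_estimate, nn_errors, kf_errors))
    ```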

  16. Gliding and Saccadic Gaze Gesture Recognition in Real Time

    DEFF Research Database (Denmark)

    Rozado, David; San Agustin, Javier; Rodriguez, Francisco

    2012-01-01

    , and their corresponding real-time recognition algorithms, Hierarchical Temporal Memory networks and the Needleman-Wunsch algorithm for sequence alignment. Our results show how a specific combination of gaze gesture modality, namely saccadic gaze gestures, and recognition algorithm, Needleman-Wunsch, allows for reliable...... usage of intentional gaze gestures to interact with a computer with accuracy rates of up to 98% and acceptable completion speed. Furthermore, the gesture recognition engine does not interfere with otherwise standard human-machine gaze interaction generating therefore, very low false positive rates...
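
    A compact Needleman-Wunsch scorer gives the flavour of the recognition step: the observed sequence of gaze regions is aligned against each stored gesture template, and the best-scoring template wins. The token alphabet, templates, and scoring parameters below are hypothetical.

    ```python
    def needleman_wunsch(seq_a, seq_b, match=2, mismatch=-1, gap=-1):
        """Global alignment score between two token sequences (screen regions
        visited by saccades, e.g. 'N', 'E', 'S', 'W', 'C')."""
        n, m = len(seq_a), len(seq_b)
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap
        for j in range(1, m + 1):
            score[0][j] = j * gap
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1][j - 1] + (match if seq_a[i - 1] == seq_b[j - 1] else mismatch)
                score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
        return score[n][m]

    templates = {"scroll_down": "CNSC", "go_back": "CWEC"}     # hypothetical gestures
    observed = "CNSSC"                                          # noisy observed gaze path
    best = max(templates, key=lambda g: needleman_wunsch(observed, templates[g]))
    print(best)   # -> 'scroll_down'
    ```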

  17. Memory controllers for real-time embedded systems predictable and composable real-time systems

    CERN Document Server

    Akesson, Benny

    2012-01-01

      Verification of real-time requirements in systems-on-chip becomes more complex as more applications are integrated. Predictable and composable systems can manage the increasing complexity using formal verification and simulation.  This book explains the concepts of predictability and composability and shows how to apply them to the design and analysis of a memory controller, which is a key component in any real-time system. This book is generally intended for readers interested in Systems-on-Chips with real-time applications.   It is especially well-suited for readers looking to use SDRAM memories in systems with hard or firm real-time requirements. There is a strong focus on real-time concepts, such as predictability and composability, as well as a brief discussion about memory controller architectures for high-performance computing. Readers will learn step-by-step how to go from an unpredictable SDRAM memory, offering highly variable bandwidth and latency, to a predictable and composable shared memory...

  18. Algorithm for real-time detection of signal patterns using phase synchrony: an application to an electrode array

    Science.gov (United States)

    Sadeghi, Saman; MacKay, William A.; van Dam, R. Michael; Thompson, Michael

    2011-02-01

    Real-time analysis of multi-channel spatio-temporal sensor data presents a considerable technical challenge for a number of applications. For example, in brain-computer interfaces, signal patterns originating on a time-dependent basis from an array of electrodes on the scalp (i.e. electroencephalography) must be analyzed in real time to recognize mental states and translate these to commands which control operations in a machine. In this paper we describe a new technique for recognition of spatio-temporal patterns based on performing online discrimination of time-resolved events through the use of correlation of phase dynamics between various channels in a multi-channel system. The algorithm extracts unique sensor signature patterns associated with each event during a training period and ranks importance of sensor pairs in order to distinguish between time-resolved stimuli to which the system may be exposed during real-time operation. We apply the algorithm to electroencephalographic signals obtained from subjects tested in the neurophysiology laboratories at the University of Toronto. The extension of this algorithm for rapid detection of patterns in other sensing applications, including chemical identification via chemical or bio-chemical sensor arrays, is also discussed.
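
    One common way to quantify phase synchrony between two channels is the phase-locking value computed from the analytic signal; the sketch below shows just that ingredient (the full algorithm additionally ranks sensor pairs and discriminates events), with toy signals.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def phase_locking_value(x, y):
        """Phase-locking value between two equally sampled channels:
        1 = perfectly locked phases, 0 = no consistent phase relation."""
        phase_x = np.angle(hilbert(x))
        phase_y = np.angle(hilbert(y))
        return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

    # toy EEG-like signals: a shared 10 Hz rhythm plus independent noise
    t = np.arange(0.0, 2.0, 1.0 / 256.0)          # 2 s sampled at 256 Hz
    rng = np.random.default_rng(0)
    ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
    ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.standard_normal(t.size)
    print(round(phase_locking_value(ch1, ch2), 3))   # high PLV despite the phase offset
    ```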

  19. Real-time tomosynthesis for radiation therapy guidance.

    Science.gov (United States)

    Hsieh, Scott S; Ng, Lydia W

    2017-11-01

    Fluoroscopy has been a tool of choice for monitoring treatments or interventions because of its extremely fast imaging times. However, the contrast obtained in fluoroscopy may be insufficient for certain clinical applications. In stereotactic ablative radiation therapy of the lung, fluoroscopy often lacks sufficient contrast for gating treatment. The purpose of this work is to describe and assess a real-time tomosynthesis design that can produce sufficient contrast for guidance of lung tumor treatment within a small field of view. Previous tomosynthesis designs in radiation oncology have temporal resolution on the order of seconds. The proposed system design uses parallel acquisition of multiple frames by simultaneously illuminating the field of view with multiple sources, enabling a temporal resolution of up to 30 frames per second. For a small field of view, a single flat-panel detector could be used if different sectors of the detector are assigned to specific sources. Simulated images were generated by forward projection of existing clinical datasets. The authors varied the number of tubes and the power of each tube in order to determine the impact on tumor visualization. Visualization of the tumor was much clearer in tomosynthesis than in fluoroscopy. Contrast generally improved with the number of sources used, and a minimum of four sources should be used. The high contrast of the lung allows very low system power, and in most cases, less than 1 mA was needed. More power is required in the lateral direction than the AP direction. The proposed system produces images adequate for real-time guidance of radiation therapy. The additional hardware requirements are modest, and the system is capable of imaging at high frame rates and low dose. Further development, including a prototype system and a dosimetry study, is needed to further evaluate the feasibility of this device for radiation therapy guidance. © 2017 American Association of Physicists in Medicine.

  20. Big Data-Driven Based Real-Time Traffic Flow State Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Hua-pu Lu

    2015-01-01

    Full Text Available With the rapid development of urban informatization, the era of big data is coming. To satisfy the demand for traffic congestion early warning, this paper studies methods of real-time traffic flow state identification and prediction based on big-data-driven theory. Traffic big data exhibits several characteristics, such as temporal correlation, spatial correlation, historical correlation, and multistate behavior. Traffic flow state quantification, the basis of traffic flow state identification, is achieved by a SAGA-FCM (simulated annealing genetic algorithm based fuzzy c-means) traffic clustering model. Considering computational simplicity and predictive accuracy, a bilevel optimization model for regional traffic flow correlation analysis is established to predict traffic flow parameters based on temporal-spatial-historical correlation. A two-stage model for correction coefficient optimization is put forward to simplify the bilevel optimization model. The first-stage model calculates the number of temporal-spatial-historical correlation variables, and the second-stage model calculates the basic model formulation of the regional traffic flow correlation. A case study based on a real-world road network in Beijing, China, is implemented to test the efficiency and applicability of the proposed modeling and computing methods.
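
    The clustering step can be illustrated with a plain fuzzy c-means routine applied to (speed, flow, occupancy) samples; the simulated-annealing genetic algorithm wrapper used in the paper is omitted, and the toy traffic states are invented.

    ```python
    import numpy as np

    def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
        """Basic fuzzy c-means: returns cluster centers and membership matrix U
        (each row sums to 1 and gives a sample's degree of membership per state)."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], n_clusters))
        U /= U.sum(axis=1, keepdims=True)
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / (dist ** (2.0 / (m - 1.0)))
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # toy samples of (speed km/h, flow veh/h, occupancy %) for three traffic states
    rng = np.random.default_rng(1)
    free = rng.normal([90, 800, 8], [5, 100, 2], size=(100, 3))
    sync = rng.normal([55, 1600, 20], [5, 150, 3], size=(100, 3))
    jam = rng.normal([15, 600, 45], [5, 100, 5], size=(100, 3))
    centers, U = fuzzy_c_means(np.vstack([free, sync, jam]))
    print(np.round(centers, 1))
    ```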

  1. Incorporating real-time traffic and weather data to explore road accident likelihood and severity in urban arterials.

    Science.gov (United States)

    Theofilatos, Athanasios

    2017-06-01

    The effective treatment of road accidents and thus the enhancement of road safety is a major concern to societies due to the losses in human lives and the economic and social costs. The investigation of road accident likelihood and severity by utilizing real-time traffic and weather data has recently received significant attention from researchers. However, collected data mainly stem from freeways and expressways. Consequently, the aim of the present paper is to add to the current knowledge by investigating accident likelihood and severity using real-time traffic and weather data collected from urban arterials in Athens, Greece. Random Forests (RF) are first applied for preliminary analysis purposes; more specifically, the aim is to rank candidate variables according to their relative importance and provide a first insight into the potentially significant variables. Then, Bayesian logistic regression as well as finite mixture and mixed-effects logit models are applied to further explore factors associated with accident likelihood and severity, respectively. Regarding accident likelihood, the Bayesian logistic regression showed that variations in traffic significantly influence accident occurrence. On the other hand, accident severity analysis revealed a generally mixed influence of traffic variations on accident severity, although the international literature states that traffic variations increase severity. Lastly, weather parameters were not found to have a direct influence on accident likelihood or severity. The study added to the current knowledge by incorporating real-time traffic and weather data from urban arterials to investigate accident occurrence and accident severity mechanisms. The identification of risk factors can lead to the development of effective traffic management strategies to reduce accident occurrence and the severity of injuries in urban arterials. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
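
    The two-step workflow (random forest variable ranking followed by a regression on the top-ranked predictors) can be sketched with scikit-learn; the predictors, labels, and the use of a plain logistic regression instead of the paper's Bayesian and mixture models are illustrative assumptions.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000
    # hypothetical real-time predictors recorded shortly before each observation window
    X = pd.DataFrame({
        "speed_variation": rng.gamma(2.0, 2.0, n),
        "mean_occupancy": rng.uniform(5, 60, n),
        "rainfall_mm": rng.exponential(0.5, n),
        "hour_of_day": rng.integers(0, 24, n),
    })
    # toy label: accident occurrence driven mainly by speed variation and occupancy
    logit = -4 + 0.4 * X["speed_variation"] + 0.03 * X["mean_occupancy"]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    ranking = sorted(zip(X.columns, rf.feature_importances_), key=lambda t: -t[1])
    print(ranking)                                  # preliminary variable ranking

    top = [name for name, _ in ranking[:2]]
    lr = LogisticRegression(max_iter=1000).fit(X[top], y)
    print(dict(zip(top, lr.coef_[0])))              # refined model on top predictors
    ```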

  2. Forecast Accuracy and Economic Gains from Bayesian Model Averaging using Time Varying Weights

    NARCIS (Netherlands)

    L.F. Hoogerheide (Lennart); R.H. Kleijn (Richard); H.K. van Dijk (Herman); M.J.C.M. Verbeek (Marno)

    2009-01-01

    textabstractSeveral Bayesian model combination schemes, including some novel approaches that simultaneously allow for parameter uncertainty, model uncertainty and robust time varying model weights, are compared in terms of forecast accuracy and economic gains using financial and macroeconomic time

  3. Essays in real-time forecasting

    OpenAIRE

    Liebermann, Joelle

    2012-01-01

    This thesis contains three essays in the field of real-time econometrics, and more particularly forecasting. The issue of using data as available in real-time to forecasters, policymakers or financial markets is an important one which has only recently been taken on board in the empirical literature. Data available and used in real-time are preliminary and differ from ex-post revised data, and given that data revisions may be quite substantial, the use of latest available instead of real-time can s...

  4. Review of real-time on-line decision support system RODOS

    International Nuclear Information System (INIS)

    Rossi, J.

    1997-01-01

    RODOS (Real-time On-line Decision Support system) is a research project which aims at the development of a versatile decision support system for the management of off-site consequence assessments of reactor accidents in real time in Europe and in the western parts of the former Soviet Union. The system employs both local and regional environmental radiation monitoring results and meteorological forecasts, by means of which the software prepares consistent predictions ranging from the release area to long distances and covering all temporal phases of the accident. The data obtained from the environment are processed and, based on mathematical and physical models, presented in an intelligible form describing the prevailing or predicted environmental radiation situation. The software is intended for operative use by radiation safety authorities. Furthermore, the software is suitable for education and training of rescue field personnel. (refs.)

  5. Ovation Prime Real-Time

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ovation Prime Real-Time (OPRT) product is a real-time forecast and nowcast model of auroral power and is an operational implementation of the work by Newell et...

  6. Modeling Temporal Evolution and Multiscale Structure in Networks

    DEFF Research Database (Denmark)

    Herlau, Tue; Mørup, Morten; Schmidt, Mikkel Nørgaard

    2013-01-01

    Many real-world networks exhibit both temporal evolution and multiscale structure. We propose a model for temporally correlated multifurcating hierarchies in complex networks which jointly captures both effects. We use the Gibbs fragmentation tree as a prior over multifurcating trees and a change-point model to account for the temporal evolution of each vertex. We demonstrate that our model is able to infer time-varying multiscale structure in synthetic as well as three real-world time-evolving complex networks. Our modeling of the temporal evolution of hierarchies brings new insights

  7. Bayesian estimation of dose rate effectiveness

    International Nuclear Information System (INIS)

    Arnish, J.J.; Groer, P.G.

    2000-01-01

    A Bayesian statistical method was used to quantify the effectiveness of high dose rate 137 Cs gamma radiation at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice. The Bayesian approach considers both the temporal and dose dependence of radiation carcinogenesis and total mortality. This paper provides the first direct estimation of dose rate effectiveness using Bayesian statistics. This statistical approach provides a quantitative description of the uncertainty of the factor characterising the dose rate in terms of a probability density function. The results show that a fixed dose from 137 Cs gamma radiation delivered at a high dose rate is more effective at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice than the same dose delivered at a low dose rate. (author)

  8. Application of Bayesian Maximum Entropy Filter in parameter calibration of groundwater flow model in PingTung Plain

    Science.gov (United States)

    Cheung, Shao-Yong; Lee, Chieh-Han; Yu, Hwa-Lung

    2017-04-01

    Due to limited hydrogeological observation data and the high level of uncertainty within them, parameter estimation for groundwater models has been an important issue. Many parameter estimation methods exist; for example, the Kalman Filter provides real-time calibration of parameters from measurements at groundwater monitoring wells, and related methods such as the Extended Kalman Filter and the Ensemble Kalman Filter are widely applied in groundwater research. However, the Kalman Filter is limited to linear systems. This study proposes a novel method, Bayesian Maximum Entropy Filtering, which accounts for the uncertainty of the data during parameter estimation. With this approach, parameters can be estimated from hard data (certain) and soft data (uncertain) at the same time. The study uses Python and QGIS with the MODFLOW groundwater model and implements both the Extended Kalman Filter and Bayesian Maximum Entropy Filtering in Python for parameter estimation. Numerical experiments were conducted by combining the Bayesian Maximum Entropy filter with a hypothetical MODFLOW groundwater model architecture, using virtual observation wells to observe the simulated groundwater model periodically. The results show that, by accounting for data uncertainty, the Bayesian Maximum Entropy filter provides reliable real-time parameter estimates.
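    The abstract contrasts Kalman-type filters with the proposed Bayesian Maximum Entropy filter. As a point of reference only, the sketch below shows a minimal linear Kalman filter update of an augmented state (head plus a model parameter) from a noisy well observation; all names, dimensions, and noise levels are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One linear Kalman measurement update: state x, covariance P,
    observation z with observation matrix H and noise covariance R."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Illustrative example: jointly estimate a head value and a log-conductivity
# parameter from a single monitoring-well measurement of the head.
x = np.array([10.0, -4.0])                 # [head (m), log10 K] -- assumed prior mean
P = np.diag([1.0, 0.5])                    # prior covariance (assumed)
H = np.array([[1.0, 0.0]])                 # the well observes head only
R = np.array([[0.05]])                     # observation noise variance (assumed)

x, P = kalman_update(x, P, np.array([10.3]), H, R)
print(x, np.diag(P))
```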

  9. VERSE - Virtual Equivalent Real-time Simulation

    Science.gov (United States)

    Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel

    2005-01-01

    Distributed real-time simulations provide important timing validation and hardware-in-the-loop results for the spacecraft flight software development cycle. Occasionally, the need for higher fidelity modeling and more comprehensive debugging capabilities - combined with a limited amount of computational resources - calls for a non real-time simulation environment that mimics the real-time environment. By creating a non real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint together with use of the same API allows users to easily run the same application in both real-time and virtual time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.

  10. Separation of spatial-temporal patterns ('climatic modes') by combined analysis of really measured and generated numerically vector time series

    Science.gov (United States)

    Feigin, A. M.; Mukhin, D.; Volodin, E. M.; Gavrilov, A.; Loskutov, E. M.

    2013-12-01

    The new method of decomposition of the Earth's climate system into well separated spatial-temporal patterns ('climatic modes') is discussed. The method is based on: (i) generalization of the MSSA (Multichannel Singular Spectral Analysis) [1] for expanding vector (space-distributed) time series in a basis of spatial-temporal empirical orthogonal functions (STEOF), which makes allowance for delayed correlations between the processes recorded at spatially separated points; (ii) expanding both real SST data and numerically generated SST data several times longer than the real record in the STEOF basis; (iii) use of the numerically produced STEOF basis for exclusion of 'too slow' (and thus not represented correctly) processes from the real data. By means of vector time series generated numerically by the INM RAS Coupled Climate Model [2], the method separates from real SST anomaly data [3] two climatic modes possessing noticeably different time scales: 3-5 and 9-11 years. Relations of the separated modes to ENSO and PDO are investigated. Possible applications of the spatial-temporal climatic pattern concept to prognosis of climate system evolution are discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm 3. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/
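    As a rough illustration of step (i), the sketch below builds the delay-embedded (trajectory) matrix of a multichannel series and takes its SVD, whose singular vectors play the role of spatio-temporal empirical orthogonal functions; the embedding window and the toy data are assumptions for illustration, not the authors' configuration.

```python
import numpy as np

def mssa_basis(X, window):
    """X: array of shape (n_time, n_channels). Returns the SVD of the
    delay-embedded trajectory matrix, i.e. an empirical spatio-temporal basis."""
    n_time, n_ch = X.shape
    n_rows = n_time - window + 1
    # Each row stacks 'window' consecutive samples of all channels.
    traj = np.column_stack([X[i:i + n_rows, :] for i in range(window)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    return U, s, Vt   # rows of Vt are spatio-temporal EOFs

# Toy multichannel series: a noisy oscillation observed at 5 "grid points".
t = np.arange(600)
X = (np.outer(np.sin(2 * np.pi * t / 48), np.ones(5))
     + 0.3 * np.random.default_rng(0).standard_normal((600, 5)))
U, s, Vt = mssa_basis(X, window=60)
print("leading singular values:", np.round(s[:4], 2))
```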

  11. Temporal and kinematic variables for real-world falls harvested from lumbar sensors in the elderly population.

    Science.gov (United States)

    Bourke, A K; Klenk, J; Schwickert, L; Aminian, K; Ihlen, E A F; Helbostad, J L; Chiari, L; Becker, C

    2015-01-01

    Automatic fall detection will reduce the consequences of falls in the elderly and promote independent living, ensuring people can confidently live safely at home. Inertial sensor technology can distinguish falls from normal activities. However, fall data recorded from elderly people in real life remain scarce. The FARSEEING project has compiled a database of real-life falls from elderly people, to gain new knowledge about fall events. We have extracted temporal and kinematic parameters to further improve the development of fall detection algorithms. A total of 100 real-world falls were analysed. Subjects with a known fall history were recruited, inertial sensors were attached to L5, and a fall report, following a fall, was used to extract the fall signal. This data-set was examined, and variables were extracted that include upper and lower impact peak values, posture angle change during the fall and time of occurrence. These extracted parameters can be used to inform the design of fall-detection algorithms for real-world falls detection in the elderly.
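    A minimal sketch of the kind of features described (impact peak and posture-angle change) extracted from a tri-axial acceleration trace; the sampling rate, axis conventions, and synthetic signal below are illustrative assumptions, not the FARSEEING processing chain.

```python
import numpy as np

def fall_features(acc, fs):
    """acc: (n, 3) acceleration in g at the lower back; fs: sampling rate in Hz.
    Returns the peak acceleration magnitude, its time, and a crude estimate of
    the trunk-angle change between the first and last second of the window."""
    mag = np.linalg.norm(acc, axis=1)
    i_peak = int(np.argmax(mag))

    def tilt(segment):
        # Angle between the assumed vertical axis (axis 0) and gravity,
        # averaged over a quasi-static segment.
        g = segment.mean(axis=0)
        return np.degrees(np.arccos(np.clip(g[0] / np.linalg.norm(g), -1, 1)))

    angle_change = tilt(acc[-fs:]) - tilt(acc[:fs])
    return {"peak_g": mag[i_peak], "peak_time_s": i_peak / fs,
            "posture_change_deg": angle_change}

# Synthetic 10 s window at 100 Hz: standing, an impact spike, then lying.
fs = 100
acc = np.tile([1.0, 0.0, 0.0], (10 * fs, 1))      # upright: gravity on axis 0
acc[5 * fs] = [3.5, 1.0, 0.5]                      # impact peak (assumed)
acc[5 * fs + 1:] = [0.1, 0.95, 0.0]                # lying afterwards
print(fall_features(acc, fs))
```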

  12. Real-Time View Correction for Mobile Devices.

    Science.gov (United States)

    Schops, Thomas; Oswald, Martin R; Speciale, Pablo; Yang, Shuoran; Pollefeys, Marc

    2017-11-01

    We present a real-time method for rendering novel virtual camera views from given RGB-D (color and depth) data of a different viewpoint. Missing color and depth information due to incomplete input or disocclusions is efficiently inpainted in a temporally consistent way. The inpainting takes the location of strong image gradients into account as likely depth discontinuities. We present our method in the context of a view correction system for mobile devices, and discuss how to obtain a screen-camera calibration and options for acquiring depth input. Our method has use cases in both augmented and virtual reality applications. We demonstrate the speed of our system and the visual quality of its results in multiple experiments in the paper as well as in the supplementary video.

  13. Comparative Study of Inference Methods for Bayesian Nonnegative Matrix Factorisation

    DEFF Research Database (Denmark)

    Brouwer, Thomas; Frellsen, Jes; Liò, Pietro

    2017-01-01

    In this paper, we study the trade-offs of different inference approaches for Bayesian matrix factorisation methods, which are commonly used for predicting missing values, and for finding patterns in the data. In particular, we consider Bayesian nonnegative variants of matrix factorisation and tri-factorisation, and compare non-probabilistic inference, Gibbs sampling, variational Bayesian inference, and a maximum-a-posteriori approach. The variational approach is new for the Bayesian nonnegative models. We compare their convergence, and robustness to noise and sparsity of the data, on both synthetic and real...
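    For orientation, the sketch below implements the simplest of the compared approaches, a non-probabilistic nonnegative matrix factorisation with Lee-Seung multiplicative updates; the Bayesian (Gibbs, variational, MAP) variants studied in the paper add priors and posterior inference on top of this model. Matrix sizes, rank, and iteration counts are assumptions.

```python
import numpy as np

def nmf(R, k, n_iter=200, eps=1e-9):
    """Factorise a nonnegative matrix R (n x m) as U @ V.T with U (n x k), V (m x k),
    using multiplicative updates for the squared-error objective."""
    n, m = R.shape
    rng = np.random.default_rng(0)
    U = rng.random((n, k)) + eps
    V = rng.random((m, k)) + eps
    for _ in range(n_iter):
        U *= (R @ V) / (U @ (V.T @ V) + eps)
        V *= (R.T @ U) / (V @ (U.T @ U) + eps)
    return U, V

R = np.abs(np.random.default_rng(1).random((20, 15)))   # toy nonnegative data
U, V = nmf(R, k=4)
print("reconstruction error:", np.linalg.norm(R - U @ V.T))
```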

  14. Bayesian Inference on the Memory Parameter for Gamma-Modulated Regression Models

    Directory of Open Access Journals (Sweden)

    Plinio Andrade

    2015-09-01

    Full Text Available In this work, we propose a Bayesian methodology to make inferences for the memory parameter and other characteristics under non-standard assumptions for a class of stochastic processes. This class generalizes the Gamma-modulated process, with trajectories that exhibit long memory behavior, as well as decreasing variability as time increases. Different values of the memory parameter influence the speed of this decrease, making this heteroscedastic model very flexible. Its properties are used to implement an approximate Bayesian computation and MCMC scheme to obtain posterior estimates. We test and validate our method through simulations and real data from the big earthquake that occurred in 2010 in Chile.
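    As a generic illustration of the approximate Bayesian computation step mentioned in the abstract, the sketch below shows rejection ABC for a single parameter using a summary statistic; the simulator, prior, summary statistic, and tolerance are placeholders, not the Gamma-modulated model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(theta, n=200):
    """Placeholder simulator: data whose variability decays at rate theta."""
    t = np.arange(1, n + 1)
    return rng.normal(0.0, t ** (-theta / 2.0))

def abc_rejection(observed, prior_sampler, n_draws=5000, tol=0.05):
    """Keep prior draws whose simulated summary statistic is close to the observed one."""
    s_obs = np.std(observed[-50:])            # summary: late-time variability
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        s_sim = np.std(simulate(theta)[-50:])
        if abs(s_sim - s_obs) < tol:
            accepted.append(theta)
    return np.array(accepted)

observed = simulate(0.8)                       # pretend this is real data
post = abc_rejection(observed, lambda: rng.uniform(0.0, 2.0))
print("accepted:", len(post),
      "posterior mean (approx):", post.mean() if len(post) else float("nan"))
```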

  15. ISTTOK real-time architecture

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Ivo S., E-mail: ivoc@ipfn.ist.utl.pt; Duarte, Paulo; Fernandes, Horácio; Valcárcel, Daniel F.; Carvalho, Pedro J.; Silva, Carlos; Duarte, André S.; Neto, André; Sousa, Jorge; Batista, António J.N.; Hekkert, Tiago; Carvalho, Bernardo B.

    2014-03-15

    Highlights: • All real-time diagnostics and actuators were integrated in the same control platform. • A 100 μs control cycle was achieved under the MARTe framework. • Time-windows based control with several event-driven control strategies implemented. • AC discharges with exception handling on iron core flux saturation. • An HTML discharge configuration was developed for configuring the MARTe system. - Abstract: The ISTTOK tokamak was upgraded with a plasma control system based on the Advanced Telecommunications Computing Architecture (ATCA) standard. This control system was designed to improve the discharge stability and to extend the operational space to the alternate plasma current (AC) discharges as part of the ISTTOK scientific program. In order to accomplish these objectives all ISTTOK diagnostics and actuators relevant for real-time operation were integrated in the control system. The control system was programmed in C++ over the Multi-threaded Application Real-Time executor (MARTe) which provides, among other features, a real-time scheduler, an interrupt handler, an intercommunications interface between code blocks and a clearly bounded interface with the external devices. As a complement to the MARTe framework, the BaseLib2 library provides the foundations for the data, code introspection and also a Hypertext Transfer Protocol (HTTP) server service. Taking advantage of the modular nature of MARTe, the algorithms of each diagnostic data processing, discharge timing, context switch, control and actuators output reference generation, run on well-defined blocks of code named Generic Application Module (GAM). This approach allows reusability of the code, simplified simulation, replacement or editing without changing the remaining GAMs. The ISTTOK control system GAMs run sequentially each 100 μs cycle on an Intel® Q8200 4-core processor running at 2.33 GHz located in the ATCA crate. Two boards (inside the ATCA crate) with 32 analog

  16. ISTTOK real-time architecture

    International Nuclear Information System (INIS)

    Carvalho, Ivo S.; Duarte, Paulo; Fernandes, Horácio; Valcárcel, Daniel F.; Carvalho, Pedro J.; Silva, Carlos; Duarte, André S.; Neto, André; Sousa, Jorge; Batista, António J.N.; Hekkert, Tiago; Carvalho, Bernardo B.

    2014-01-01

    Highlights: • All real-time diagnostics and actuators were integrated in the same control platform. • A 100 μs control cycle was achieved under the MARTe framework. • Time-windows based control with several event-driven control strategies implemented. • AC discharges with exception handling on iron core flux saturation. • An HTML discharge configuration was developed for configuring the MARTe system. - Abstract: The ISTTOK tokamak was upgraded with a plasma control system based on the Advanced Telecommunications Computing Architecture (ATCA) standard. This control system was designed to improve the discharge stability and to extend the operational space to the alternate plasma current (AC) discharges as part of the ISTTOK scientific program. In order to accomplish these objectives all ISTTOK diagnostics and actuators relevant for real-time operation were integrated in the control system. The control system was programmed in C++ over the Multi-threaded Application Real-Time executor (MARTe) which provides, among other features, a real-time scheduler, an interrupt handler, an intercommunications interface between code blocks and a clearly bounded interface with the external devices. As a complement to the MARTe framework, the BaseLib2 library provides the foundations for the data, code introspection and also a Hypertext Transfer Protocol (HTTP) server service. Taking advantage of the modular nature of MARTe, the algorithms of each diagnostic data processing, discharge timing, context switch, control and actuators output reference generation, run on well-defined blocks of code named Generic Application Module (GAM). This approach allows reusability of the code, simplified simulation, replacement or editing without changing the remaining GAMs. The ISTTOK control system GAMs run sequentially each 100 μs cycle on an Intel ® Q8200 4-core processor running at 2.33 GHz located in the ATCA crate. Two boards (inside the ATCA crate) with 32 analog

  17. Design of Mixed-Criticality Applications on Distributed Real-Time Systems

    DEFF Research Database (Denmark)

    Tamas-Selicean, Domitian

    the concept of virtual links, and temporal separation, enforced through schedule tables for TT messages and bandwidth allocation for RC messages. The objective of this thesis is to develop methods and tools for distributed mixed-criticality real-time systems. At the processor level, we are interested... A mixed-criticality system implements applications of different safety-criticality levels onto the same platform. In such cases, the certification standards require that applications of different criticality levels are protected so they cannot influence each other. Otherwise, all tasks have...

  18. Hierarchical Bayesian Spatio Temporal Model Comparison on the Earth Trapped Particle Forecast

    International Nuclear Information System (INIS)

    Suparta, Wayan; Gusrizal

    2014-01-01

    We compared two hierarchical Bayesian spatio-temporal (HBST) models, a Gaussian process (GP) and an autoregressive (AR) model, for forecasting the Earth's trapped particles. The two models were employed on the South Atlantic Anomaly (SAA) region. Electrons of >30 keV (mep0e1) from National Oceanic and Atmospheric Administration (NOAA) 15-18 satellite data were chosen as the modeled particles. We used two weeks of data to perform the model fitting on a 5°x5° grid of longitude and latitude, and 31 August 2007 was set as the date of forecast. Three statistical validations were performed on the data, i.e. the root mean square error (RMSE), mean absolute percentage error (MAPE) and bias (BIAS). The statistical analysis showed that the GP model performed better than the AR, with averages of RMSE = 0.38 and 0.63, MAPE = 11.98 and 17.30, and BIAS = 0.32 and 0.24, for GP and AR, respectively. Visual validation of both models against the NOAA maps also confirmed the superiority of the GP over the AR. The variance of the log flux minimum was 0.09 and 1.09, and of the log flux maximum 1.15 and 1.35, for GP and AR, respectively.
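    The three validation statistics quoted (RMSE, MAPE, BIAS) are straightforward to reproduce; a minimal sketch, with definitions that follow common usage and may therefore differ in detail from the authors' exact conventions:

```python
import numpy as np

def rmse(obs, pred):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def mape(obs, pred):
    return float(np.mean(np.abs((pred - obs) / obs)) * 100.0)

def bias(obs, pred):
    return float(np.mean(pred - obs))

obs = np.array([3.2, 3.5, 4.1, 3.9])       # e.g. observed log flux (illustrative values)
pred = np.array([3.0, 3.6, 4.3, 3.7])       # model forecast (illustrative values)
print(rmse(obs, pred), mape(obs, pred), bias(obs, pred))
```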

  19. Real-time monitoring of corks' water absorption using laser speckle temporal correlation

    Science.gov (United States)

    Nassif, Rana; Abou Nader, Christelle; Pellen, Fabrice; Le Jeune, Bernard; Le Brun, Guy; Abboud, Marie

    2015-08-01

    Physical and mechanical properties of cork allow it to solve many types of problems and make it suitable for a wide range of applications. Our objective consists in studying cork's water absorption by analyzing the dynamic speckle field using the temporal correlation method. Experimental results show that the medium was inert at first, with an absence of activity, and that the longer the cork cap was immersed in water, the more significant the activity became. This temporal parameter revealed the sensitivity of the biospeckle method for monitoring the amount of water absorbed by cork caps.

  20. Online variational Bayesian filtering-based mobile target tracking in wireless sensor networks.

    Science.gov (United States)

    Zhou, Bingpeng; Chen, Qingchun; Li, Tiffany Jing; Xiao, Pei

    2014-11-11

    The received signal strength (RSS)-based online tracking for a mobile node in wireless sensor networks (WSNs) is investigated in this paper. Firstly, a multi-layer dynamic Bayesian network (MDBN) is introduced to characterize the target mobility with either directional or undirected movement. In particular, it is proposed to employ the Wishart distribution to approximate the time-varying RSS measurement precision's randomness due to the target movement. It is shown that the proposed MDBN offers a more general analysis model via incorporating the underlying statistical information of both the target movement and observations, which can be utilized to improve the online tracking capability by exploiting the Bayesian statistics. Secondly, based on the MDBN model, a mean-field variational Bayesian filtering (VBF) algorithm is developed to realize the online tracking of a mobile target in the presence of nonlinear observations and time-varying RSS precision, wherein the traditional Bayesian filtering scheme cannot be directly employed. Thirdly, a joint optimization between the real-time velocity and its prior expectation is proposed to enable online velocity tracking in the proposed online tracking scheme. Finally, the associated Bayesian Cramer-Rao Lower Bound (BCRLB) analysis and numerical simulations are conducted. Our analysis unveils that, by exploiting the potential state information via the general MDBN model, the proposed VBF algorithm provides a promising solution to the online tracking of a mobile node in WSNs. In addition, it is shown that the final tracking accuracy linearly scales with its expectation when the RSS measurement precision is time-varying.

  1. Inventory model using bayesian dynamic linear model for demand forecasting

    Directory of Open Access Journals (Sweden)

    Marisol Valencia-Cárdenas

    2014-12-01

    Full Text Available An important factor in the manufacturing process is the inventory management of finished products. Industry is constantly looking for better alternatives to establish an adequate plan of production and stored quantities, with optimal cost, obtaining quantities over a time horizon that permits defining, in advance, the resources and logistics needed to distribute products on time. A total absence of historical data, required by many statistical forecasting models, demands the search for other kinds of accurate techniques. This work presents an alternative that not only permits forecasting in an adjusted way, but also provides optimal quantities to produce and store at an optimal cost, using Bayesian statistics. The proposal is illustrated with real data. Keywords: Bayesian statistics, optimization, inventory model, Bayesian dynamic linear model.
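    A minimal sketch of the kind of Bayesian dynamic linear model the article refers to: a local-level DLM whose Kalman recursions give a one-step-ahead demand forecast with its variance. The noise variances and the demand series below are illustrative assumptions, not the authors' data or model specification.

```python
import numpy as np

def local_level_forecast(y, obs_var, level_var, m0=0.0, c0=1e6):
    """Local-level dynamic linear model y_t = mu_t + v_t, mu_t = mu_{t-1} + w_t.
    Returns the one-step-ahead forecast mean and variance after filtering y."""
    m, c = m0, c0
    for obs in y:
        r = c + level_var              # prior variance of the level at time t
        q = r + obs_var                # one-step forecast variance
        k = r / q                      # adaptive gain
        m = m + k * (obs - m)          # filtered level mean
        c = (1.0 - k) * r              # filtered level variance
    return m, c + level_var + obs_var  # forecast for the next period

demand = [120, 132, 128, 141, 150, 149, 158]          # monthly demand (illustrative)
mean, var = local_level_forecast(np.array(demand, float), obs_var=25.0, level_var=4.0)
print(f"next-period forecast: {mean:.1f} +/- {np.sqrt(var):.1f}")
```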

  2. Tablet disintegration studied by high-resolution real-time magnetic resonance imaging.

    OpenAIRE

    Quodbach, J.; Moussavi, A.; Tammer, R.; Frahm, J.; Kleinebudde, P.

    2014-01-01

    The present work employs recent advances in high-resolution real-time magnetic resonance imaging (MRI) to investigate the disintegration process of tablets containing disintegrants. A temporal resolution of 75 ms and a spatial resolution of 80 x 80 µm with a section thickness of only 600 µm were achieved. The histograms of MRI videos were quantitatively analyzed with MATLAB. The mechanisms of action of six commercially available disintegrants, the influence of relative tablet density, and the i...

  3. MO-FG-202-08: Real-Time Monte Carlo-Based Treatment Dose Reconstruction and Monitoring for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Shi, F; Gu, X; Tan, J; Hassan-Rezaeian, N; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States)

    2016-06-15

    Purpose: This proof-of-concept study is to develop a real-time Monte Carlo (MC) based treatment-dose reconstruction and monitoring system for radiotherapy, especially for treatments with complicated delivery, to catch treatment delivery errors at the earliest possible opportunity and interrupt the treatment only when an unacceptable dosimetric deviation from our expectation occurs. Methods: First an offline scheme is launched to pre-calculate the expected dose from the treatment plan, used as ground truth for real-time monitoring later. Then an online scheme with three concurrent threads is launched while the treatment is being delivered, to reconstruct and monitor the patient dose in a temporally resolved fashion in real-time. Thread T1 acquires machine status every 20 ms to calculate and accumulate the fluence map (FM). Once our accumulation threshold is reached, T1 transfers the FM to T2 for dose reconstruction and starts to accumulate a new FM. A GPU-based MC dose calculation is performed on T2 when the MC dose engine is ready and a new FM is available. The reconstructed instantaneous dose is directed to T3 for dose accumulation and real-time visualization. Multiple dose metrics (e.g. maximum and mean dose for targets and organs) are calculated from the current accumulated dose and compared with the pre-calculated expected values. Once the discrepancies go beyond our tolerance, an error message will be sent to interrupt the treatment delivery. Results: A VMAT Head-and-neck patient case was used to test the performance of our system. Real-time machine status acquisition was simulated here. The differences between the actual dose metrics and the expected ones were 0.06%–0.36%, indicating an accurate delivery. A ~10 Hz frequency of dose reconstruction and monitoring was achieved, with 287.94 s online computation time compared to 287.84 s treatment delivery time. Conclusion: Our study has demonstrated the feasibility of computing a dose distribution in a temporally resolved fashion

  4. Predicting Near Real-Time Inundation Occurrence from Complimentary Satellite Microwave Brightness Temperature Observations

    Science.gov (United States)

    Fisher, C. K.; Pan, M.; Wood, E. F.

    2017-12-01

    Throughout the world, there is an increasing need for new methods and data that can aid decision makers, emergency responders and scientists in the monitoring of flood events as they happen. In many regions, it is possible to examine the extent of historical and real-time inundation occurrence from visible and infrared imagery provided by sensors such as MODIS or the Landsat TM; however, this is not possible in regions that are densely vegetated or are under persistent cloud cover. In addition, there is often a temporal mismatch between the sampling of a particular sensor and a given flood event, leading to limited observations in near real-time. As a result, there is a need for alternative methods that take full advantage of complementary remotely sensed data sources, such as available microwave brightness temperature observations (e.g., SMAP, SMOS, AMSR2, AMSR-E, and GMI), to aid in the estimation of global flooding. The objective of this work was to develop a high-resolution mapping of inundated areas derived from multiple satellite microwave sensor observations with a daily temporal resolution. This system consists of first retrieving water fractions from complementary microwave sensors (AMSR-2 and SMAP) which may spatially and temporally overlap in the region of interest. Using additional information in a Random Forest classifier, including high resolution topography and multiple datasets of inundated area (both historical and empirical), the resulting retrievals are spatially downscaled to derive estimates of the extent of inundation at a scale relevant to management and flood response activities (~90 m or better) instead of the relatively coarse resolution water fractions, which are limited by the microwave sensor footprints (~5-50 km). Here we present the training and validation of this method for the 2015 floods that occurred in Houston, Texas. Comparing the predicted inundation against historical occurrence maps derived from the Landsat TM record and MODIS
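    A rough sketch of the downscaling step described: a Random Forest classifier that maps coarse water-fraction retrievals plus high-resolution predictors (topography, historical inundation) to a fine-scale inundation probability. The feature names, array shapes, synthetic labels, and scikit-learn usage below are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels = 5000

# Illustrative per-pixel predictors at the fine (~90 m) scale.
X = np.column_stack([
    rng.random(n_pixels),          # coarse microwave water fraction of the parent cell
    rng.normal(50, 20, n_pixels),  # elevation above nearest drainage (m)
    rng.random(n_pixels),          # historical inundation frequency
])
# Synthetic "inundated" label loosely tied to the predictors.
y = (0.8 * X[:, 0] - 0.01 * X[:, 1] + 0.5 * X[:, 2]
     + 0.3 * rng.standard_normal(n_pixels)) > 0.2

clf = RandomForestClassifier(n_estimators=200, min_samples_leaf=5, random_state=0)
clf.fit(X[:4000], y[:4000])                            # train on one event
prob = clf.predict_proba(X[4000:])[:, 1]               # fine-scale inundation probability
print("mean predicted inundation probability:", prob.mean())
```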

  5. Co-simulation for real time safety verification of nuclear power plants

    International Nuclear Information System (INIS)

    Boafo, E.K.; Zhang, L.; Nasimi, E.; Gabbar, H.A.

    2015-01-01

    for fault detection and tuning of the FSN, as well as fault diagnosis to understand the closest state of the fault scenario. An intelligent algorithm for Bayesian Belief Networks (BBN) is developed to estimate probabilities associated with the dynamic FSN using prior and posterior probabilities. This will dynamically tune the FSN with probabilities and real-time and simulation data. Probabilistic risks are estimated for each propagation scenario along with the reliabilities of associated IPLs. This will accurately verify safety for all propagation scenarios during plant operation and maintenance. In order to fine tune propagation scenarios within the FSN, rules are synthesized using fuzzy logic based on real-time and simulation data. (author)

  6. Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application

    Science.gov (United States)

    Chen, Jinduan; Boccelli, Dominic L.

    2018-02-01

    Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
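    A minimal sketch of the double-seasonal autoregressive idea applied to hourly demands: regress the (log) demand on lags at one hour, one day (24 h) and one week (168 h), then forecast one step ahead. The lag choice and the synthetic series below are assumptions; the paper's model also includes moving-average terms, prediction intervals, and adaptive parameter updates, which this sketch omits.

```python
import numpy as np

def fit_double_seasonal_ar(y, lags=(1, 24, 168)):
    """Least-squares fit of y_t on selected lags; returns coefficients (incl. intercept)."""
    max_lag = max(lags)
    Y = y[max_lag:]
    X = np.column_stack([np.ones_like(Y)] + [y[max_lag - L:len(y) - L] for L in lags])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta, lags

def forecast_one_step(y, beta, lags):
    x = np.concatenate(([1.0], [y[-L] for L in lags]))
    return float(x @ beta)

# Synthetic hourly demand with daily and weekly cycles (illustrative).
t = np.arange(24 * 7 * 8)
y = (100 + 20 * np.sin(2 * np.pi * t / 24) + 10 * np.sin(2 * np.pi * t / 168)
     + np.random.default_rng(0).normal(0, 2, t.size))

beta, lags = fit_double_seasonal_ar(np.log(y))
print("next-hour forecast:", np.exp(forecast_one_step(np.log(y), beta, lags)))
```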

  7. Bayesian Estimation and Inference using Stochastic Hardware

    Directory of Open Access Journals (Sweden)

    Chetan Singh Thakur

    2016-03-01

    Full Text Available In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.

  8. Bayesian Estimation and Inference Using Stochastic Electronics.

    Science.gov (United States)

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André

    2016-01-01

    In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.
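    The Bayesian recursive equation the BEAST tracker solves online has a compact software analogue: a discrete HMM forward filter that propagates a belief over target positions through a transition model and reweights it with the sensor likelihood. The grid size, transition kernel, and sensor noise below are assumptions for illustration, not the hardware implementation described in the paper.

```python
import numpy as np

n_pos = 32                                           # 1-D position grid (assumed)
positions = np.arange(n_pos)

# Transition model: the target mostly stays, sometimes moves one cell left/right.
T = np.zeros((n_pos, n_pos))
for i in range(n_pos):
    for j, p in ((i - 1, 0.15), (i, 0.70), (i + 1, 0.15)):
        if 0 <= j < n_pos:
            T[i, j] += p
T /= T.sum(axis=1, keepdims=True)

def likelihood(z, sigma=2.0):
    """Gaussian sensor model: probability of observation z given each true position."""
    return np.exp(-0.5 * ((z - positions) / sigma) ** 2)

belief = np.full(n_pos, 1.0 / n_pos)                 # uniform prior over positions
rng = np.random.default_rng(3)
true_pos = 10
for step in range(20):
    true_pos = int(np.clip(true_pos + rng.choice([-1, 0, 1], p=[0.15, 0.7, 0.15]),
                           0, n_pos - 1))
    z = true_pos + rng.normal(0, 2.0)                # noisy observation
    belief = belief @ T                              # predict
    belief *= likelihood(z)                          # update
    belief /= belief.sum()
print("true:", true_pos, "estimate:", int(np.argmax(belief)))
```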

  9. Determination of Uncalibrated Phase Delays for Real-Time PPP

    Science.gov (United States)

    Hinterberger, Fabian; Weber, Robert; Huber, Katrin; Lesjak, Roman

    2014-05-01

    Today PPP is a well-known technique of GNSS based positioning used for a wide range of post-processing applications. Using observations of a single GNSS receiver and applying precise orbit and clock information derived from global GNSS networks, highly precise positions can be obtained. The atmospheric delays are usually mitigated by linear combination (ionosphere) and parameter estimation (troposphere). Within the last years the demand for real-time PPP has also increased. In 2012, the IGS real-time working group started a pilot project to broadcast real-time precise orbit and clock correction streams. Nevertheless, real-time PPP is in its starting phase and currently only a few applications make use of the technique although SSR messages are already implemented in RTCM3.1. The problems of still limited accuracy compared to Network-RTK as well as long convergence times might be solved by almost instantaneous integer ambiguity resolution at the zero-difference level, which is a major topic of current scientific investigations. Therefore a national consortium has carried out over the past 2 years the research project PPP-Serve (funded by the Austrian Research Promotion Agency - FFG), which aimed at the development of appropriate algorithms for real-time PPP with special emphasis on the ambiguity resolution of zero-difference observations. We have established a module which calculates, based on GPS reference-station data streams from a dense network (obtained from IGS via BKG), so-called wide-lane and narrow-lane satellite-specific calibration phase delays. While the wide-lane phase delays are almost stable over longer periods, the estimation of narrow-lane phase delays has to be re-established every 24 hours. These phase delays are submitted via a real-time module to the rover, where they are used for point positioning via a PPP model. This presentation deals with the process and obstacles of calculating the wide-lane and narrow-lane phase delays (based on SD observations between
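    The wide-lane part of such calibrations is commonly built on the Melbourne-Wübbena combination of dual-frequency phase and code observations; a small sketch of that combination (using GPS L1/L2 frequencies) is shown here as general background rather than as the PPP-Serve implementation, and the observation values are made up.

```python
C = 299792458.0                    # speed of light (m/s)
F1, F2 = 1575.42e6, 1227.60e6      # GPS L1/L2 frequencies (Hz)

def melbourne_wubbena(L1, L2, P1, P2):
    """L1, L2: carrier-phase observations in metres; P1, P2: code observations in metres.
    Returns the MW combination in metres and in wide-lane cycles."""
    wl_phase = (F1 * L1 - F2 * L2) / (F1 - F2)      # wide-lane phase combination
    nl_code = (F1 * P1 + F2 * P2) / (F1 + F2)       # narrow-lane code combination
    mw = wl_phase - nl_code
    lam_wl = C / (F1 - F2)                          # wide-lane wavelength (~0.86 m)
    return mw, mw / lam_wl                          # cycles = float WL ambiguity + biases

# Illustrative (made-up) observations in metres:
mw_m, mw_cyc = melbourne_wubbena(L1=22000000.123, L2=22000000.456,
                                 P1=22000001.020, P2=22000001.080)
print(round(mw_m, 3), round(mw_cyc, 3))
```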

  10. Improving near-real time deforestation monitoring in tropical dry forests by combining dense Sentinel-1 time series with Landsat and ALOS-2 PALSAR-2

    NARCIS (Netherlands)

    Reiche, Johannes; Hamunyela, Eliakim; Verbesselt, Jan; Hoekman, Dirk; Herold, Martin

    2018-01-01

    Combining observations from multiple optical and synthetic aperture radar (SAR) satellites can provide temporally dense and regular information at medium resolution scale, independently of weather, season, and location. This has the potential to improve near real-time deforestation monitoring in dry

  11. Real-time three-dimensional temperature mapping in photothermal therapy with optoacoustic tomography

    Science.gov (United States)

    Oyaga Landa, Francisco Javier; Deán-Ben, Xosé Luís.; Sroka, Ronald; Razansky, Daniel

    2017-07-01

    Ablation and photothermal therapy are widely employed medical protocols where the selective destruction of tissue is a necessity, as in the removal of cancerous tissue or the treatment of vascular and brain abnormalities. Tissue denaturation takes place when the temperature reaches a threshold value, while the time of exposure determines the lesion size. Therefore, the spatio-temporal distribution of temperature plays a crucial role in the outcome of these clinical interventions. We demonstrate fast volumetric temperature mapping with optoacoustic tomography based on real-time optoacoustic readings from the treated region. The performance of the method was investigated in tissue-mimicking phantom experiments. The new ability to non-invasively measure temperature volumetrically in an entire treated region with high spatial and temporal resolutions holds potential for improving the safety and efficacy of thermal ablation and for advancing the general applicability of laser-based therapy.

  12. Bayesian Latent Class Analysis Tutorial.

    Science.gov (United States)

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

    This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' Theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied to a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.

  13. BAYESIAN MAGNETOHYDRODYNAMIC SEISMOLOGY OF CORONAL LOOPS

    International Nuclear Information System (INIS)

    Arregui, I.; Asensio Ramos, A.

    2011-01-01

    We perform a Bayesian parameter inference in the context of resonantly damped transverse coronal loop oscillations. The forward problem is solved in terms of parametric results for kink waves in one-dimensional flux tubes in the thin tube and thin boundary approximations. For the inverse problem, we adopt a Bayesian approach to infer the most probable values of the relevant parameters, for given observed periods and damping times, and to extract their confidence levels. The posterior probability distribution functions are obtained by means of Markov Chain Monte Carlo simulations, incorporating observed uncertainties in a consistent manner. We find well-localized solutions in the posterior probability distribution functions for two of the three parameters of interest, namely the Alfven travel time and the transverse inhomogeneity length scale. The obtained estimates for the Alfven travel time are consistent with previous inversion results, but the method enables us to additionally constrain the transverse inhomogeneity length scale and to estimate real error bars for each parameter. When observational estimates for the density contrast are used, the method enables us to fully constrain the three parameters of interest. These results can serve to improve our current estimates of unknown physical parameters in coronal loops and to test the assumed theoretical model.
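    The posterior sampling described (MCMC over the Alfven travel time and the inhomogeneity length scale, given an observed period and damping time) can be illustrated with a generic random-walk Metropolis sampler. The forward model, priors, observed values, and uncertainties below are placeholders for illustration, not the thin tube/thin boundary expressions used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

obs = np.array([300.0, 900.0])        # observed period and damping time in s (illustrative)
sigma_obs = np.array([15.0, 90.0])    # observational uncertainties (illustrative)

def forward(theta):
    """Placeholder forward model mapping (tau_A, l/R) to (period, damping time)."""
    tau_a, l_over_r = theta
    return np.array([2.0 * tau_a,
                     2.0 * tau_a * (2.0 / np.pi) / max(l_over_r, 1e-6)])

def log_post(theta):
    tau_a, l_over_r = theta
    if not (10.0 < tau_a < 1000.0 and 0.01 < l_over_r < 2.0):   # flat priors (assumed)
        return -np.inf
    resid = (forward(theta) - obs) / sigma_obs
    return -0.5 * np.sum(resid ** 2)

theta = np.array([150.0, 0.5])
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [5.0, 0.02])      # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])                  # discard burn-in
print("posterior means:", samples.mean(axis=0))
```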

  14. Bayesian prediction and adaptive sampling algorithms for mobile sensor networks online environmental field reconstruction in space and time

    CERN Document Server

    Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata

    2016-01-01

    This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...
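    As a compact illustration of the Gaussian process regression such prediction schemes build on, the sketch below computes the GP predictive mean and variance for a scalar field from noisy point observations with a squared-exponential kernel; the hyperparameters and the toy field are assumptions for illustration only.

```python
import numpy as np

def rbf(a, b, length=0.3, amp=1.0):
    """Squared-exponential covariance between point sets a (n,d) and b (m,d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return amp * np.exp(-0.5 * d2 / length ** 2)

def gp_predict(X, y, Xs, noise=0.05):
    """Standard GP regression: predictive mean and marginal variance at Xs."""
    K = rbf(X, X) + noise ** 2 * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kss = rbf(Xs, Xs)
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

rng = np.random.default_rng(1)
X = rng.random((40, 2))                               # sensor locations in the unit square
y = np.sin(3 * X[:, 0]) * np.cos(3 * X[:, 1]) + 0.05 * rng.standard_normal(40)
Xs = rng.random((5, 2))                               # prediction points
mean, var = gp_predict(X, y, Xs)
print(np.round(mean, 3), np.round(np.sqrt(var), 3))
```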

  15. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees which have the highest similarity to the true trees, and the model parameter estimates which have the smallest mean square errors, while costing the least CPU time. © 2009 Elsevier Inc. All rights reserved.

  16. A Bayesian method for construction of Markov models to describe dynamics on various time-scales.

    Science.gov (United States)

    Rains, Emily K; Andersen, Hans C

    2010-10-14

    The dynamics of many biological processes of interest, such as the folding of a protein, are slow and complicated enough that a single molecular dynamics simulation trajectory of the entire process is difficult to obtain in any reasonable amount of time. Moreover, one such simulation may not be sufficient to develop an understanding of the mechanism of the process, and multiple simulations may be necessary. One approach to circumvent this computational barrier is the use of Markov state models. These models are useful because they can be constructed using data from a large number of shorter simulations instead of a single long simulation. This paper presents a new Bayesian method for the construction of Markov models from simulation data. A Markov model is specified by (τ,P,T), where τ is the mesoscopic time step, P is a partition of configuration space into mesostates, and T is an N(P)×N(P) transition rate matrix for transitions between the mesostates in one mesoscopic time step, where N(P) is the number of mesostates in P. The method presented here is different from previous Bayesian methods in several ways. (1) The method uses Bayesian analysis to determine the partition as well as the transition probabilities. (2) The method allows the construction of a Markov model for any chosen mesoscopic time-scale τ. (3) It constructs Markov models for which the diagonal elements of T are all equal to or greater than 0.5. Such a model will be called a "consistent mesoscopic Markov model" (CMMM). Such models have important advantages for providing an understanding of the dynamics on a mesoscopic time-scale. The Bayesian method uses simulation data to find a posterior probability distribution for (P,T) for any chosen τ. This distribution can be regarded as the Bayesian probability that the kinetics observed in the atomistic simulation data on the mesoscopic time-scale τ was generated by the CMMM specified by (P,T). An optimization algorithm is used to find the most
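    A bare-bones illustration of the kind of object being inferred: given short trajectories already assigned to mesostates, count transitions at lag τ and form a posterior-mean transition matrix under a Dirichlet prior. The Bayesian method in the paper additionally infers the partition P itself and enforces the diagonal-dominance (CMMM) condition, which this sketch does not; the toy trajectories are assumptions.

```python
import numpy as np

def transition_matrix(trajectories, n_states, lag=1, pseudocount=1.0):
    """Posterior-mean transition matrix at lag 'lag' from discrete state trajectories,
    using a symmetric Dirichlet(pseudocount) prior on each row."""
    counts = np.full((n_states, n_states), pseudocount)
    for traj in trajectories:
        for a, b in zip(traj[:-lag], traj[lag:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Several short trajectories over 3 mesostates (illustrative).
trajs = [[0, 0, 1, 1, 1, 2, 2], [2, 2, 1, 0, 0, 0], [1, 1, 1, 2, 2, 2, 2]]
T = transition_matrix(trajs, n_states=3, lag=1)
print(np.round(T, 2))
```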

  17. Towards Real-Time Argumentation

    Directory of Open Access Journals (Sweden)

    Vicente JULIÁN

    2016-07-01

    Full Text Available In this paper, we deal with the problem of real-time coordination with the more general approach of reaching real-time agreements in MAS. Concretely, this work proposes a real-time argumentation framework in an attempt to provide agents with the ability to engage in argumentative dialogues and come up with a solution for their underlying agreement process within a bounded period of time. The framework has been implemented and evaluated in the domain of a customer support application. Concretely, we consider a society of agents that act on behalf of a group of technicians that must solve problems in a Technology Management Centre (TMC) within a bounded time. This centre controls every process involved in the provision of technological and customer support services to private or public organisations by means of a call centre. The contract signed between the TMC and the customer establishes penalties if the specified time is exceeded.

  18. The GFZ real-time GNSS precise positioning service system and its adaption for COMPASS

    Science.gov (United States)

    Li, Xingxing; Ge, Maorong; Zhang, Hongping; Nischan, Thomas; Wickert, Jens

    2013-03-01

    Motivated by the IGS real-time Pilot Project, GFZ has been developing its own real-time precise positioning service for various applications. An operational system at GFZ is now broadcasting real-time orbits, clocks, a global ionospheric model, uncalibrated phase delays and regional atmospheric corrections for standard PPP, PPP with ambiguity fixing, single-frequency PPP and regional augmented PPP. To avoid developing various algorithms for different applications, we proposed a uniform algorithm and implemented it into our real-time software. In the new processing scheme, we employed un-differenced raw observations with atmospheric delays as parameters, which are properly constrained by the real-time derived global ionospheric model or regional atmospheric corrections and by the empirical characteristics of the atmospheric delay variation in time and space. The positioning performance in terms of convergence time and ambiguity fixing depends mainly on the quality of the received atmospheric information and the spatial and temporal constraints. The un-differenced raw observation model can not only integrate PPP and NRTK into a seamless positioning service, but also unify these two techniques into a single model and algorithm. Furthermore, it is suitable for both dual-frequency and single-frequency receivers. Based on the real-time data streams from the IGS, EUREF and SAPOS reference networks, we can provide services of global precise point positioning (PPP) with 5-10 cm accuracy, PPP with ambiguity fixing of 2-5 cm accuracy, PPP using single-frequency receivers with an accuracy of better than 50 cm, and PPP with regional augmentation for instantaneous ambiguity resolution of 1-3 cm accuracy. We adapted the system for the current COMPASS to provide a PPP service. COMPASS observations from a regional network of nine stations are used for precise orbit determination and clock estimation in simulated real-time mode, and the orbit and clock products are applied for real-time precise point

  19. Identifying Optimal Temporal Scale for the Correlation of AOD and Ground Measurements of PM2.5 to Improve the Model Performance in a Real-time Air Quality Estimation System

    Science.gov (United States)

    Li, Hui; Faruque, Fazlay; Williams, Worth; Al-Hamdan, Mohammad; Luvall, Jeffrey C.; Crosson, William; Rickman, Douglas; Limaye, Ashutosh

    2009-01-01

    Aerosol optical depth (AOD), an indirect estimate of particulate matter from satellite observations, has shown great promise in improving estimates of the PM2.5 air quality surface. Currently, few studies have been conducted to explore the optimal way to apply AOD data to improve the model accuracy of PM2.5 surface estimation in a real-time air quality system. We believe that two major aspects may be worthy of consideration in that area: 1) the approach to integrate satellite measurements with ground measurements in the pollution estimation, and 2) identification of an optimal temporal scale to calculate the correlation of AOD and ground measurements. This paper focuses on the second aspect: identifying the optimal temporal scale to correlate AOD with PM2.5. The following five temporal scales were chosen to evaluate their impact on the model performance: 1) within the last 3 days, 2) within the last 10 days, 3) within the last 30 days, 4) within the last 90 days, and 5) the time period with the highest correlation in a year. The model performance is evaluated for its accuracy, bias, and errors based on the following selected statistics: the Mean Bias, the Normalized Mean Bias, the Root Mean Square Error, the Normalized Mean Error, and the Index of Agreement. This research shows that the model with the temporal scale of within the last 30 days displays the best model performance in this study area using 2004 and 2005 data sets.

  20. Real-time PCR in virology.

    Science.gov (United States)

    Mackay, Ian M; Arden, Katherine E; Nitsche, Andreas

    2002-03-15

    The use of the polymerase chain reaction (PCR) in molecular diagnostics has increased to the point where it is now accepted as the gold standard for detecting nucleic acids from a number of origins and it has become an essential tool in the research laboratory. Real-time PCR has engendered wider acceptance of the PCR due to its improved rapidity, sensitivity, reproducibility and the reduced risk of carry-over contamination. There are currently five main chemistries used for the detection of PCR product during real-time PCR. These are the DNA binding fluorophores, the 5' endonuclease, adjacent linear and hairpin oligoprobes and the self-fluorescing amplicons, which are described in detail. We also discuss factors that have restricted the development of multiplex real-time PCR as well as the role of real-time PCR in quantitating nucleic acids. Both amplification hardware and the fluorogenic detection chemistries have evolved rapidly as the understanding of real-time PCR has developed and this review aims to update the scientist on the current state of the art. We describe the background, advantages and limitations of real-time PCR and we review the literature as it applies to virus detection in the routine and research laboratory in order to focus on one of the many areas in which the application of real-time PCR has provided significant methodological benefits and improved patient outcomes. However, the technology discussed has been applied to other areas of microbiology as well as studies of gene expression and genetic disease.

  1. Real time programming environment for Windows

    Energy Technology Data Exchange (ETDEWEB)

    LaBelle, D.R. [LaBelle (Dennis R.), Clifton Park, NY (United States)

    1998-04-01

    This document provides a description of the Real Time Programming Environment (RTProE). RTProE tools allow a programmer to create soft real time projects under general, multi-purpose operating systems. The basic features necessary for real time applications are provided by RTProE, leaving the programmer free to concentrate efforts on his specific project. The current version supports Microsoft Windows™ 95 and NT. The tasks of real time synchronization and communication with other programs are handled by RTProE. RTProE includes a generic method for connecting a graphical user interface (GUI) to allow real time control and interaction with the programmer's product. Topics covered in this paper include real time performance issues, portability, details of shared memory management, code scheduling, application control, Operating System specific concerns and the use of Computer Aided Software Engineering (CASE) tools. The development of RTProE is an important step in the expansion of the real time programming community. The financial costs associated with using the system are minimal. All source code for RTProE has been made publicly available. Any person with access to a personal computer, Windows 95 or NT, and C or FORTRAN compilers can quickly enter the world of real time modeling and simulation.

  2. An In-Home Digital Network Architecture for Real-Time and Non-Real-Time Communication

    NARCIS (Netherlands)

    Scholten, Johan; Jansen, P.G.; Hanssen, F.T.Y.; Hattink, Tjalling

    2002-01-01

    This paper describes an in-home digital network architecture that supports both real-time and non-real-time communication. The architecture deploys a distributed token mechanism to schedule communication streams and to offer guaranteed quality-ofservice. Essentially, the token mechanism prevents

  3. MARTe: A Multiplatform Real-Time Framework

    Science.gov (United States)

    Neto, André C.; Sartori, Filippo; Piccolo, Fabio; Vitelli, Riccardo; De Tommasi, Gianmaria; Zabeo, Luca; Barbalace, Antonio; Fernandes, Horacio; Valcarcel, Daniel F.; Batista, Antonio J. N.

    2010-04-01

    Development of real-time applications is usually associated with nonportable code targeted at specific real-time operating systems. The boundary between hardware drivers, system services, and user code is commonly not well defined, making the development in the target host significantly difficult. The Multithreaded Application Real-Time executor (MARTe) is a framework built over a multiplatform library that allows the execution of the same code in different operating systems. The framework provides the high-level interfaces with hardware, external configuration programs, and user interfaces, assuring at the same time hard real-time performances. End-users of the framework are required to define and implement algorithms inside a well-defined block of software, named Generic Application Module (GAM), that is executed by the real-time scheduler. Each GAM is reconfigurable with a set of predefined configuration meta-parameters and interchanges information using a set of data pipes that are provided as inputs and required as output. Using these connections, different GAMs can be chained either in series or parallel. GAMs can be developed and debugged in a non-real-time system and, only once the robustness of the code and correctness of the algorithm are verified, deployed to the real-time system. The software also supplies a large set of utilities that greatly ease the interaction and debugging of a running system. Among the most useful are a highly efficient real-time logger, HTTP introspection of real-time objects, and HTTP remote configuration. MARTe is currently being used to successfully drive the plasma vertical stabilization controller on the largest magnetic confinement fusion device in the world, with a control loop cycle of 50 µs and a jitter under 1 µs. In this particular project, MARTe is used with the Real-Time Application Interface (RTAI)/Linux operating system exploiting the new x86 multicore processors technology.

  4. Linear filters as a method of real-time prediction of geomagnetic activity

    International Nuclear Information System (INIS)

    McPherron, R.L.; Baker, D.N.; Bargatze, L.F.

    1985-01-01

    Important factors controlling geomagnetic activity include the solar wind velocity, the strength of the interplanetary magnetic field (IMF), and the field orientation. Because these quantities change so much in transit through the solar wind, real-time monitoring immediately upstream of the earth provides the best input for any technique of real-time prediction. One such technique is linear prediction filtering which utilizes past histories of the input and output of a linear system to create a time-invariant filter characterizing the system. Problems of nonlinearity or temporal changes of the system can be handled by appropriate choice of input parameters and piecewise approximation in various ranges of the input. We have created prediction filters for all the standard magnetic indices and tested their efficiency. The filters show that the initial response of the magnetosphere to a southward turning of the IMF peaks in 20 minutes and then again in 55 minutes. After a northward turning, auroral zone indices and the midlatitude ASYM index return to background within 2 hours, while Dst decays exponentially with a time constant of about 8 hours. This paper describes a simple, real-time system utilizing these filters which could predict a substantial fraction of the variation in magnetic activity indices 20 to 50 minutes in advance
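    The construction of a linear prediction filter from past input and output histories amounts to a least-squares (Wiener-type) fit of the output on lagged values of the input. A minimal sketch with a synthetic solar-wind-like driver follows; the filter length, noise level, and assumed impulse response are illustrative, not the filters derived in the paper.

```python
import numpy as np

def fit_prediction_filter(x, y, n_taps):
    """Least-squares FIR filter h such that y[t] ~ sum_k h[k] * x[t-k]."""
    rows = [x[t - n_taps + 1:t + 1][::-1] for t in range(n_taps - 1, len(x))]
    X = np.array(rows)
    h, *_ = np.linalg.lstsq(X, y[n_taps - 1:], rcond=None)
    return h

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)                       # driver, e.g. a solar-wind coupling function
true_h = np.exp(-np.arange(12) / 3.0)               # assumed "magnetospheric" impulse response
y = np.convolve(x, true_h)[:len(x)] + 0.1 * rng.standard_normal(len(x))

h = fit_prediction_filter(x, y, n_taps=12)
print(np.round(h, 2))                               # recovers the assumed impulse response
```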

  5. Real-Time and Real-Fast Performance of General-Purpose and Real-Time Operating Systems in Multithreaded Physical Simulation of Complex Mechanical Systems

    Directory of Open Access Journals (Sweden)

    Carlos Garre

    2014-01-01

    Full Text Available Physical simulation is a valuable tool in many fields of engineering for the tasks of design, prototyping, and testing. General-purpose operating systems (GPOS) are designed for real-fast tasks, such as offline simulation of complex physical models that should finish as soon as possible. Interfacing hardware at a given rate (as in a hardware-in-the-loop test) requires instead maximizing time determinism, for which real-time operating systems (RTOS) are designed. In this paper, real-fast and real-time performance of RTOS and GPOS are compared when simulating models of high complexity with large time steps. This type of application is usually present in the automotive industry and requires a good trade-off between real-fast and real-time performance. The performance of an RTOS and a GPOS is compared by running a tire model scalable on the number of degrees-of-freedom and parallel threads. The benchmark shows that the GPOS present better performance in real-fast runs but worse in real-time due to nonexplicit task switches and to the latency associated with interprocess communication (IPC) and task switches.

  6. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    Science.gov (United States)

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model lowers computational efficiency, which is not suitable for high-frequency real-time GNSS clock estimation, such as 1 Hz. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model to realize high-frequency updating of multi-GNSS real-time clocks, and a rigorous comparison and analysis under the same conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive accuracy measure of orbit and clock that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. The statistical analysis of the real-time augmentation message SISRE is about 4-7 cm for GPS, while 10 cm for BeiDou IGSO/MEO, Galileo and about 30 cm
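    A toy illustration of why epoch-differencing the carrier phase helps: the constant ambiguity term cancels exactly in the difference between consecutive epochs, which is the property the mixed differenced model exploits. The single-satellite numbers below are invented, and this is not the estimator used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(60.0)                       # epochs, s
    rho = 2.2e7 + 800.0 * t                   # geometric range, m (toy)
    clk = 1e-3 * np.sin(t / 30.0)             # receiver clock error, m
    N = 5.3e5                                 # constant phase ambiguity, m
    phase = rho + clk + N + 0.003 * rng.normal(size=t.size)

    # Epoch-differenced phase: the constant ambiguity N cancels exactly,
    # leaving only range rate, clock drift and (slightly amplified) noise.
    d_phase = np.diff(phase)
    print("raw phase minus range (contains N):", round(phase.mean() - rho.mean(), 1), "m")
    print("epoch-differenced phase (no N)    :", round(d_phase.mean(), 3), "m per epoch")
    ```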

  7. An accuracy assessment of realtime GNSS time series toward semi- real time seafloor geodetic observation

    Science.gov (United States)

    Osada, Y.; Ohta, Y.; Demachi, T.; Kido, M.; Fujimoto, H.; Azuma, R.; Hino, R.

    2013-12-01

    Large interplate earthquakes have repeatedly occurred along the Japan Trench. Recently, detailed crustal deformation has been revealed by the nationwide inland GPS network (GEONET) operated by GSI. However, the region of maximum displacement for interplate earthquakes is mainly located offshore. GPS/Acoustic seafloor geodetic observation (hereafter GPS/A) is therefore quite important and useful for understanding the shallower part of interplate coupling between the subducting and overriding plates. We typically conduct GPS/A in specific ocean areas in a repeated campaign style using a research vessel or buoy, and therefore cannot monitor the temporal variation of seafloor crustal deformation in real time. One of the technical issues for real-time observation is kinematic GPS analysis, because it relies on both reference and rover data. If precise kinematic GPS analysis becomes possible in the offshore region, it will be a promising method for real-time GPS/A with a USV (Unmanned Surface Vehicle) or a moored buoy. We assessed the stability, precision and accuracy of the StarFire global satellite-based augmentation system, first testing StarFire under static conditions. In order to assess coordinate precision and accuracy, we compared the 1 Hz StarFire time series with post-processed precise point positioning (PPP) 1 Hz time series produced by the GIPSY-OASIS II processing software Ver. 6.1.2 with three different product types (ultra-rapid, rapid, and final orbits). We also used clock information at different intervals (30 and 300 seconds) for the post-processed PPP processing. The standard deviation of the real-time StarFire time series is less than 30 mm (horizontal components) and 60 mm (vertical component) based on 1 month of continuous processing. We also assessed the noise spectra of the time series estimated by StarFire and by post-processed GIPSY PPP. We found that the noise spectrum of the StarFire time series is similar in pattern to the GIPSY-OASIS II processing result based on the JPL rapid orbit

  8. The temporal-relevance temporal-uncertainty model of prospective duration judgment.

    Science.gov (United States)

    Zakay, Dan

    2015-12-15

    A model aimed at explaining prospective duration judgments in real life settings (as well as in the laboratory) is presented. The model is based on the assumption that situational meaning is continuously being extracted by humans' perceptual and cognitive information processing systems. Time is one of the important dimensions of situational meaning. Based on the situational meaning, a value for Temporal Relevance is set. Temporal Relevance reflects the importance of temporal aspects for enabling adaptive behavior in a specific moment in time. When Temporal Relevance is above a certain threshold a prospective duration judgment process is evoked automatically. In addition, a search for relevant temporal information is taking place and its outcomes determine the level of Temporal Uncertainty which reflects the degree of knowledge one has regarding temporal aspects of the task to be performed. The levels of Temporal Relevance and Temporal Uncertainty determine the amount of attentional resources allocated for timing by the executive system. The merit of the model is in connecting timing processes with the ongoing general information processing stream. The model rests on findings in various domains which indicate that cognitive-relevance and self-relevance are powerful determinants of resource allocation policy. The feasibility of the model is demonstrated by analyzing various temporal phenomena. Suggestions for further empirical validation of the model are presented. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Scalable Real-Time Negotiation Toolkit

    National Research Council Canada - National Science Library

    Lesser, Victor

    2004-01-01

    ... to implement an adaptive distributed sensor network. These activities involved the development of a distributed soft, real-time heuristic resource allocation protocol, the development of a domain-independent soft, real time agent architecture...

  10. A modified GO-FLOW methodology with common cause failure based on Discrete Time Bayesian Network

    International Nuclear Information System (INIS)

    Fan, Dongming; Wang, Zili; Liu, Linlin; Ren, Yi

    2016-01-01

    Highlights: • Identification of particular causes of failure for common cause failure analysis. • Comparison of two formalisms (GO-FLOW and Discrete Time Bayesian Network) and establishment of the correlation between them. • Mapping of the GO-FLOW model into a Bayesian network model. • Calculation of GO-FLOW models with common cause failures based on DTBN. - Abstract: The GO-FLOW methodology is a success-oriented system reliability modelling technique for multi-phase missions involving complex time-dependent, multi-state and common cause failure (CCF) features. However, the analysis algorithm cannot easily handle multiple shared signals and CCFs. In addition, the simulative algorithm is time consuming when many multi-state components exist in the model, and the multiple time points of phased-mission problems increase the difficulty of the analysis method. In this paper, the Discrete Time Bayesian Network (DTBN) and the GO-FLOW methodology are integrated through unified mapping rules. Based on these rules, the GO-FLOW operators can be mapped into a DTBN, and a complete GO-FLOW model with complex characteristics (e.g. phased missions, multiple states, and CCF) can then be converted to an isomorphic DTBN and analyzed with it. With mature algorithms and tools, the multi-phase mission reliability parameters can be efficiently obtained via the proposed approach without special treatment of the shared signals and the various complex logic operations. Meanwhile, CCF can also be incorporated in the computing process.

  11. Model Checking Real-Time Systems

    DEFF Research Database (Denmark)

    Bouyer, Patricia; Fahrenberg, Uli; Larsen, Kim Guldstrand

    2018-01-01

    This chapter surveys timed automata as a formalism for model checking real-time systems. We begin with introducing the model, as an extension of finite-state automata with real-valued variables for measuring time. We then present the main model-checking results in this framework, and give a hint...

  12. Modular specification of real-time systems

    DEFF Research Database (Denmark)

    Inal, Recep

    1994-01-01

    Duration Calculus, a real-time interval logic, has been embedded in the Z specification language to provide a notation for real-time systems that combines the modularisation and abstraction facilities of Z with a logic suitable for reasoning about real-time properties. In this article the notation...

  13. Hard Real-Time Networking on Firewire

    NARCIS (Netherlands)

    Zhang, Yuchen; Orlic, Bojan; Visser, Peter; Broenink, Jan

    2005-01-01

    This paper investigates the possibility of using standard, low-cost, widely used FireWire as a new-generation fieldbus medium for real-time distributed control applications. A real-time software subsystem, RT-FireWire, was designed that can, in combination with Linux-based real-time operating

  14. A platform for real-time online health analytics during spaceflight

    Science.gov (United States)

    McGregor, Carolyn

    Monitoring the health and wellbeing of astronauts during spaceflight is an important aspect of any manned mission. To date the monitoring has been based on a sequential set of discontinuous samplings of physiological data to support initial studies on aspects such as weightlessness, and its impact on the cardiovascular system and to perform proactive monitoring for health status. The research performed and the real-time monitoring has been hampered by the lack of a platform to enable a more continuous approach to real-time monitoring. While any spaceflight is monitored heavily by Mission Control, an important requirement within the context of any spaceflight setting and in particular where there are extended periods with a lack of communication with Mission Control, is the ability for the mission to operate in an autonomous manner. This paper presents a platform to enable real-time astronaut monitoring for prognostics and health management within space medicine using online health analytics. The platform is based on extending previous online health analytics research known as the Artemis and Artemis Cloud platforms which have demonstrated their relevance for multi-patient, multi-diagnosis and multi-stream temporal analysis in real-time for clinical management and research within Neonatal Intensive Care. Artemis and Artemis Cloud source data from a range of medical devices capable of transmission of the signal via wired or wireless connectivity and hence are well suited to process real-time data acquired from astronauts. A key benefit of this platform is its ability to monitor their health and wellbeing onboard the mission as well as enabling the astronaut's physiological data, and other clinical data, to be sent to the platform components at Mission Control at each stage when that communication is available. As a result, researchers at Mission Control would be able to simulate, deploy and tailor predictive analytics and diagnostics during the same spaceflight for

  15. Multiprocessor scheduling for real-time systems

    CERN Document Server

    Baruah, Sanjoy; Buttazzo, Giorgio

    2015-01-01

    This book provides a comprehensive overview of both theoretical and pragmatic aspects of resource-allocation and scheduling in multiprocessor and multicore hard-real-time systems.  The authors derive new, abstract models of real-time tasks that capture accurately the salient features of real application systems that are to be implemented on multiprocessor platforms, and identify rules for mapping application systems onto the most appropriate models.  New run-time multiprocessor scheduling algorithms are presented, which are demonstrably better than those currently used, both in terms of run-time efficiency and tractability of off-line analysis.  Readers will benefit from a new design and analysis framework for multiprocessor real-time systems, which will translate into a significantly enhanced ability to provide formally verified, safety-critical real-time systems at a significantly lower cost.

  16. Tablet disintegration studied by high-resolution real-time magnetic resonance imaging.

    Science.gov (United States)

    Quodbach, Julian; Moussavi, Amir; Tammer, Roland; Frahm, Jens; Kleinebudde, Peter

    2014-01-01

    The present work employs recent advances in high-resolution real-time magnetic resonance imaging (MRI) to investigate the disintegration process of tablets containing disintegrants. A temporal resolution of 75 ms and a spatial resolution of 80 × 80 µm with a section thickness of only 600 µm were achieved. The histograms of MRI videos were quantitatively analyzed with MATLAB. The mechanisms of action of six commercially available disintegrants, the influence of relative tablet density, and the impact of disintegrant concentration were examined. Crospovidone seems to be the only disintegrant acting by a shape memory effect, whereas the others mainly swell. A higher relative density of tablets containing croscarmellose sodium leads to a more even distribution of water within the tablet matrix but hardly impacts the disintegration kinetics. Increasing the polacrilin potassium disintegrant concentration leads to a quicker and more thorough disintegration process. Real-time MRI emerges as a valuable tool to visualize and investigate the process of tablet disintegration.

  17. Estimating safety effects of pavement management factors utilizing Bayesian random effect models.

    Science.gov (United States)

    Jiang, Ximiao; Huang, Baoshan; Zaretzki, Russell L; Richards, Stephen; Yan, Xuedong

    2013-01-01

    Previous studies of pavement management factors that relate to the occurrence of traffic-related crashes are rare. Traditional research has mostly employed summary statistics of bidirectional pavement quality measurements in extended longitudinal road segments over a long time period, which may cause a loss of important information and result in biased parameter estimates. The research presented in this article focuses on crash risk of roadways with overall fair to good pavement quality. Real-time and location-specific data were employed to estimate the effects of pavement management factors on the occurrence of crashes. This research is based on the crash data and corresponding pavement quality data for the Tennessee state route highways from 2004 to 2009. The potential temporal and spatial correlations among observations caused by unobserved factors were considered. Overall 6 models were built accounting for no correlation, temporal correlation only, and both the temporal and spatial correlations. These models included Poisson, negative binomial (NB), one random effect Poisson and negative binomial (OREP, ORENB), and two random effect Poisson and negative binomial (TREP, TRENB) models. The Bayesian method was employed to construct these models. The inference is based on the posterior distribution from the Markov chain Monte Carlo (MCMC) simulation. These models were compared using the deviance information criterion. Analysis of the posterior distribution of parameter coefficients indicates that the pavement management factors indexed by Present Serviceability Index (PSI) and Pavement Distress Index (PDI) had significant impacts on the occurrence of crashes, whereas the variable rutting depth was not significant. Among other factors, lane width, median width, type of terrain, and posted speed limit were significant in affecting crash frequency. The findings of this study indicate that a reduction in pavement roughness would reduce the likelihood of traffic
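    A minimal sketch of the kind of Bayesian crash-frequency model discussed above: a Poisson regression with one covariate (standing in for PSI), sampled with random-walk Metropolis on synthetic data. The authors' hierarchical random-effect specifications and priors are not reproduced here; everything below is illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic data: crash counts decrease with pavement serviceability (PSI).
    n = 300
    psi = rng.uniform(2.5, 4.5, n)
    true_beta = np.array([2.0, -0.8])                # intercept, PSI effect
    y = rng.poisson(np.exp(true_beta[0] + true_beta[1] * psi))

    def log_post(beta):
        eta = beta[0] + beta[1] * psi
        loglik = np.sum(y * eta - np.exp(eta))       # Poisson log-likelihood (up to const.)
        logprior = -0.5 * np.sum(beta ** 2) / 100.0  # vague N(0, 10^2) priors
        return loglik + logprior

    # Random-walk Metropolis sampler.
    beta, lp, samples = np.zeros(2), log_post(np.zeros(2)), []
    for it in range(20000):
        prop = beta + 0.02 * rng.normal(size=2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:     # accept/reject step
            beta, lp = prop, lp_prop
        if it > 5000:                                # discard burn-in
            samples.append(beta.copy())

    samples = np.array(samples)
    print("posterior means:", samples.mean(axis=0), "true:", true_beta)
    ```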

  18. Real-Time Risk Assessment Framework for Unmanned Aircraft System (UAS) Traffic Management (UTM)

    Science.gov (United States)

    Ancel, Ersin; Capristan, Francisco M.; Foster, John V.; Condotta, Ryan

    2017-01-01

    The new Federal Aviation Administration (FAA) Small Unmanned Aircraft rule (Part 107) marks the first national regulations for commercial operation of small unmanned aircraft systems (sUAS) under 55 pounds within the National Airspace System (NAS). Although sUAS flights may not be performed beyond visual line-of-sight or over non-participant structures and people, the safety of sUAS operations must still be maintained and tracked at all times. Moreover, future safety-critical operations of sUAS (e.g., for package delivery) are already being conceived and tested. NASA's Unmanned Aircraft System Traffic Management (UTM) concept aims to facilitate the safe use of low-altitude airspace for sUAS operations. This paper introduces the UTM Risk Assessment Framework (URAF), which was developed to provide real-time safety evaluation and tracking capability within the UTM concept. The URAF uses Bayesian Belief Networks (BBNs) to propagate off-nominal condition probabilities based on real-time component failure indicators. This information is then used to assess the risk to people on the ground by calculating the potential impact area and the effects of the impact. The visual representation of the expected area of impact and the nominal risk level can assist operators and controllers with dynamic trajectory planning and execution. The URAF was applied to a case study to illustrate the concept.
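    The probability propagation a BBN performs can be illustrated with a deliberately tiny example: a real-time failure indicator (here a hypothetical battery warning) updates the probability of an off-nominal condition, which in turn changes the ground-impact risk. The network structure and numbers are invented, not those of the URAF.

    ```python
    # Tiny two-layer "belief network" evaluated by direct enumeration.
    # P(off_nominal | battery_warning), then P(ground_impact | off_nominal).
    p_off_nominal = {True: 0.40, False: 0.02}     # given battery warning yes/no
    p_impact      = {True: 0.10, False: 0.001}    # given off-nominal yes/no

    def impact_risk(p_warning: float) -> float:
        """Marginal probability of ground impact given P(battery warning)."""
        risk = 0.0
        for warning, p_w in ((True, p_warning), (False, 1.0 - p_warning)):
            for off, p_o in ((True, p_off_nominal[warning]),
                             (False, 1.0 - p_off_nominal[warning])):
                risk += p_w * p_o * p_impact[off]
        return risk

    print("nominal flight   :", impact_risk(0.01))
    print("warning observed :", impact_risk(1.0))   # real-time indicator trips
    ```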

  19. Nucleo multiprocessado para aplicações em tempo-real

    OpenAIRE

    Roberto Andre Hexsel

    1988-01-01

    Abstract: This dissertation describes the real-time kernel of the Multiprocessor for Control Systems (MSC). The MSC was developed at the Automation Institute of the Technological Center for Informatics and has characteristics that make it suitable for applications in process control and industrial automation. These applications demand fast responses to external events and large processing capacity. The MSC can be configured to satisfy the most diverse applications and different levels ...

  20. Prototyping real-time systems

    OpenAIRE

    Clynch, Gary

    1994-01-01

    The traditional software development paradigm, the waterfall life cycle model, is defective when used for developing real-time systems. This thesis puts forward an executable prototyping approach for the development of real-time systems. A prototyping system is proposed which uses ESML (Extended Systems Modelling Language) as a prototype specification language. The prototyping system advocates the translation of non-executable ESML specifications into executable LOOPN (Language of Object ...

  1. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  2. Software Design Methods for Real-Time Systems

    Science.gov (United States)

    1989-12-01

    This module describes the concepts and methods used in the software design of real-time systems. It outlines the characteristics of real-time systems, describes the role of software design in real-time system development, surveys and compares some software design methods for real-time systems, and

  3. ertCPN: The adaptations of the coloured Petri-Net theory for real-time embedded system modeling and automatic code generation

    Directory of Open Access Journals (Sweden)

    Wattanapong Kurdthongmee

    2003-05-01

    Full Text Available A real-time system is a computer system that monitors or controls an external environment. The system must meet various timing and other constraints that are imposed on it by the real-time behaviour of the external world. One of the differences between real-time and conventional software is that a real-time program must be both logically and temporally correct. To successfully design and implement a real-time system, some analysis is typically done to assure that requirements or designs are consistent and that they satisfy certain desirable properties that may not be immediately obvious from the specification. Executable specifications, prototypes and simulation are particularly useful in real-time systems for debugging specifications. In this paper, we propose adaptations to coloured Petri-net theory to ease the modeling, simulation and code generation process of an embedded, microcontroller-based, real-time system. The benefits of the proposed approach are demonstrated by use of our prototype software tool called ENVisAge (an Extended Coloured Petri-Net Based Visual Application Generator Tool).

  4. Real-time Pricing in Power Markets

    DEFF Research Database (Denmark)

    Boom, Anette; Schwenen, Sebastian

    We examine welfare effects of real-time pricing in electricity markets. Before stochastic energy demand is known, competitive retailers contract with final consumers who exogenously do not have real-time meters. After demand is realized, two electricity generators compete in a uniform price auction...... to satisfy demand from retailers acting on behalf of subscribed customers and from consumers with real-time meters. Increasing the number of consumers on real-time pricing does not always increase welfare since risk-averse consumers dislike uncertain and high prices arising through market power...

  5. Real-time Pricing in Power Markets

    DEFF Research Database (Denmark)

    Boom, Anette; Schwenen, Sebastian

    We examine welfare effects of real-time pricing in electricity markets. Before stochastic energy demand is known, competitive retailers contract with final consumers who exogenously do not have real-time meters. After demand is realized, two electricity generators compete in a uniform price auction...... to satisfy demand from retailers acting on behalf of subscribed customers and from consumers with real-time meters. Increasing the number of consumers on real-time pricing does not always increase welfare since risk-averse consumers dislike uncertain and high prices arising through market power...

  6. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  7. Distributed, Embedded and Real-time Java Systems

    CERN Document Server

    Wellings, Andy

    2012-01-01

    Research on real-time Java technology has been prolific over the past decade, leading to a large number of corresponding hardware and software solutions, and frameworks for distributed and embedded real-time Java systems.  This book is aimed primarily at researchers in real-time embedded systems, particularly those who wish to understand the current state of the art in using Java in this domain.  Much of the work in real-time distributed, embedded and real-time Java has focused on the Real-time Specification for Java (RTSJ) as the underlying base technology, and consequently many of the Chapters in this book address issues with, or solve problems using, this framework. Describes innovative techniques in: scheduling, memory management, quality of service and communication systems supporting real-time Java applications; Includes coverage of multiprocessor embedded systems and parallel programming; Discusses state-of-the-art resource management for embedded systems, including Java’s real-time garbage collect...

  8. Design Optimization of Mixed-Criticality Real-Time Applications on Cost-Constrained Partitioned Architectures

    DEFF Research Database (Denmark)

    Tamas-Selicean, Domitian; Pop, Paul

    2011-01-01

    In this paper we are interested in implementing mixed-criticality hard real-time applications on a given heterogeneous distributed architecture. Applications have different criticality levels, captured by their Safety-Integrity Level (SIL), and are scheduled using static-cyclic scheduling. Mixed......-criticality tasks can be integrated onto the same architecture only if there is enough spatial and temporal separation among them. We consider that the separation is provided by partitioning, such that applications run in separate partitions, and each partition is allocated several time slots on a processor. Tasks...... slots on each processor and (iv) the schedule tables, such that all the applications are schedulable and the development costs are minimized. We have proposed a Tabu Search-based approach to solve this optimization problem. The proposed algorithm has been evaluated using several synthetic and real

  9. Research of real-time communication software

    Science.gov (United States)

    Li, Maotang; Guo, Jingbo; Liu, Yuzhong; Li, Jiahong

    2003-11-01

    Real-time communication has been playing an increasingly important role in our work, life and ocean monitoring. With the rapid progress of computer and communication technology and the miniaturization of communication systems, adaptable and reliable real-time communication software is needed in ocean monitoring systems. This paper presents research on real-time communication software based on a point-to-point satellite intercommunication system. An object-oriented design method is adopted; the software can transmit and receive video, audio and engineering data over the satellite channel. Several software modules are developed that realize point-to-point satellite intercommunication in the ocean monitoring system. The real-time communication software has three advantages. First, it increases the reliability of the point-to-point satellite intercommunication system. Second, configurable parameters greatly increase the flexibility of system operation. Third, it replaces some hardware, which not only reduces system cost and promotes the miniaturization of the communication system, but also increases the agility of the system.

  10. Application of dynamic Bayesian network to risk analysis of domino effects in chemical infrastructures

    International Nuclear Information System (INIS)

    Khakzad, Nima

    2015-01-01

    A domino effect is a low-frequency, high-consequence chain of accidents where a primary accident (usually fire and explosion) in a unit triggers secondary accidents in adjacent units. High complexity and growing interdependencies of chemical infrastructures make them increasingly vulnerable to domino effects. Domino effects can be considered time-dependent processes. Thus, not only the identification of the involved units but also their temporal ordering in the chain of accidents matters. More importantly, in the case of domino-induced fires, which generally last much longer than explosions, foreseeing the temporal evolution of domino effects and, in particular, predicting the most probable sequence of accidents (or involved units) in a domino effect can be of significance in the allocation of preventive and protective safety measures. Although many attempts have been made to identify the spatial evolution of domino effects, the temporal evolution of such accidents has been overlooked. We have proposed a methodology based on dynamic Bayesian networks to model both the spatial and temporal evolutions of domino effects and also to quantify the most probable sequence of accidents in a potential domino effect. The application of the developed methodology has been demonstrated via a hypothetical fuel storage plant. - Highlights: • A Dynamic Bayesian Network methodology has been developed to model domino effects. • Considering time-dependencies, both spatial and temporal evolutions of domino effects have been modeled. • The concept of most probable sequence of accidents has been proposed instead of the most probable combination of accidents. • Using backward analysis, the most vulnerable units have been identified during a potential domino effect. • The proposed methodology does not need to identify a unique primary unit (accident) for domino effect modeling
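    A toy sketch of the temporal reasoning involved: a discrete-time Monte Carlo propagation in which a fire in one tank raises the per-step ignition probability of its neighbours. The layout, escalation probabilities and independence assumptions are all invented; the paper's dynamic Bayesian network is considerably richer.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_tanks, n_steps, n_runs = 4, 10, 20000
    # esc[i][j]: per-step probability that a fire in tank i ignites tank j.
    esc = np.array([[0.0, 0.3, 0.1, 0.0],
                    [0.3, 0.0, 0.3, 0.1],
                    [0.1, 0.3, 0.0, 0.3],
                    [0.0, 0.1, 0.3, 0.0]])

    counts = np.zeros((n_steps + 1, n_tanks))
    for _ in range(n_runs):
        on_fire = np.array([True, False, False, False])   # primary fire in tank 0
        counts[0] += on_fire
        for t in range(1, n_steps + 1):
            # Probability that each tank is NOT ignited by any burning neighbour.
            p_no_ignition = np.prod(1.0 - esc[on_fire], axis=0)
            ignited = rng.uniform(size=n_tanks) < (1.0 - p_no_ignition)
            on_fire = on_fire | ignited
            counts[t] += on_fire

    print("P(tank involved) over time:")
    print(np.round(counts / n_runs, 2))
    ```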

  11. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  12. Spatio-temporal networks: reachability, centrality and robustness.

    Science.gov (United States)

    Williams, Matthew J; Musolesi, Mirco

    2016-06-01

    Recent advances in spatial and temporal networks have enabled researchers to more accurately describe many real-world systems such as urban transport networks. In this paper, we study the response of real-world spatio-temporal networks to random error and systematic attack, taking a unified view of their spatial and temporal performance. We propose a model of spatio-temporal paths in time-varying spatially embedded networks which captures the property that, as in many real-world systems, interaction between nodes is non-instantaneous and governed by the space in which they are embedded. Through numerical experiments on three real-world urban transport systems, we study the effect of node failure on a network's topological, temporal and spatial structure. We also demonstrate the broader applicability of this framework to three other classes of network. To identify weaknesses specific to the behaviour of a spatio-temporal system, we introduce centrality measures that evaluate the importance of a node as a structural bridge and its role in supporting spatio-temporally efficient flows through the network. This exposes the complex nature of fragility in a spatio-temporal system, showing that there is a variety of failure modes when a network is subject to systematic attacks.
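    A small sketch of time-respecting reachability, the basic ingredient of the spatio-temporal paths discussed above: a path may only traverse edges whose timestamps are non-decreasing. The edge list and times are invented.

    ```python
    from collections import defaultdict

    # (u, v, t): undirected edge between u and v available at time t.
    edges = [("a", "b", 1), ("b", "c", 2), ("c", "d", 5), ("a", "d", 4), ("b", "d", 3)]

    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))
        adj[v].append((u, t))

    def earliest_arrival(source):
        """Earliest time each node can be reached by a time-respecting path."""
        best = {source: 0}
        frontier = [(source, 0)]
        while frontier:                       # label-correcting search
            node, t_now = frontier.pop()
            for nxt, t_edge in adj[node]:
                if t_edge >= t_now and t_edge < best.get(nxt, float("inf")):
                    best[nxt] = t_edge
                    frontier.append((nxt, t_edge))
        return best

    print(earliest_arrival("a"))   # d is reached at time 3 via a-b(1) then b-d(3)
    ```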

  13. Continuous Fine-Fault Estimation with Real-Time GNSS

    Science.gov (United States)

    Norford, B. B.; Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C.; Senko, J.; Larsen, D.

    2017-12-01

    Thousands of real-time telemetered GNSS stations operate throughout the circum-Pacific that may be used for rapid earthquake characterization and estimation of local tsunami excitation. We report on the development of a GNSS-based finite-fault inversion system that continuously estimates slip using real-time GNSS position streams from the Cascadia subduction zone and which is being expanded throughout the circum-Pacific. The system uses 1 Hz precise point position streams computed in the ITRF14 reference frame using clock and satellite orbit corrections from the IGS. The software is implemented as seven independent modules that filter time series using Kalman filters, trigger and estimate coseismic offsets, invert for slip using a non-negative least squares method developed by Lawson and Hanson (1974) and elastic half-space Green's Functions developed by Okada (1985), smooth the results temporally and spatially, and write the resulting streams of time-dependent slip to a RabbitMQ messaging server for use by downstream modules such as tsunami excitation modules. Additional fault models can be easily added to the system for other circum-Pacific subduction zones as additional real-time GNSS data become available. The system is currently being tested using data from well-recorded earthquakes including the 2011 Tohoku earthquake, the 2010 Maule earthquake, the 2015 Illapel earthquake, the 2003 Tokachi-oki earthquake, the 2014 Iquique earthquake, the 2010 Mentawai earthquake, the 2016 Kaikoura earthquake, the 2016 Ecuador earthquake, the 2015 Gorkha earthquake, and others. Test data will be fed to the system and the resultant earthquake characterizations will be compared with published earthquake parameters. Seismic events will be assumed to occur on major faults, so, for example, only the San Andreas fault will be considered in Southern California, while the hundreds of other faults in the region will be ignored. Rake will be constrained along each subfault to be
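    A minimal sketch of the inversion step: given a Green's function matrix relating subfault slip to surface displacements, the Lawson-Hanson non-negative least squares solver (available in SciPy) recovers a non-negative slip distribution. The Green's functions below are random stand-ins, not Okada (1985) solutions.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(4)

    n_stations, n_subfaults = 30, 8
    # Stand-in Green's functions: 3 displacement components per station.
    G = rng.normal(size=(3 * n_stations, n_subfaults)) * 0.01
    true_slip = np.array([0, 0, 1.5, 2.0, 0.5, 0, 0, 0])          # metres, non-negative

    d = G @ true_slip + 0.002 * rng.normal(size=3 * n_stations)   # GNSS offsets + noise

    slip, residual = nnls(G, d)        # Lawson & Hanson non-negative least squares
    print("estimated slip:", np.round(slip, 2))
    print("residual norm :", round(residual, 4))
    ```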

  14. The future of the London Buy-To-Let property market: Simulation with temporal Bayesian Networks

    Science.gov (United States)

    Fenton, Norman

    2017-01-01

    In 2015 the British government announced a number of major tax reforms for individual landlords. To give landlords time to adjust, some of these tax measures are being introduced gradually from April 2017, with full effect in tax year 2020/21. The changes in taxation have received much media attention since there has been widespread belief that the new measures were sufficiently skewed against landlords that they could signal the end of the Buy-To-Let (BTL) investment era in the UK. This paper assesses the prospective performance of BTL investments in London from the investor’s perspective, and examines the impact of incoming tax reforms using a novel Temporal Bayesian Network model. The model captures uncertainties of interest by simulating the impact of changing circumstances and the interventions available to an investor at various time-steps of a BTL investment portfolio. The simulation results suggest that the new tax reforms are likely to have a detrimental effect on net profits from rental income, and this hits risk-seeking investors who favour leverage much harder than risk-averse investors who do not seek to expand their property portfolio. The impact on net profits also poses substantial risks for lossmaking returns excluding capital gains, especially in the case of rising interest rates. While this makes it less desirable or even non-viable for some to continue being a landlord, based on the current status of all factors taken into consideration for simulation, investment prospects are still likely to remain good within a reasonable range of interest rate and capital growth rate variations. The results also suggest that the recent trend of property prices in London increasing faster than rents will not continue for much longer; either capital growth rates will have to decrease, rental growth rates will have to increase, or we shall observe a combination of the two events. PMID:28654698

  15. The future of the London Buy-To-Let property market: Simulation with temporal Bayesian Networks.

    Science.gov (United States)

    Constantinou, Anthony C; Fenton, Norman

    2017-01-01

    In 2015 the British government announced a number of major tax reforms for individual landlords. To give landlords time to adjust, some of these tax measures are being introduced gradually from April 2017, with full effect in tax year 2020/21. The changes in taxation have received much media attention since there has been widespread belief that the new measures were sufficiently skewed against landlords that they could signal the end of the Buy-To-Let (BTL) investment era in the UK. This paper assesses the prospective performance of BTL investments in London from the investor's perspective, and examines the impact of incoming tax reforms using a novel Temporal Bayesian Network model. The model captures uncertainties of interest by simulating the impact of changing circumstances and the interventions available to an investor at various time-steps of a BTL investment portfolio. The simulation results suggest that the new tax reforms are likely to have a detrimental effect on net profits from rental income, and this hits risk-seeking investors who favour leverage much harder than risk-averse investors who do not seek to expand their property portfolio. The impact on net profits also poses substantial risks for lossmaking returns excluding capital gains, especially in the case of rising interest rates. While this makes it less desirable or even non-viable for some to continue being a landlord, based on the current status of all factors taken into consideration for simulation, investment prospects are still likely to remain good within a reasonable range of interest rate and capital growth rate variations. The results also suggest that the recent trend of property prices in London increasing faster than rents will not continue for much longer; either capital growth rates will have to decrease, rental growth rates will have to increase, or we shall observe a combination of the two events.

  16. The future of the London Buy-To-Let property market: Simulation with temporal Bayesian Networks.

    Directory of Open Access Journals (Sweden)

    Anthony C Constantinou

    Full Text Available In 2015 the British government announced a number of major tax reforms for individual landlords. To give landlords time to adjust, some of these tax measures are being introduced gradually from April 2017, with full effect in tax year 2020/21. The changes in taxation have received much media attention since there has been widespread belief that the new measures were sufficiently skewed against landlords that they could signal the end of the Buy-To-Let (BTL) investment era in the UK. This paper assesses the prospective performance of BTL investments in London from the investor's perspective, and examines the impact of incoming tax reforms using a novel Temporal Bayesian Network model. The model captures uncertainties of interest by simulating the impact of changing circumstances and the interventions available to an investor at various time-steps of a BTL investment portfolio. The simulation results suggest that the new tax reforms are likely to have a detrimental effect on net profits from rental income, and this hits risk-seeking investors who favour leverage much harder than risk-averse investors who do not seek to expand their property portfolio. The impact on net profits also poses substantial risks for lossmaking returns excluding capital gains, especially in the case of rising interest rates. While this makes it less desirable or even non-viable for some to continue being a landlord, based on the current status of all factors taken into consideration for simulation, investment prospects are still likely to remain good within a reasonable range of interest rate and capital growth rate variations. The results also suggest that the recent trend of property prices in London increasing faster than rents will not continue for much longer; either capital growth rates will have to decrease, rental growth rates will have to increase, or we shall observe a combination of the two events.

  17. Research on classified real-time flood forecasting framework based on K-means cluster and rough set.

    Science.gov (United States)

    Xu, Wei; Peng, Yong

    2015-01-01

    This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by a K-means cluster according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classified results, a rough set is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can be considered in a catchment with fewer historical floods.
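    A sketch of the classification step: historical floods, described by a few precipitation features, are grouped with K-means, and a new event is routed to the parameter set of its nearest cluster. Feature choices, cluster count and the parameter values are placeholders, not those calibrated for Guanyinge Reservoir.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(5)

    # Features per historical flood: [total rainfall, peak intensity, storm duration].
    floods = np.vstack([rng.normal([60, 10, 12], [10, 2, 3], size=(30, 3)),
                        rng.normal([150, 35, 6], [20, 5, 2], size=(30, 3))])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(floods)

    # One calibrated parameter set per category (values are placeholders).
    params_by_cluster = {0: {"CN": 70, "lag_h": 6.0}, 1: {"CN": 85, "lag_h": 3.5}}

    new_event = np.array([[140, 30, 7]])        # incoming real-time flood features
    cluster = int(km.predict(new_event)[0])
    print("use parameter set:", params_by_cluster[cluster])
    ```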

  18. A Bayesian least squares support vector machines based framework for fault diagnosis and failure prognosis

    Science.gov (United States)

    Khawaja, Taimoor Saleem

    A high-belief low-overhead Prognostics and Health Management (PHM) system is desired for online real-time monitoring of complex non-linear systems operating in a complex (possibly non-Gaussian) noise environment. This thesis presents a Bayesian Least Squares Support Vector Machine (LS-SVM) based framework for fault diagnosis and failure prognosis in nonlinear non-Gaussian systems. The methodology assumes the availability of real-time process measurements, definition of a set of fault indicators and the existence of empirical knowledge (or historical data) to characterize both nominal and abnormal operating conditions. An efficient yet powerful Least Squares Support Vector Machine (LS-SVM) algorithm, set within a Bayesian Inference framework, not only allows for the development of real-time algorithms for diagnosis and prognosis but also provides a solid theoretical framework to address key concepts related to classification for diagnosis and regression modeling for prognosis. SVM machines are founded on the principle of Structural Risk Minimization (SRM) which tends to find a good trade-off between low empirical risk and small capacity. The key features in SVM are the use of non-linear kernels, the absence of local minima, the sparseness of the solution and the capacity control obtained by optimizing the margin. The Bayesian Inference framework linked with LS-SVMs allows a probabilistic interpretation of the results for diagnosis and prognosis. Additional levels of inference provide the much coveted features of adaptability and tunability of the modeling parameters. The two main modules considered in this research are fault diagnosis and failure prognosis. With the goal of designing an efficient and reliable fault diagnosis scheme, a novel Anomaly Detector is suggested based on the LS-SVM machines. The proposed scheme uses only baseline data to construct a 1-class LS-SVM machine which, when presented with online data is able to distinguish between normal behavior
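    For reference, the core of an LS-SVM classifier reduces to solving a single linear system rather than a quadratic program. The sketch below does this directly with an RBF kernel on toy two-class data; it omits the Bayesian inference levels and the prognosis (regression) part discussed in the thesis.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Toy 2-class data ("nominal" vs. "faulty" feature vectors).
    X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(3, 1, (40, 2))])
    y = np.hstack([-np.ones(40), np.ones(40)])

    def rbf(A, B, sigma=1.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    gamma = 10.0                                    # regularisation parameter
    K = rbf(X, X)
    Omega = (y[:, None] * y[None, :]) * K

    # LS-SVM dual problem: one linear system in (b, alpha).
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:], A[1:, 0] = y, y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.hstack([0.0, np.ones(n)])
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]

    def predict(Xnew):
        return np.sign(rbf(Xnew, X) @ (alpha * y) + b)

    print("training accuracy:", (predict(X) == y).mean())
    ```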

  19. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but they must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis
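    A small sketch of the discretization idea behind a discrete-time BN: each component's failure time is binned into intervals (plus a final "no failure" state), and the probability that a two-component parallel system has failed by the end of each interval is obtained by summing over the joint discrete distribution. Independent exponential components are assumed purely for illustration.

    ```python
    import numpy as np

    # Discretize the mission time into intervals; the last "bin" means "survives".
    T, n_bins = 1000.0, 10
    edges = np.linspace(0.0, T, n_bins + 1)

    def bin_probs(rate):
        """P(failure time falls in each interval) for an exponential component."""
        cdf = 1.0 - np.exp(-rate * edges)
        p = np.diff(cdf)
        return np.append(p, 1.0 - cdf[-1])       # final entry: no failure within T

    pA, pB = bin_probs(1e-3), bin_probs(5e-4)

    # Parallel (AND) system: it fails in the bin where the *later* component fails.
    joint = np.outer(pA, pB)
    fail_by_bin = np.zeros(n_bins)
    for i in range(n_bins + 1):
        for j in range(n_bins + 1):
            k = max(i, j)
            if k < n_bins:                        # both failed within the mission time
                fail_by_bin[k] += joint[i, j]

    print("P(system failed by end of each interval):")
    print(np.round(np.cumsum(fail_by_bin), 4))
    ```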

  20. Implementing a combined polar-geostationary algorithm for smoke emissions estimation in near real time

    Science.gov (United States)

    Hyer, E. J.; Schmidt, C. C.; Hoffman, J.; Giglio, L.; Peterson, D. A.

    2013-12-01

    Polar and geostationary satellites are used operationally for fire detection and smoke source estimation by many near-real-time operational users, including operational forecast centers around the globe. The input satellite radiance data are processed by data providers to produce Level-2 and Level-3 fire detection products, but processing these data into spatially and temporally consistent estimates of fire activity requires a substantial amount of additional processing. The most significant processing steps are correction for variable coverage of the satellite observations, and correction for conditions that affect the detection efficiency of the satellite sensors. We describe a system developed by the Naval Research Laboratory (NRL) that uses the full raster information from the entire constellation to diagnose detection opportunities, calculate corrections for factors such as angular dependence of detection efficiency, and generate global estimates of fire activity at spatial and temporal scales suitable for atmospheric modeling. By incorporating these improved fire observations, smoke emissions products, such as NRL's FLAMBE, are able to produce improved estimates of global emissions. This talk provides an overview of the system, demonstrates the achievable improvement over older methods, and describes challenges for near-real-time implementation.

  1. Fast model updating coupling Bayesian inference and PGD model reduction

    Science.gov (United States)

    Rubio, Paul-Baptiste; Louf, François; Chamoin, Ludovic

    2018-04-01

    The paper focuses on a coupled Bayesian-Proper Generalized Decomposition (PGD) approach for the real-time identification and updating of numerical models. The purpose is to use the most general case of Bayesian inference theory in order to address inverse problems and to deal with different sources of uncertainties (measurement and model errors, stochastic parameters). In order to do so with a reasonable CPU cost, the idea is to replace the direct model called for Monte-Carlo sampling by a PGD reduced model, and in some cases directly compute the probability density functions from the obtained analytical formulation. This procedure is first applied to a welding control example with the updating of a deterministic parameter. In the second application, the identification of a stochastic parameter is studied through a glued assembly example.
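    A toy illustration of the coupling: an "expensive" forward model is replaced by a cheap surrogate fitted offline (a polynomial here, standing in for a PGD reduced model), so that the posterior of a single parameter can be evaluated directly on a grid when a measurement arrives, with no sampling of the full model. All models and numbers are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def expensive_model(mu):
        """Stand-in for the full simulation (imagine this takes minutes to run)."""
        return np.sin(mu) + 0.1 * mu ** 2

    # Offline stage: fit a cheap surrogate once (here a degree-6 polynomial).
    mu_train = np.linspace(0.0, 3.0, 15)
    surrogate = np.polynomial.Polynomial.fit(mu_train, expensive_model(mu_train), 6)

    # Online stage: a noisy measurement arrives; evaluate the posterior on a grid.
    mu_true, sigma = 1.8, 0.05
    y_obs = expensive_model(mu_true) + sigma * rng.normal()

    grid = np.linspace(0.0, 3.0, 1000)
    dx = grid[1] - grid[0]
    log_prior = -0.5 * ((grid - 1.5) / 1.0) ** 2             # N(1.5, 1) prior
    log_like = -0.5 * ((y_obs - surrogate(grid)) / sigma) ** 2
    log_post = log_prior + log_like
    post = np.exp(log_post - log_post.max())                 # avoid underflow
    post /= post.sum() * dx                                  # normalise the pdf

    print("posterior mean:", (grid * post).sum() * dx, " true:", mu_true)
    ```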

  2. Bayesian Peptide Peak Detection for High Resolution TOF Mass Spectrometry.

    Science.gov (United States)

    Zhang, Jianqiu; Zhou, Xiaobo; Wang, Honghui; Suffredini, Anthony; Zhang, Lin; Huang, Yufei; Wong, Stephen

    2010-11-01

    In this paper, we address the issue of peptide ion peak detection for high resolution time-of-flight (TOF) mass spectrometry (MS) data. A novel Bayesian peptide ion peak detection method is proposed for TOF data with a resolution of 10,000-15,000 full width at half-maximum (FWHM). MS spectra exhibit distinct characteristics at this resolution, which are captured in a novel parametric model. Based on the proposed parametric model, a Bayesian peak detection algorithm based on Markov chain Monte Carlo (MCMC) sampling is developed. The proposed algorithm is tested on both simulated and real datasets. The results show a significant improvement in detection performance over a commonly employed method. The results also agree with experts' visual inspection. Moreover, better detection consistency is achieved across MS datasets from patients with the same pathological condition.

  3. Integration of MDSplus in real-time systems

    International Nuclear Information System (INIS)

    Luchetta, A.; Manduchi, G.; Taliercio, C.

    2006-01-01

    RFX-mod makes extensive usage of real-time systems for feedback control and uses MDSplus to interface them to the main Data Acquisition system. For this purpose, the core of MDSplus has been ported to VxWorks, the operating system used for real-time control in RFX. Using this approach, it is possible to integrate real-time systems, but MDSplus is used only for non-real-time tasks, i.e. those tasks which are executed before and after the pulse and whose performance does not affect the system time constraints. More extensive use of MDSplus in real-time systems is foreseen, and a real-time layer for MDSplus is under development, which will provide access to memory-mapped pulse files, shared by the tasks running on the same CPU. Real-time communication will also be integrated in the MDSplus core to provide support for distributed memory-mapped pulse files

  4. Dense time discretization technique for verification of real time systems

    International Nuclear Information System (INIS)

    Makackas, Dalius; Miseviciene, Regina

    2016-01-01

    When verifying a real-time system, there are two different models for handling time: discrete and dense time based models. This paper presents a novel verification technique, which calculates discrete time intervals from dense time in order to create all the system states that can be reached from the initial system state. The technique is designed for real-time systems specified by a piece-linear aggregate approach. Key words: real-time system, dense time, verification, model checking, piece-linear aggregate

  5. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  6. A Bayesian decision approach to rainfall thresholds based flood warning

    Directory of Open Access Journals (Sweden)

    M. L. V. Martina

    2006-01-01

    Full Text Available Operational real time flood forecasting systems generally require a hydrological model to run in real time as well as a series of hydro-informatics tools to transform the flood forecast into relatively simple and clear messages to the decision makers involved in flood defense. The scope of this paper is to set forth the possibility of providing flood warnings at given river sections based on the direct comparison of the quantitative precipitation forecast with critical rainfall threshold values, without the need of an on-line real time forecasting system. This approach leads to an extremely simplified alert system to be used by non technical stakeholders and could also be used to supplement the traditional flood forecasting systems in case of system failures. The critical rainfall threshold values, incorporating the soil moisture initial conditions, result from statistical analyses using long hydrological time series combined with a Bayesian utility function minimization. In the paper, results of an application of the proposed methodology to the Sieve river, a tributary of the Arno river in Italy, are given to exemplify its practical applicability.
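    The decision step can be sketched compactly: given the forecast probability that rainfall exceeds the critical threshold, a warning is issued only when its expected loss is lower than that of staying quiet. The loss values below are illustrative, not those of the Sieve river application.

    ```python
    # Losses (arbitrary units): a false alarm is cheap, a missed flood is expensive.
    LOSS = {("warn", "flood"): 10,   ("warn", "no_flood"): 10,   # cost of mobilising
            ("quiet", "flood"): 200, ("quiet", "no_flood"): 0}

    def best_action(p_exceed: float) -> str:
        """Choose the action minimising expected loss given P(rainfall > threshold)."""
        expected = {a: p_exceed * LOSS[(a, "flood")]
                       + (1 - p_exceed) * LOSS[(a, "no_flood")]
                    for a in ("warn", "quiet")}
        return min(expected, key=expected.get)

    for p in (0.01, 0.04, 0.2, 0.6):
        print(f"P(exceed)={p:4.2f} ->", best_action(p))
    ```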

  7. Sparse Bayesian Learning for Nonstationary Data Sources

    Science.gov (United States)

    Fujimaki, Ryohei; Yairi, Takehisa; Machida, Kazuo

    This paper proposes an online Sparse Bayesian Learning (SBL) algorithm for modeling nonstationary data sources. Although most learning algorithms implicitly assume that a data source does not change over time (stationary), one in the real world usually does, due to various factors such as dynamically changing environments, device degradation, sudden failures, etc. (nonstationary). The proposed algorithm can be made usable for stationary online SBL by setting the time decay parameters to zero, and as such it can be interpreted as a single unified framework for online SBL for use with stationary and nonstationary data sources. Tests both on four types of benchmark problems and on actual stock price data have shown it to perform well.

  8. Continuous time modelling of dynamical spatial lattice data observed at sparsely distributed times

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl; Møller, Jesper

    2007-01-01

    Summary. We consider statistical and computational aspects of simulation-based Bayesian inference for a spatial-temporal model based on a multivariate point process which is only observed at sparsely distributed times. The point processes are indexed by the sites of a spatial lattice......, and they exhibit spatial interaction. For specificity we consider a particular dynamical spatial lattice data set which has previously been analysed by a discrete time model involving unknown normalizing constants. We discuss the advantages and disadvantages of using continuous time processes compared...... with discrete time processes in the setting of the present paper as well as other spatial-temporal situations....

  9. Evaluation of Real-Time Performance of the Virtual Seismologist Earthquake Early Warning Algorithm in Switzerland and California

    Science.gov (United States)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Heaton, T. H.

    2012-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms - the other two being ElarmS (Allen and Kanamori, 2003) and On-Site (Wu and Kanamori, 2005; Boese et al., 2008) algorithms - that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS will be installed and tested at other European networks. VS has been running in real-time on stations of the Southern California Seismic Network (SCSN) since July 2008, and on stations of the Berkeley Digital Seismic Network (BDSN) and the USGS Menlo Park strong motion network in northern California since February 2009. In Switzerland, VS has been running in real-time on stations monitored by the Swiss Seismological Service (including stations from Austria, France, Germany, and Italy) since 2010. We present summaries of the real-time performance of VS in Switzerland and California over the past two and three years respectively. The empirical relationships used by VS to estimate magnitudes and ground motion, originally derived from southern California data, are demonstrated to perform well in northern California and Switzerland. Implementation in real-time and off-line testing in Europe will potentially be extended to southern Italy, western Greece, Istanbul, Romania, and Iceland. Integration of the VS algorithm into both the CISN Advanced

  10. Energy-efficient fault tolerance in multiprocessor real-time systems

    Science.gov (United States)

    Guo, Yifeng

    The recent progress in the multiprocessor/multicore systems has important implications for real-time system design and operation. From vehicle navigation to space applications as well as industrial control systems, the trend is to deploy multiple processors in real-time systems: systems with 4 -- 8 processors are common, and it is expected that many-core systems with dozens of processing cores will be available in the near future. For such systems, in addition to the general temporal requirement common to all real-time systems, two additional operational objectives are seen as critical: energy efficiency and fault tolerance. An intriguing dimension of the problem is that energy efficiency and fault tolerance are typically conflicting objectives, due to the fact that tolerating faults (e.g., permanent/transient) often requires extra resources with high energy consumption potential. In this dissertation, various techniques for energy-efficient fault tolerance in multiprocessor real-time systems have been investigated. First, the Reliability-Aware Power Management (RAPM) framework, which can preserve the system reliability with respect to transient faults when Dynamic Voltage Scaling (DVS) is applied for energy savings, is extended to support parallel real-time applications with precedence constraints. Next, the traditional Standby-Sparing (SS) technique for dual processor systems, which takes both transient and permanent faults into consideration while saving energy, is generalized to support multiprocessor systems with an arbitrary number of identical processors. Observing the inefficient usage of slack time in the SS technique, a Preference-Oriented Scheduling Framework is designed to address the problem where tasks are given preferences for being executed as soon as possible (ASAP) or as late as possible (ALAP). A preference-oriented earliest deadline (POED) scheduler is proposed and its application in multiprocessor systems for energy-efficient fault tolerance is

  11. Bayesian modeling of ChIP-chip data using latent variables.

    KAUST Repository

    Wu, Mingqi

    2009-10-26

    BACKGROUND: The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although Bayesian methods can potentially work better than the other two classes of methods, thanks to their integrated treatment of the uncertainty in both the models and the model parameters, the existing Bayesian methods do not perform satisfactorily. They usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU times owing to the MCMC simulations involved. RESULTS: In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to the outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. CONCLUSION: The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results indicate that the

  12. Storm real-time processing cookbook

    CERN Document Server

    Anderson, Quinton

    2013-01-01

    A Cookbook with plenty of practical recipes for different uses of Storm. If you are a Java developer with basic knowledge of real-time processing and would like to learn Storm to process unbounded streams of data in real time, then this book is for you.

  13. Mixed-mode Operating System for Real-time Performance

    Directory of Open Access Journals (Sweden)

    Hasan M. M.

    2017-11-01

    Full Text Available The purpose of the mixed-mode system research is to handle devices with the accuracy of real-time systems while retaining all the benefits and facilities of a mature graphical user interface (GUI) operating system, which is typically non-real-time. This mixed-mode operating system, comprising a real-time portion and a non-real-time portion, was studied and implemented to identify its feasibility and performance in practical applications (in the context of scheduling real-time events). In this research, an i8751 microcontroller-based hardware setup was used to measure the performance of the system in real-time-only as well as non-real-time-only configurations. The real-time portion is a 486DX-40 IBM PC system running a DOS-based real-time kernel, and the non-real-time portion is a Pentium III-based system running Windows NT. It was found that the mixed-mode system performed as well as a typical real-time system and in fact gave many additional benefits, such as simplified/modular programming and load tolerance.

  14. Bayesian Plackett-Luce Mixture Models for Partially Ranked Data.

    Science.gov (United States)

    Mollica, Cristina; Tardella, Luca

    2017-06-01

    The elicitation of an ordinal judgment on multiple alternatives is often required in many psychological and behavioral experiments to investigate preference/choice orientation of a specific population. The Plackett-Luce model is one of the most popular and frequently applied parametric distributions to analyze rankings of a finite set of items. The present work introduces a Bayesian finite mixture of Plackett-Luce models to account for unobserved sample heterogeneity of partially ranked data. We describe an efficient way to incorporate the latent group structure in the data augmentation approach and the derivation of existing maximum likelihood procedures as special instances of the proposed Bayesian method. Inference can be conducted with the combination of the Expectation-Maximization algorithm for maximum a posteriori estimation and the Gibbs sampling iterative procedure. We additionally investigate several Bayesian criteria for selecting the optimal mixture configuration and describe diagnostic tools for assessing the fitness of ranking distributions conditionally and unconditionally on the number of ranked items. The utility of the novel Bayesian parametric Plackett-Luce mixture for characterizing sample heterogeneity is illustrated with several applications to simulated and real preference ranked data. We compare our method with the frequentist approach and a Bayesian nonparametric mixture model both assuming the Plackett-Luce model as a mixture component. Our analysis on real datasets reveals the importance of an accurate diagnostic check for an appropriate in-depth understanding of the heterogeneous nature of the partial ranking data.
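
    The building block of such a mixture is the Plackett-Luce probability of an observed (possibly partial) ranking, which factorizes into a sequence of choices among the items not yet ranked. Below is a minimal Python sketch of that likelihood under assumed positive item "worth" parameters; function and variable names are hypothetical, and the mixture itself would combine several such terms with component weights.

    ```python
    import numpy as np

    def plackett_luce_logprob(ranking, support):
        """Log-probability of an ordered (possibly partial) ranking under the
        Plackett-Luce model. `ranking` lists item indices from best to worst;
        `support` holds the positive item worths. At each stage the chosen item
        wins with probability proportional to its worth among the remaining items."""
        support = np.asarray(support, dtype=float)
        remaining = list(range(len(support)))
        logp = 0.0
        for item in ranking:
            logp += np.log(support[item]) - np.log(support[remaining].sum())
            remaining.remove(item)
        return logp

    # Example: 4 items, a partial ranking of only the top 2.
    print(plackett_luce_logprob([2, 0], support=[1.0, 0.5, 2.0, 0.5]))
    ```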

  15. A Hierarchical Bayesian Model for the Identification of PET Markers Associated to the Prediction of Surgical Outcome after Anterior Temporal Lobe Resection

    Directory of Open Access Journals (Sweden)

    Sharon Chiang

    2017-12-01

    Full Text Available We develop an integrative Bayesian predictive modeling framework that identifies individual pathological brain states based on the selection of fluoro-deoxyglucose positron emission tomography (PET) imaging biomarkers and evaluates the association of those states with a clinical outcome. We consider data from a study on temporal lobe epilepsy (TLE) patients who subsequently underwent anterior temporal lobe resection. Our modeling framework looks at the observed profiles of regional glucose metabolism in PET as the phenotypic manifestation of a latent individual pathologic state, which is assumed to vary across the population. The modeling strategy we adopt allows the identification of patient subgroups characterized by latent pathologies differentially associated with the clinical outcome of interest. It also identifies imaging biomarkers characterizing the pathological states of the subjects. In the data application, we identify a subgroup of TLE patients at high risk for post-surgical seizure recurrence after anterior temporal lobe resection, together with a set of discriminatory brain regions that can be used to distinguish the latent subgroups. We show that the proposed method achieves high cross-validated accuracy in predicting post-surgical seizure recurrence.

  16. Research in Distributed Real-Time Systems

    Science.gov (United States)

    Mukkamala, R.

    1997-01-01

    This document summarizes the progress we have made on our study of issues concerning the schedulability of real-time systems. Our study has produced several results in the scalability issues of distributed real-time systems. In particular, we have used our techniques to resolve schedulability issues in distributed systems with end-to-end requirements. During the next year (1997-98), we propose to extend the current work to address the modeling and workload characterization issues in distributed real-time systems. In particular, we propose to investigate the effect of different workload models and component models on the design and the subsequent performance of distributed real-time systems.

  17. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  18. Toward transient finite element simulation of thermal deformation of machine tools in real-time

    Science.gov (United States)

    Naumann, Andreas; Ruprecht, Daniel; Wensch, Joerg

    2018-01-01

    Finite element models without simplifying assumptions can accurately describe the spatial and temporal distribution of heat in machine tools as well as the resulting deformation. In principle, this makes it possible to correct for displacements of the Tool Centre Point and enables high-precision manufacturing. However, the computational cost of FE models and the restriction to generic algorithms in commercial tools like ANSYS prevent their operational use, since simulations have to run faster than real time. For the case where heat diffusion is slow compared to machine movement, we introduce a tailored implicit-explicit multi-rate time stepping method of higher order based on spectral deferred corrections. Using the open-source FEM library DUNE, we show that fully coupled simulations of the temperature field are possible in real time for a machine consisting of a stock sliding up and down on rails attached to a stand.
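
    As a rough illustration of the implicit-explicit (IMEX) splitting idea that underlies such multi-rate schemes, and not the paper's higher-order spectral-deferred-correction method, the following Python sketch advances a 1-D heat equation by treating the stiff diffusion term implicitly over the full step while accumulating a fast, time-varying heat source with explicit substeps; all names and parameter values are hypothetical assumptions.

    ```python
    import numpy as np

    def imex_heat_step(u, dt, dx, kappa, heat_source, n_fast=10):
        """One first-order IMEX step for u_t = kappa*u_xx + q(t, x):
        diffusion (slow, stiff) is integrated implicitly with the full step dt,
        while the source q (fast) is accumulated over n_fast explicit substeps,
        a crude stand-in for a multi-rate scheme. Zero boundary values assumed."""
        n = len(u)
        # Explicit substeps for the fast source term.
        q_accum = np.zeros(n)
        for k in range(n_fast):
            q_accum += heat_source(k * dt / n_fast) * (dt / n_fast)
        rhs = u + q_accum
        # Implicit Euler for diffusion: (I - dt*kappa*L) u_new = rhs.
        r = dt * kappa / dx ** 2
        A = (np.diag((1 + 2 * r) * np.ones(n))
             + np.diag(-r * np.ones(n - 1), 1)
             + np.diag(-r * np.ones(n - 1), -1))
        return np.linalg.solve(A, rhs)
    ```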

  19. Unsupervised Bayesian linear unmixing of gene expression microarrays.

    Science.gov (United States)

    Bazot, Cécile; Dobigeon, Nicolas; Tourneret, Jean-Yves; Zaas, Aimee K; Ginsburg, Geoffrey S; Hero, Alfred O

    2013-03-19

    This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model for the data samples which are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. The particularity of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non-negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals have been inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores
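
    The constrained mixing model described above (non-negative signatures, factor scores on the probability simplex) can be written compactly as Y = M A + noise. The following Python sketch merely simulates data from such a generative model to make the constraints concrete; it is not the uBLU Gibbs sampler, and all function names and parameter values are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_constrained_mixture(n_genes=200, n_samples=30, n_factors=3, noise_sd=0.05):
        """Draw data from a constrained linear mixing model: each sample is a
        non-negative combination of gene signatures, with the mixing coefficients
        (factor scores) non-negative and summing to one per sample."""
        M = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_factors))    # signatures >= 0
        A = rng.dirichlet(alpha=np.ones(n_factors), size=n_samples).T     # scores on the simplex
        Y = M @ A + noise_sd * rng.normal(size=(n_genes, n_samples))
        return Y, M, A

    Y, M, A = simulate_constrained_mixture()
    print(Y.shape, A.sum(axis=0))   # each column of A sums to 1
    ```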

  20. Real-time data access layer for MDSplus

    International Nuclear Information System (INIS)

    Manduchi, G.; Luchetta, A.; Taliercio, C.; Fredian, T.; Stillerman, J.

    2008-01-01

    Recent extensions to MDSplus allow data handling in long discharges and provide a real-time data access and communication layer. The real-time data access layer is an additional component of MDSplus: it is possible to use the traditional MDSplus API during normal operation, and to select a subset of data items to be used in real time. Real-time notification is provided by a communication layer using a publish-subscribe pattern. The notification covers processes sharing the same data items even running on different machines, thus allowing the implementation of distributed control systems. The real-time data access layer has been developed for Windows, Linux, and VxWorks; it is currently being ported to Linux RTAI. In order to quantify the fingerprint of the presented system, the performance of the real-time access layer approach is compared with that of an ad hoc, manually optimized program in a sample real-time application

  1. A Real-Time Systems Symposium Preprint.

    Science.gov (United States)

    1983-09-01

    Real-Time Systems Symposium Preprint (interim technical report). This technical report contains a preprint of a paper accepted for presentation at the REAL-TIME SYSTEMS SYMPOSIUM, Arlington,

  2. Benefits of real-time gas management

    International Nuclear Information System (INIS)

    Nolty, R.; Dolezalek, D. Jr.

    1994-01-01

    In today's competitive gas gathering, processing, storage and transportation business environment, the requirements to do business are continually changing. These changes arise from government regulations such as the amendments to the Clean Air Act concerning the environment and FERC Order 636 concerning business practices. Other changes are due to advances in technology such as electronic flow measurement (EFM) and real-time communications capabilities within the gas industry. Gas gathering, processing, storage and transportation companies must be flexible in adapting to these changes to remain competitive. These dynamic requirements can be met with an open, real-time gas management computer information system. Such a system provides flexible services with a variety of software applications. Allocations, nominations management and gas dispatching are examples of applications that are provided on a real-time basis. By providing real-time services, the gas management system enables operations personnel to make timely adjustments within the current accounting period. Benefits realized from implementing a real-time gas management system include reduced unaccountable gas, reduced imbalance penalties, reduced regulatory violations, improved facility operations and better service to customers. These benefits give a company the competitive edge. This article discusses the applications provided, the benefits from implementing a real-time gas management system, and the definition of such a system

  3. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation ... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  4. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognitions of clinical events and temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.
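
    As a toy illustration of the rule-based normalization step (mapping relative time expressions to ISO-8601 dates given a document anchor date), and not MedTime's actual rule set, here is a minimal Python sketch; the function name and the handful of patterns are hypothetical, and a full system would need many more rules.

    ```python
    import re
    from datetime import date, timedelta

    RELATIVE_DAYS = {"yesterday": -1, "today": 0, "tomorrow": 1}

    def normalize_relative_date(expression, anchor):
        """Map a simple relative time expression to an ISO-8601 date using a
        document anchor date (e.g., the admission date in a clinical note)."""
        expr = expression.lower().strip()
        if expr in RELATIVE_DAYS:
            return (anchor + timedelta(days=RELATIVE_DAYS[expr])).isoformat()
        m = re.match(r"(\d+)\s+days?\s+ago", expr)
        if m:
            return (anchor - timedelta(days=int(m.group(1)))).isoformat()
        m = re.match(r"(\d+)\s+weeks?\s+ago", expr)
        if m:
            return (anchor - timedelta(weeks=int(m.group(1)))).isoformat()
        return None  # unhandled expression

    print(normalize_relative_date("3 days ago", anchor=date(2012, 6, 15)))  # 2012-06-12
    ```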

  5. Making real-time reactive systems reliable

    Science.gov (United States)

    Marzullo, Keith; Wood, Mark

    1990-01-01

    A reactive system is characterized by a control program that interacts with an environment (or controlled program). The control program monitors the environment and reacts to significant events by sending commands to the environment. This structure is quite general. Not only are most embedded real time systems reactive systems, but so are monitoring and debugging systems and distributed application management systems. Since reactive systems are usually long running and may control physical equipment, fault tolerance is vital. The research tries to understand the principal issues of fault tolerance in real time reactive systems and to build tools that allow a programmer to design reliable, real time reactive systems. In order to make real time reactive systems reliable, several issues must be addressed: (1) How can a control program be built to tolerate failures of sensors and actuators? To achieve this, a methodology was developed for transforming a control program that references physical values into one that tolerates sensors that can fail and can return inaccurate values; (2) How can the real time reactive system be built to tolerate failures of the control program? Towards this goal, whether the techniques presented can be extended to real time reactive systems is investigated; and (3) How can the environment be specified in a way that is useful for writing a control program? Towards this goal, whether a system with real time constraints can be expressed as an equivalent system without such constraints is also investigated.

  6. Bayesian signal processing classical, modern, and particle filtering methods

    CERN Document Server

    Candy, James V

    2016-01-01

    This book aims to give readers a unified Bayesian treatment starting from the basics (Bayes' rule) to the more advanced (Monte Carlo sampling), evolving to the next-generation model-based techniques (sequential Monte Carlo sampling). This next edition incorporates a new chapter on "Sequential Bayesian Detection," a new section on "Ensemble Kalman Filters" as well as an expansion of Case Studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems incorporating detailed particle filter designs, adaptive particle filters and sequential Bayesian detectors. In addition to these major developments a variety of sections are expanded to "fill-in-the gaps" of the first edition. Here metrics for particle filter (PF) designs with emphasis on classical "sanity testing" lead to ensemble techniques as a basic requirement for performance analysis. The expansion of information theory metrics and their application to PF designs is fully developed an...

  7. Bayesian multi-scale smoothing of photon-limited images with applications to astronomy and medicine

    Science.gov (United States)

    White, John

    Multi-scale models for smoothing Poisson signals or images have gained much attention over the past decade. A new Bayesian model is developed using the concept of the Chinese restaurant process to find structures in two-dimensional images when performing image reconstruction or smoothing. This new model performs very well when compared to other leading methodologies for the same problem. It is developed and evaluated theoretically and empirically throughout Chapter 2. The newly developed Bayesian model is extended to three-dimensional images in Chapter 3. The third dimension has numerous different applications, such as different energy spectra, another spatial index, or possibly a temporal dimension. Empirically, this method shows promise in reducing error with the use of simulation studies. A further development removes background noise in the image. This removal can further reduce the error and is done using a modeling adjustment and post-processing techniques. These details are given in Chapter 4. Applications to real world problems are given throughout. Photon-based images are common in astronomical imaging due to the collection of different types of energy such as X-Rays. Applications to real astronomical images are given, and these consist of X-ray images from the Chandra X-ray observatory satellite. Diagnostic medicine uses many types of imaging such as magnetic resonance imaging and computed tomography that can also benefit from smoothing techniques such as the one developed here. Reducing the amount of radiation a patient takes will make images more noisy, but this can be mitigated through the use of image smoothing techniques. Both types of images represent the potential real world use for these methods.

  8. A study of finite mixture model: Bayesian approach on financial time series data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model is a mixture of distributions used to model a statistical distribution, while the Bayesian method is a statistical approach used to fit the mixture model. The Bayesian method is widely used because its asymptotic properties provide remarkable results. In addition, the Bayesian method also shows a consistency characteristic, which means the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is studied by using the Bayesian Information Criterion. Identifying the number of components is important because an incorrect choice may lead to invalid results. Later, the Bayesian method is utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. Lastly, the results showed that there is a negative effect between rubber prices and stock market prices for all selected countries.
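
    The BIC-based choice of the number of mixture components can be sketched as follows. This uses scikit-learn's GaussianMixture on synthetic stand-in data rather than the rubber/stock price series analysed in the paper, and the component range and settings are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    # Stand-in for paired (rubber price, stock index) returns; the real data are assumed elsewhere.
    returns = np.vstack([rng.normal(0.0, 0.5, (300, 2)),
                         rng.normal([1.0, -1.0], 0.3, (200, 2))])

    bic_scores = {}
    for k in range(1, 6):
        gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(returns)
        bic_scores[k] = gm.bic(returns)

    best_k = min(bic_scores, key=bic_scores.get)   # lower BIC is preferred
    print(bic_scores, "-> chosen k =", best_k)
    ```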

  9. Space Weather and Real-Time Monitoring

    Directory of Open Access Journals (Sweden)

    S Watari

    2009-04-01

    Full Text Available Recent advances in information and communications technology make it possible to collect a large amount of ground-based and space-based observation data in real time. These real-time data enable nowcasting of space weather. This paper reports a history of space weather services by the International Space Environment Service (ISES), in association with the International Geophysical Year (IGY), and the importance of real-time monitoring in space weather.

  10. Research Directions in Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    This report summarizes a survey of published research in real-time systems. Material is presented that provides an overview of the topic, focusing on ... communications protocols and scheduling techniques. It is noted that real-time systems deserve special attention separate from other areas because of ... formal tools for design and analysis of real-time systems. The early work on applications as well as notable theoretical advances are summarized

  11. Bayesian site selection for fast Gaussian process regression

    KAUST Repository

    Pourhabib, Arash; Liang, Faming; Ding, Yu

    2014-01-01

    Gaussian Process (GP) regression is a popular method in the field of machine learning and computer experiment designs; however, its ability to handle large data sets is hindered by the computational difficulty in inverting a large covariance matrix. Likelihood approximation methods were developed as a fast GP approximation, thereby reducing the computation cost of GP regression by utilizing a much smaller set of unobserved latent variables called pseudo points. This article reports a further improvement to the likelihood approximation methods by simultaneously deciding both the number and locations of the pseudo points. The proposed approach is a Bayesian site selection method where both the number and locations of the pseudo inputs are parameters in the model, and the Bayesian model is solved using a reversible jump Markov chain Monte Carlo technique. Through a number of simulated and real data sets, it is demonstrated that with appropriate priors chosen, the Bayesian site selection method can produce a good balance between computation time and prediction accuracy: it is fast enough to handle large data sets that a full GP is unable to handle, and it improves, quite often remarkably, the prediction accuracy, compared with the existing likelihood approximations. © 2014 Taylor and Francis Group, LLC.

  12. Bayesian site selection for fast Gaussian process regression

    KAUST Repository

    Pourhabib, Arash

    2014-02-05

    Gaussian Process (GP) regression is a popular method in the field of machine learning and computer experiment designs; however, its ability to handle large data sets is hindered by the computational difficulty in inverting a large covariance matrix. Likelihood approximation methods were developed as a fast GP approximation, thereby reducing the computation cost of GP regression by utilizing a much smaller set of unobserved latent variables called pseudo points. This article reports a further improvement to the likelihood approximation methods by simultaneously deciding both the number and locations of the pseudo points. The proposed approach is a Bayesian site selection method where both the number and locations of the pseudo inputs are parameters in the model, and the Bayesian model is solved using a reversible jump Markov chain Monte Carlo technique. Through a number of simulated and real data sets, it is demonstrated that with appropriate priors chosen, the Bayesian site selection method can produce a good balance between computation time and prediction accuracy: it is fast enough to handle large data sets that a full GP is unable to handle, and it improves, quite often remarkably, the prediction accuracy, compared with the existing likelihood approximations. © 2014 Taylor and Francis Group, LLC.
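
    A minimal Python sketch of the pseudo-point (subset-of-regressors style) likelihood approximation that such methods build on is given below; here the pseudo-input locations are simply fixed, whereas the paper additionally infers their number and locations with reversible-jump MCMC. Function names, kernel settings and data are illustrative assumptions.

    ```python
    import numpy as np

    def rbf(X1, X2, lengthscale=1.0, variance=1.0):
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    def pseudo_point_gp_predict(X, y, Xm, Xstar, noise=0.1, jitter=1e-8):
        """Predictive mean at Xstar using m pseudo inputs Xm (m << n), so only
        m x m matrices are solved instead of the full n x n covariance."""
        Kmm = rbf(Xm, Xm) + jitter * np.eye(len(Xm))
        Kmn = rbf(Xm, X)
        Ksm = rbf(Xstar, Xm)
        A = noise ** 2 * Kmm + Kmn @ Kmn.T        # m x m system
        return Ksm @ np.linalg.solve(A, Kmn @ y)

    # Example: 200 noisy sine observations, 15 pseudo inputs.
    X = np.linspace(0, 10, 200)[:, None]
    y = np.sin(X).ravel() + 0.1 * np.random.default_rng(0).normal(size=200)
    Xm = np.linspace(0, 10, 15)[:, None]
    print(pseudo_point_gp_predict(X, y, Xm, np.array([[2.5], [7.5]])))
    ```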

  13. Application of xCELLigence RTCA Biosensor Technology for Revealing the Profile and Window of Drug Responsiveness in Real Time

    Directory of Open Access Journals (Sweden)

    Dan Kho

    2015-04-01

    Full Text Available The xCELLigence technology is a real-time cellular biosensor, which measures the net adhesion of cells to high-density gold electrode arrays printed on custom-designed E-plates. The strength of cellular adhesion is influenced by a myriad of factors that include cell type, cell viability, growth, migration, spreading and proliferation. We therefore hypothesised that xCELLigence biosensor technology would provide a valuable platform for the measurement of drug responses in a multitude of different experimental, clinical or pharmacological contexts. In this manuscript, we demonstrate how xCELLigence technology has been invaluable in the identification of (1 not only if cells respond to a particular drug, but (2 the window of drug responsiveness. The latter aspect is often left to educated guess work in classical end-point assays, whereas biosensor technology reveals the temporal profile of the response in real time, which enables both acute responses and longer term responses to be profiled within the same assay. In our experience, the xCELLigence biosensor technology is suitable for highly targeted drug assessment and also low to medium throughput drug screening, which produces high content temporal data in real time.

  14. Real-time GPS seismology using a single receiver: method comparison, error analysis and precision validation

    Science.gov (United States)

    Li, Xingxing

    2014-05-01

    displacements is accompanied by a drift due to the potential uncompensated errors. Li et al. (2013) presented a temporal point positioning (TPP) method to quickly capture coseismic displacements with a single GPS receiver in real time. The TPP approach can overcome the convergence problem of precise point positioning (PPP), and also avoids the integration and de-trending process of the variometric approach. The performance of TPP is demonstrated to be at the few-centimeter level of displacement accuracy even for a twenty-minute interval with real-time precise orbit and clock products. In this study, we firstly present and compare the observation models and processing strategies of the current existing single-receiver methods for real-time GPS seismology. Furthermore, we propose several refinements to the variometric approach in order to eliminate the drift trend in the integrated coseismic displacements. The mathematical relationship between these methods is discussed in detail and their equivalence is also proved. The impact of error components such as satellite ephemeris, ionospheric delay, tropospheric delay, and geometry change on the retrieved displacements is carefully analyzed and investigated. Finally, the performance of these single-receiver approaches for real-time GPS seismology is validated using 1 Hz GPS data collected during the Tohoku-Oki earthquake (Mw 9.0, March 11, 2011) in Japan. It is shown that an accuracy of a few centimeters in coseismic displacements is achievable. Keywords: High-rate GPS; real-time GPS seismology; a single receiver; PPP; variometric approach; temporal point positioning; error analysis; coseismic displacement; fault slip inversion;

  15. Moving through time: the role of personality in three real-life contexts.

    Science.gov (United States)

    Duffy, Sarah E; Feist, Michele I; McCarthy, Steven

    2014-01-01

    In English, two deictic space-time metaphors are in common usage: the Moving Ego metaphor conceptualizes the ego as moving forward through time and the Moving Time metaphor conceptualizes time as moving forward toward the ego (Clark, 1973). Although earlier research investigating the psychological reality of these metaphors has typically examined spatial influences on temporal reasoning (e.g., Boroditsky & Ramscar, 2002), recent lines of research have extended beyond this, providing initial evidence that personality differences and emotional experiences may also influence how people reason about events in time (Duffy & Feist, 2014; Hauser, Carter, & Meier, 2009; Richmond, Wilson, & Zinken, 2012). In this article, we investigate whether these relationships have force in real life. Building on the effects of individual differences in self-reported conscientiousness and procrastination found by Duffy and Feist (2014), we examined whether, in addition to self-reported conscientiousness and procrastination, there is a relationship between conscientious and procrastinating behaviors and temporal perspective. We found that participants who adopted the Moving Time perspective were more likely to exhibit conscientious behaviors, while those who adopted the Moving Ego perspective were more likely to procrastinate, suggesting that the earlier effects reach beyond the laboratory. Copyright © 2014 Cognitive Science Society, Inc.

  16. A Bayesian foundation for individual learning under uncertainty

    Directory of Open Access Journals (Sweden)

    Christoph eMathys

    2011-05-01

    Full Text Available Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next higher level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability

  17. A bayesian foundation for individual learning under uncertainty.

    Science.gov (United States)

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J; Stephan, Klaas E

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory.
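
    In the spirit of the precision-weighted update equations described above, the following Python sketch shows a single-level, trial-by-trial update in which the learning rate is the ratio of prediction uncertainty to total uncertainty (effectively a scalar Kalman-filter step). It is a simplified stand-in for, not a reproduction of, the paper's multi-level equations; all names and parameter values are hypothetical.

    ```python
    def precision_weighted_update(mu, sigma, outcome, volatility=0.1, obs_noise=1.0):
        """One trial of a simplified precision-weighted update: the belief mean
        moves toward the prediction error with a learning rate given by the
        ratio of prediction uncertainty to total uncertainty. `volatility`
        plays the role of the step size set by the level above."""
        sigma_pred = sigma + volatility                 # predictive variance before the outcome
        learning_rate = sigma_pred / (sigma_pred + obs_noise)
        prediction_error = outcome - mu
        mu_new = mu + learning_rate * prediction_error
        sigma_new = (1.0 - learning_rate) * sigma_pred
        return mu_new, sigma_new

    # Example: track a drifting quantity trial by trial.
    mu, sigma = 0.0, 1.0
    for y in [0.2, 0.5, 1.1, 0.9, 1.4]:
        mu, sigma = precision_weighted_update(mu, sigma, y)
    print(mu, sigma)
    ```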

  18. Real-Time MENTAT programming language and architecture

    Science.gov (United States)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  19. Real-time optical diagnostics of graphene growth induced by pulsed chemical vapor deposition

    Science.gov (United States)

    Puretzky, Alexander A.; Geohegan, David B.; Pannala, Sreekanth; Rouleau, Christopher M.; Regmi, Murari; Thonnard, Norbert; Eres, Gyula

    2013-06-01

    The kinetics and mechanisms of graphene growth on Ni films at 720-880 °C have been measured using fast pulses of acetylene and real-time optical diagnostics. In situ UV-Raman spectroscopy was used to unambiguously detect isothermal graphene growth at high temperatures, measure the growth kinetics with ~1 s temporal resolution, and estimate the fractional precipitation upon cooldown. Optical reflectivity and videography provided much faster temporal resolution. Both the growth kinetics and the fractional isothermal precipitation were found to be governed by the C2H2 partial pressure in the CVD pulse for a given film thickness and temperature, with up to ~94% of graphene growth occurring isothermally within 1 second at 800 °C at high partial pressures. At lower partial pressures, isothermal graphene growth is shown to continue 10 seconds after the gas pulse. These flux-dependent growth kinetics are described in the context of a dissolution/precipitation model, where carbon rapidly dissolves into the Ni film and later precipitates driven by gradients in the chemical potential. The combination of pulsed-CVD and real-time optical diagnostics opens new opportunities to understand and control the fast, sub-second growth of graphene on various substrates at high temperatures.

  20. Real Time Conference 2016 Overview

    Science.gov (United States)

    Luchetta, Adriano

    2017-06-01

    This is a special issue of the IEEE Transactions on Nuclear Science containing papers from the invited, oral, and poster presentation of the 20th Real Time Conference (RT2016). The conference was held June 6-10, 2016, at Centro Congressi Padova “A. Luciani,” Padova, Italy, and was organized by Consorzio RFX (CNR, ENEA, INFN, Università di Padova, Acciaierie Venete SpA) and the Istituto Nazionale di Fisica Nucleare. The Real Time Conference is multidisciplinary and focuses on the latest developments in real-time techniques in high-energy physics, nuclear physics, astrophysics and astroparticle physics, nuclear fusion, medical physics, space instrumentation, nuclear power instrumentation, general radiation instrumentation, and real-time security and safety. Taking place every second year, it is sponsored by the Computer Application in Nuclear and Plasma Sciences technical committee of the IEEE Nuclear and Plasma Sciences Society. RT2016 attracted more than 240 registrants, with a large proportion of young researchers and engineers. It had an attendance of 67 students from many countries.

  1. Run-time middleware to support real-time system scenarios

    NARCIS (Netherlands)

    Goossens, K.; Koedam, M.; Sinha, S.; Nelson, A.; Geilen, M.

    2015-01-01

    Systems on Chip (SOC) are powerful multiprocessor systems capable of running multiple independent applications, often with both real-time and non-real-time requirements. Scenarios exist at two levels: first, combinations of independent applications, and second, different states of a single

  2. Advanced real-time manipulation of video streams

    CERN Document Server

    Herling, Jan

    2014-01-01

    Diminished Reality is a new fascinating technology that removes real-world content from live video streams. This sensational live video manipulation actually removes real objects and generates a coherent video stream in real-time. Viewers cannot detect modified content. Existing approaches are restricted to moving objects and static or almost static cameras and do not allow real-time manipulation of video content. Jan Herling presents a new and innovative approach for real-time object removal with arbitrary camera movements.

  3. The effect of stimulus intensity on response time and accuracy in dynamic, temporally constrained environments.

    Science.gov (United States)

    Causer, J; McRobert, A P; Williams, A M

    2013-10-01

    The ability to make accurate judgments and execute effective skilled movements under severe temporal constraints are fundamental to elite performance in a number of domains including sport, military combat, law enforcement, and medicine. In two experiments, we examine the effect of stimulus strength on response time and accuracy in a temporally constrained, real-world, decision-making task. Specifically, we examine the effect of low stimulus intensity (black) and high stimulus intensity (sequin) uniform designs, worn by teammates, to determine the effect of stimulus strength on the ability of soccer players to make rapid and accurate responses. In both field- and laboratory-based scenarios, professional soccer players viewed developing patterns of play and were required to make a penetrative pass to an attacking player. Significant differences in response accuracy between uniform designs were reported in laboratory- and field-based experiments. Response accuracy was significantly higher in the sequin compared with the black uniform condition. Response times only differed between uniform designs in the laboratory-based experiment. These findings extend the literature into a real-world environment and have significant implications for the design of clothing wear in a number of domains. © 2012 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. Dynamic probability evaluation of safety levels of earth-rockfill dams using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Zi-wu Fan

    2009-06-01

    Full Text Available In order to accurately predict and control the aging process of dams, new information should be collected continuously to renew the quantitative evaluation of dam safety levels. Owing to the complex structural characteristics of dams, it is quite difficult to predict the time-varying factors affecting their safety levels. It is not feasible to employ dynamic reliability indices to evaluate the actual safety levels of dams. Based on the relevant regulations for dam safety classification in China, a dynamic probability description of dam safety levels was developed. Using the Bayesian approach and effective information mining, as well as real-time information, this study achieved more rational evaluation and prediction of dam safety levels. With the Bayesian expression of discrete stochastic variables, the a priori probabilities of the dam safety levels determined by experts were combined with the likelihood probability of the real-time check information, and the probability information for the evaluation of dam safety levels was renewed. The probability index was then applied to dam rehabilitation decision-making. This method helps reduce the difficulty and uncertainty of the evaluation of dam safety levels and complies with the current safe decision-making regulations for dams in China. It also enhances the application of current risk analysis methods for dam safety levels.
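
    The core renewal step, combining an expert prior over discrete safety levels with the likelihood of newly acquired check information, is just Bayes' rule for a discrete variable. Below is a minimal Python sketch with hypothetical level names and probabilities, not the paper's actual classification or data.

    ```python
    import numpy as np

    def update_safety_levels(prior, likelihood):
        """Bayes' rule for a discrete state: posterior is proportional to prior * likelihood.
        `prior` is the expert-assigned probability of each safety level,
        `likelihood` is P(observed inspection data | level)."""
        posterior = np.asarray(prior, float) * np.asarray(likelihood, float)
        return posterior / posterior.sum()

    # Hypothetical three levels (e.g., normal / degraded / unsafe).
    prior = [0.70, 0.25, 0.05]
    likelihood = [0.1, 0.6, 0.9]   # the new check information fits poorer states better
    print(update_safety_levels(prior, likelihood))
    ```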

  5. Cost-Effective Video Filtering Solution for Real-Time Vision Systems

    Directory of Open Access Journals (Sweden)

    Karl Martin

    2005-08-01

    Full Text Available This paper presents an efficient video filtering scheme and its implementation in a field-programmable logic device (FPLD). Since the proposed nonlinear, spatiotemporal filtering scheme is based on order statistics, its efficient implementation benefits from a bit-serial realization. The utilization of both the spatial and temporal correlation characteristics of the processed video significantly increases the computational demands on this solution, and thus, implementation becomes a significant challenge. Simulation studies reported in this paper indicate that the proposed pipelined bit-serial FPLD filtering solution can achieve speeds of up to 97.6 Mpixels/s and consumes 1700 to 2700 logic cells for the speed-optimized and area-optimized versions, respectively. Thus, the filter area represents only 6.6 to 10.5% of the Altera STRATIX EP1S25 device available on the Altera Stratix DSP evaluation board, which has been used to implement a prototype of the entire real-time vision system. As such, the proposed adaptive video filtering scheme is both practical and attractive for real-time machine vision and surveillance systems as well as conventional video and multimedia applications.

  6. On the choice of the demand and hydraulic modeling approach to WDN real-time simulation

    Science.gov (United States)

    Creaco, Enrico; Pezzinga, Giuseppe; Savic, Dragan

    2017-07-01

    This paper aims to analyze two demand modeling approaches, i.e., top-down deterministic (TDA) and bottom-up stochastic (BUA), with particular reference to their impact on the hydraulic modeling of water distribution networks (WDNs). In the applications, the hydraulic modeling is carried out through the extended period simulation (EPS) and unsteady flow modeling (UFM). Taking as benchmark the modeling conditions that are closest to the WDN's real operation (UFM + BUA), the analysis showed that the traditional use of EPS + TDA produces large pressure head and water discharge errors, which can be attenuated only when large temporal steps (up to 1 h in the case study) are used inside EPS. The use of EPS + BUA always yields better results. Indeed, EPS + BUA already gives a good approximation of the WDN's real operation when intermediate temporal steps (larger than 2 min in the case study) are used for the simulation. The trade-off between consistency of results and computational burden makes EPS + BUA the most suitable tool for real-time WDN simulation, while benefitting from data acquired through smart meters for the parameterization of demand generation models.

  7. Improved head direction command classification using an optimised Bayesian neural network.

    Science.gov (United States)

    Nguyen, Son T; Nguyen, Hung T; Taylor, Philip B; Middleton, James

    2006-01-01

    Assistive technologies have recently emerged to improve the quality of life of severely disabled people by enhancing their independence in daily activities. Since many of those individuals have limited or non-existing control from the neck downward, alternative hands-free input modalities have become very important for these people to access assistive devices. In hands-free control, head movement has been proved to be a very effective user interface as it can provide a comfortable, reliable and natural way to access the device. Recently, neural networks have been shown to be useful not only for real-time pattern recognition but also for creating user-adaptive models. Since multi-layer perceptron neural networks trained using standard back-propagation may cause poor generalisation, the Bayesian technique has been proposed to improve the generalisation and robustness of these networks. This paper describes the use of Bayesian neural networks in developing a hands-free wheelchair control system. The experimental results show that, with an optimised architecture, Bayesian neural network classifiers can detect head commands of wheelchair users accurately, irrespective of their levels of injury.

  8. Spatio-temporal analysis of Modified Omori law in Bayesian framework

    Science.gov (United States)

    Rezanezhad, V.; Narteau, C.; Shebalin, P.; Zoeller, G.; Holschneider, M.

    2017-12-01

    This work presents a study of the spatio-temporal evolution of the modified Omori parameters in southern California in the time period 1981-2016. A nearest-neighbor approach is applied for earthquake clustering. This study targets small mainshocks and corresponding big aftershocks (2.5 ≤ m_mainshock ≤ 4.5 and 1.8 ≤ m_aftershock ≤ 2.8). We invert for the spatio-temporal behavior of the c and p values (especially c) over the whole area using an MCMC-based maximum likelihood estimator. As parameterizing families we use Voronoi cells with randomly distributed cell centers. Considering that the c value represents a physical characteristic such as stress change, we expect to see a coherent c-value pattern over seismologically co-acting areas. This correlation of c values can actually be seen for the San Andreas, San Jacinto and Elsinore faults. Moreover, the depth dependence of the c value is studied, which shows a linear behavior of log(c) with respect to aftershock depth within 5 to 15 km.
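
    The modified Omori law intensity lambda(t) = K / (t + c)^p can be fitted to a single aftershock sequence by maximum likelihood; the sketch below shows the point-process log-likelihood and its optimisation, which is the per-sequence building block rather than the paper's spatio-temporal MCMC over Voronoi cells. Function names and starting values are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def omori_neg_loglik(params, times, T):
        """Negative log-likelihood of the modified Omori law
        lambda(t) = K / (t + c)^p for aftershock times observed in [0, T]."""
        K, c, p = params
        if K <= 0 or c <= 0 or p <= 0:
            return np.inf
        rate_term = np.sum(np.log(K) - p * np.log(times + c))
        if np.isclose(p, 1.0):
            integral = K * (np.log(T + c) - np.log(c))
        else:
            integral = K * ((T + c) ** (1 - p) - c ** (1 - p)) / (1 - p)
        return -(rate_term - integral)

    def fit_omori(times, T, start=(10.0, 0.05, 1.1)):
        res = minimize(omori_neg_loglik, start, args=(np.asarray(times, float), T),
                       method="Nelder-Mead")
        return dict(zip(("K", "c", "p"), res.x))
    ```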

  9. Real space process algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Baeten, J.C.M.

    1993-01-01

    The real time process algebra of Baeten and Bergstra [Formal Aspects of Computing, 3, 142-188 (1991)] is extended to real space by requiring the presence of spatial coordinates for each atomic action, in addition to the required temporal attribute. It is found that asynchronous communication

  10. Real-time radionuclide identification in γ-emitter mixtures based on spiking neural network

    International Nuclear Information System (INIS)

    Bobin, C.; Bichler, O.; Lourenço, V.; Thiam, C.; Thévenin, M.

    2016-01-01

    Portal radiation monitors dedicated to the prevention of illegal traffic of nuclear materials at international borders need to deliver as fast as possible a radionuclide identification of a potential radiological threat. Spectrometry techniques applied to identify the radionuclides contributing to γ-emitter mixtures are usually performed using off-line spectrum analysis. As an alternative to these usual methods, a real-time processing based on an artificial neural network and Bayes’ rule is proposed for fast radionuclide identification. The validation of this real-time approach was carried out using γ-emitter spectra (241Am, 133Ba, 207Bi, 60Co, 137Cs) obtained with a high-efficiency well-type NaI(Tl). The first tests showed that the proposed algorithm enables a fast identification of each γ-emitting radionuclide using the information given by the whole spectrum. Based on an iterative process, the on-line analysis only needs low-statistics spectra without energy calibration to identify the nature of a radiological threat. - Highlights: • A fast radionuclide identification algorithm applicable in spectroscopic portal monitors is presented. • The proposed algorithm combines a Bayesian sequential approach and a spiking neural network. • The algorithm was validated using the mixture of γ-emitter spectra provided by a well-type NaI(Tl) detector. • The radionuclide identification process is implemented using the whole γ-spectrum without energy calibration.
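
    The sequential Bayesian part of such an identification scheme can be sketched in a few lines: each detected event updates a posterior over candidate radionuclides using per-nuclide spectral templates. The sketch below omits the spiking-neural-network component described in the paper and uses hypothetical templates and channel counts.

    ```python
    import numpy as np

    def sequential_identification(event_channels, templates, prior=None):
        """Sequentially update the posterior over candidate radionuclides as events
        arrive. `templates` maps nuclide name -> normalized spectrum (probability
        of an event falling in each energy channel)."""
        names = list(templates)
        post = np.full(len(names), 1.0 / len(names)) if prior is None else np.asarray(prior, float)
        for channel in event_channels:            # one detected event at a time
            lik = np.array([templates[n][channel] for n in names])
            post *= lik + 1e-12                   # Bayes' rule, channel by channel
            post /= post.sum()
        return dict(zip(names, post))

    # Hypothetical 4-channel templates for two nuclides.
    templates = {"Cs-137": np.array([0.10, 0.20, 0.60, 0.10]),
                 "Co-60":  np.array([0.05, 0.15, 0.20, 0.60])}
    print(sequential_identification([2, 2, 3, 2], templates))
    ```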

  11. Archtecture of distributed real-time systems

    OpenAIRE

    Wing Leung, Cheuk

    2013-01-01

    The CRAFTERS (Constraint and Application Driven Framework for Tailoring Embedded Real-time System) project aims to address the problem of uncertainty and heterogeneity in a distributed system by providing seamless, portable connectivity and middleware. This thesis contributes to the project by investigating the techniques that can be used in a distributed real-time embedded system. The conclusion is that there is a list of specifications to be met in order to provide a transparent and real-time...

  12. Analyzing bioassay data using Bayesian methods-A primer

    International Nuclear Information System (INIS)

    Miller, G.; Inkret, W.C.; Schillaci, M.E.

    1997-01-01

    The classical statistics approach used in health physics for the interpretation of measurements is deficient in that it does not allow for the consideration of needle in a haystack effects, where events that are rare in a population are being detected. In fact, this is often the case in health physics measurements, and the false positive fraction is often very large using the prescriptions of classical statistics. Bayesian statistics provides an objective methodology to ensure acceptably small false positive fractions. The authors present the basic methodology and a heuristic discussion. Examples are given using numerically generated and real bioassay data (tritium). Various analytical models are used to fit the prior probability distribution, in order to test the sensitivity to choice of model. Parametric studies show that the normalized Bayesian decision level k_α = L_c/σ_0, where σ_0 is the measurement uncertainty for zero true amount, is usually in the range from 3 to 5 depending on the true positive rate. Four times σ_0, rather than approximately two times σ_0 as in classical statistics, would often seem a better choice for the decision level

  13. Real-time temperature field measurement based on acoustic tomography

    International Nuclear Information System (INIS)

    Bao, Yong; Jia, Jiabin; Polydorides, Nick

    2017-01-01

    Acoustic tomography can be used to measure the temperature field from the time-of-flight (TOF). In order to capture real-time temperature field changes and accurately yield quantitative temperature images, two improvements to the conventional acoustic tomography system are studied: simultaneous acoustic transmission and TOF collection along multiple ray paths, and an offline iteration reconstruction algorithm. During system operation, all the acoustic transceivers send modulated and filtered wideband Kasami sequences simultaneously to facilitate fast and accurate TOF measurements using cross-correlation detection. For image reconstruction, the iteration process is separated and executed offline beforehand to shorten computation time for online temperature field reconstruction. The feasibility and effectiveness of the developed methods are validated in the simulation study. The simulation results demonstrate that the proposed method can reduce the processing time per frame from 160 ms to 20 ms, while the reconstruction error remains less than 5%. Hence, the proposed method has great potential in the measurement of rapid temperature change with good temporal and spatial resolution. (paper)
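
A toy end-to-end sketch of the TOF measurement chain: a generic pseudorandom binary sequence stands in for the modulated wideband Kasami sequences, the TOF is taken from the peak of the cross-correlation, and the temperature along the path is recovered through the ideal-gas approximation c ≈ 20.05·√T for air. Path length, sampling rate and noise level are assumed values, not those of the cited system.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100_000                      # sampling rate in Hz (assumed)
path_len = 1.0                    # transducer separation in metres (assumed)
temp_true = 330.0                 # true air temperature in kelvin

# Pseudorandom binary probe signal standing in for the modulated Kasami sequence.
probe = rng.choice([-1.0, 1.0], size=1024)

# Simulate the received signal: the probe delayed by the true time of flight plus noise.
c_true = 20.05 * np.sqrt(temp_true)          # speed of sound in air (m/s), ideal-gas approx.
delay = int(round(path_len / c_true * fs))   # true delay in samples
received = np.zeros(4096)
received[delay:delay + probe.size] += probe
received += 0.5 * rng.standard_normal(received.size)

# TOF estimate: lag of the peak of the cross-correlation between received signal and probe.
corr = np.correlate(received, probe, mode="valid")
tof = np.argmax(corr) / fs

# Invert the speed-of-sound relation to recover the path-averaged temperature.
c_est = path_len / tof
temp_est = (c_est / 20.05) ** 2
print(f"TOF = {tof * 1e3:.3f} ms, estimated temperature = {temp_est:.1f} K")
```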

  14. Diagnosis and Reconfiguration using Bayesian Networks: An Electrical Power System Case Study

    Science.gov (United States)

    Knox, W. Bradley; Mengshoel, Ole

    2009-01-01

    Automated diagnosis and reconfiguration are important computational techniques that aim to minimize human intervention in autonomous systems. In this paper, we develop novel techniques and models in the context of diagnosis and reconfiguration reasoning using causal Bayesian networks (BNs). We take as starting point a successful diagnostic approach, using a static BN developed for a real-world electrical power system. We discuss in this paper the extension of this diagnostic approach along two dimensions, namely: (i) from a static BN to a dynamic BN; and (ii) from a diagnostic task to a reconfiguration task. More specifically, we discuss the auto-generation of a dynamic Bayesian network from a static Bayesian network. In addition, we discuss subtle, but important, differences between Bayesian networks when used for diagnosis versus reconfiguration. We discuss a novel reconfiguration agent, which models a system causally, including effects of actions through time, using a dynamic Bayesian network. Though the techniques we discuss are general, we demonstrate them in the context of electrical power systems (EPSs) for aircraft and spacecraft. EPSs are vital subsystems on-board aircraft and spacecraft, and many incidents and accidents of these vehicles have been attributed to EPS failures. We discuss a case study that provides initial but promising results for our approach in the setting of electrical power systems.

  15. The real-time price elasticity of electricity

    NARCIS (Netherlands)

    Lijesen, M.G.

    2007-01-01

    The real-time price elasticity of electricity contains important information on the demand response of consumers to the volatility of peak prices. Despite the importance, empirical estimates of the real-time elasticity are hardly available. This paper provides a quantification of the real-time

  16. Real-time prediction and gating of respiratory motion using an extended Kalman filter and Gaussian process regression

    International Nuclear Information System (INIS)

    Bukhari, W; Hong, S-M

    2015-01-01

    Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR + , implements a gating function without pre-specifying a particular region of the patient’s breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR + algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR + implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR + in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR + . The experimental results show that the EKF-GPR + algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR + reduces the patient-wise RMS error to 37%, 39% and 42
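
A deliberately simplified sketch of the cascade-plus-gating idea rather than the EKF-GPR+ implementation itself: a constant-velocity extrapolator stands in for the LCM-EKF, a small numpy Gaussian process regressor on recent extrapolation errors supplies the correction and its predictive variance, and the "beam" is held whenever the predictive standard deviation exceeds a threshold. The synthetic trace, window length, kernel settings and threshold are all assumptions.

```python
import numpy as np

def rbf(a, b, ls=8.0, amp=1.0):
    d = a[:, None] - b[None, :]
    return amp * np.exp(-0.5 * (d / ls) ** 2)

def gp_predict(x_tr, y_tr, x_te, noise):
    """Plain GP regression with an RBF kernel; returns predictive mean and std of y."""
    K = rbf(x_tr, x_tr) + noise * np.eye(x_tr.size)
    Ks = rbf(x_te, x_tr)
    alpha = np.linalg.solve(K, y_tr)
    v = np.linalg.solve(K, Ks.T)
    var_f = rbf(x_te, x_te).diagonal() - np.sum(Ks * v.T, axis=1)
    return Ks @ alpha, np.sqrt(np.clip(var_f, 0.0, None) + noise)

# Synthetic breathing-like trace with a burst of irregular motion around samples 200-260.
rng = np.random.default_rng(2)
t = np.arange(400.0)
trace = 10 * np.sin(2 * np.pi * t / 80) + rng.normal(0, 0.1, t.size)
trace[200:260] += rng.normal(0, 1.0, 60)

window, gate_threshold = 40, 0.8
held = []
for k in range(window + 2, t.size - 1):
    # Stage 1: constant-velocity extrapolation of the next sample (stand-in for the LCM-EKF).
    base_pred = 2 * trace[k] - trace[k - 1]
    # Stage 2: GP regression on recent extrapolation errors predicts the next error.
    hist = np.arange(k - window, k)
    errors = trace[hist + 1] - (2 * trace[hist] - trace[hist - 1])
    noise = max(np.var(errors), 1e-6)            # crude stand-in for hyperparameter fitting
    corr, std = gp_predict(hist.astype(float), errors, np.array([float(k)]), noise)
    pred = base_pred + corr[0]                   # corrected prediction of trace[k + 1]
    # Gating: hold the beam when the predictive uncertainty is large.
    if std[0] > gate_threshold:
        held.append(k)

# With this synthetic trace the held steps cluster around the irregular segment.
print(f"beam held on {len(held)} of {t.size - window - 3} look-ahead steps")
```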

  17. Real-time prediction and gating of respiratory motion using an extended Kalman filter and Gaussian process regression

    Science.gov (United States)

    Bukhari, W.; Hong, S.-M.

    2015-01-01

    Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR+, implements a gating function without pre-specifying a particular region of the patient’s breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR+ algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR+ implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR+ in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR+. The experimental results show that the EKF-GPR+ algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR+ reduces the patient-wise RMS error to 37%, 39% and 42% in

  18. Real-time prediction and gating of respiratory motion using an extended Kalman filter and Gaussian process regression.

    Science.gov (United States)

    Bukhari, W; Hong, S-M

    2015-01-07

    Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR(+), implements a gating function without pre-specifying a particular region of the patient's breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR(+) algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR(+) implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR(+) in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR(+). The experimental results show that the EKF-GPR(+) algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. As an example, EKF-GPR(+) reduces the patient-wise RMS error to 37%, 39% and

  19. A Bayesian Model of Biases in Artificial Language Learning: The Case of a Word-Order Universal

    Science.gov (United States)

    Culbertson, Jennifer; Smolensky, Paul

    2012-01-01

    In this article, we develop a hierarchical Bayesian model of learning in a general type of artificial language-learning experiment in which learners are exposed to a mixture of grammars representing the variation present in real learners' input, particularly at times of language change. The modeling goal is to formalize and quantify hypothesized…

  20. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments.

  1. Bayesian Age-Period-Cohort Model of Lung Cancer Mortality

    Directory of Open Access Journals (Sweden)

    Bhikhari P. Tharu

    2015-09-01

    Background: The objective of this study was to analyze the time trend of lung cancer mortality in the US population by 5-year intervals, based on the most recent available data (to 2010). Knowledge of the temporal trends in mortality rates is necessary to understand the cancer burden. Methods: A Bayesian Age-Period-Cohort model was fitted using Poisson regression with a histogram smoothing prior to decompose mortality rates by age at death, period at death, and birth cohort. Results: Mortality rates from lung cancer increased more rapidly from age 52 years, reaching on average about 325 deaths annually at age 82. The mortality of younger cohorts was lower than that of older cohorts. The risk of lung cancer decreased from the 1993 period onwards. Conclusions: The fitted Bayesian Age-Period-Cohort model with a histogram smoothing prior is capable of explaining the mortality rate of lung cancer. The reduction in carcinogens in cigarettes and the increase in smoking cessation from around 1960 might have led to the decreasing trend of lung cancer mortality after the 1993 calendar period.

  2. REAL TIME SYSTEM OPERATIONS 2006-2007

    Energy Technology Data Exchange (ETDEWEB)

    Eto, Joseph H.; Parashar, Manu; Lewis, Nancy Jo

    2008-08-15

    The Real Time System Operations (RTSO) 2006-2007 project focused on two parallel technical tasks: (1) Real-Time Applications of Phasors for Monitoring, Alarming and Control; and (2) Real-Time Voltage Security Assessment (RTVSA) Prototype Tool. The overall goal of the phasor applications project was to accelerate adoption and foster greater use of new, more accurate, time-synchronized phasor measurements by conducting research and prototyping applications on California ISO's phasor platform - Real-Time Dynamics Monitoring System (RTDMS) -- that provide previously unavailable information on the dynamic stability of the grid. Feasibility assessment studies were conducted on potential application of this technology for small-signal stability monitoring, validating/improving existing stability nomograms, conducting frequency response analysis, and obtaining real-time sensitivity information on key metrics to assess grid stress. Based on study findings, prototype applications for real-time visualization and alarming, small-signal stability monitoring, measurement based sensitivity analysis and frequency response assessment were developed, factory- and field-tested at the California ISO and at BPA. The goal of the RTVSA project was to provide California ISO with a prototype voltage security assessment tool that runs in real time within California ISO's new reliability and congestion management system. CERTS conducted a technical assessment of appropriate algorithms, developed a prototype incorporating state-of-the-art algorithms (such as the continuation power flow, direct method, boundary orbiting method, and hyperplanes) into a framework most suitable for an operations environment. Based on study findings, a functional specification was prepared, which the California ISO has since used to procure a production-quality tool that is now a part of a suite of advanced computational tools that is used by California ISO for reliability and congestion management.

  3. A study of real-time content marketing: formulating real-time content marketing based on content, search and social media

    OpenAIRE

    Nguyen, Thi Kim Duyen

    2015-01-01

    The primary objective of this research is to understand in depth the new concept of content marketing – real-time content marketing – from the perspective of digital marketing experts. In particular, the research focuses on real-time content marketing theories and on how to build a real-time content marketing strategy based on content, search and social media. It also examines how marketers measure and track the conversion rates of their real-time content marketing plans. Practically, th...

  4. Review of real-time on-line decision support system RODOS; RODOS-ohjelman arviointi

    Energy Technology Data Exchange (ETDEWEB)

    Rossi, J. [VTT Energy, Espoo (Finland). Nuclear Energy

    1997-01-01

    RODOS (Real Time Off-site Decision Support System) is a research project that aims at the development of a versatile decision support system for managing off-site consequence assessments of reactor accidents in real time in Europe and in the western parts of the former Soviet Union. The system employs both local and regional environmental radiation monitoring results and meteorological forecasts, from which the software prepares consistent predictions ranging from the release area to long distances and covering all temporal phases of the accident. The data obtained from the environment are processed using mathematical and physical models and presented in an intelligible form describing the prevalent or future environmental radiation situation. The software is intended for operative use by radiation safety authorities. Furthermore, it is suitable for education and training of rescue field personnel. (refs.).

  5. Application of XML in real-time data warehouse

    Science.gov (United States)

    Zhao, Yanhong; Wang, Beizhan; Liu, Lizhao; Ye, Su

    2009-07-01

    At present, XML is one of the most widely used technologies for describing and exchanging data, and the need for real-time data has made the real-time data warehouse a popular area of data warehouse research. What can be gained by applying XML technology to real-time data warehouse research? XML technology solves many technical problems that cannot be addressed in a traditional real-time data warehouse, and realizes the integration of the OLAP (On-line Analytical Processing) and OLTP (On-line Transaction Processing) environments. Then the real-time data warehouse can truly be called "real time".

  6. A Real-Time GIS Platform for High Sour Gas Leakage Simulation, Evaluation and Visualization

    Science.gov (United States)

    Li, M.; Liu, H.; Yang, C.

    2015-07-01

    The development of high-sulfur gas fields, also known as sour gas fields, faces a series of safety control and emergency management problems. High expectations are placed on GIS-based emergency response systems given the high pressure, high hydrogen sulphide content, complex terrain and densely populated areas of the Sichuan Basin, southwest China. Most research on the simulation and evaluation of high hydrogen sulphide gas dispersion is aimed at environmental impact assessment (EIA) or emergency preparedness planning. This paper introduces a real-time GIS platform for high-sulfur gas emergency response. Combining real-time data from the leak detection systems and the meteorological monitoring stations, the GIS platform provides functions for simulating, evaluating and displaying the different spatial-temporal toxic gas distribution patterns and evaluation results. The paper first proposes the architecture of the emergency response/management system, secondly explains the EPA Gaussian dispersion model CALPUFF simulation workflow under highly complex terrain with real-time data, and thirdly explains the emergency workflow and the spatial analysis functions for computing the accident-affected areas, the affected population and the optimal evacuation routes. Finally, a well blowout scenario is used to verify the system. The study shows that a GIS platform integrating real-time data and CALPUFF models will be one of the essential operational platforms for emergency management of high-sulfur gas fields.

  7. Real-time dynamic calibration of a tunable frequency laser source using a Fabry-Pérot interferometer

    Energy Technology Data Exchange (ETDEWEB)

    Mandula, Gábor, E-mail: mandula.gabor@wigner.mta.hu; Kis, Zsolt; Lengyel, Krisztián [Wigner Research Centre for Physics of the Hungarian Academy of Sciences, Konkoly-Thege Miklós út 29-33, H-1121 Budapest (Hungary)

    2015-12-15

    We report on a method for real-time dynamic calibration of a tunable external cavity diode laser by using a partially mode-matched plano-concave Fabry-Pérot interferometer in reflection geometry. Wide range laser frequency scanning is carried out by piezo-driven tilting of a diffractive grating playing the role of a frequency selective mirror in the laser cavity. The grating tilting system has considerable mechanical inertia, so static laser frequency calibration leads to false results. The proposed real-time dynamic calibration, based on the identification of primary and Gouy-effect-type secondary interference peaks with known frequency and temporal history, can be used for a wide scanning range (from 0.2 GHz to more than 1 GHz). A concave spherical mirror with a radius of R = 100 cm and a plane mirror with 1% transmission were used as a Fabry-Pérot interferometer with various resonator lengths to investigate and demonstrate real-time calibration procedures for two kinds of laser frequency scanning functions.

  8. A real-time crash prediction model for the ramp vicinities of urban expressways

    Directory of Open Access Journals (Sweden)

    Moinul Hossain

    2013-07-01

    Ramp vicinities are arguably well-known black spots on urban expressways. There, while maintaining high speed, drivers must simultaneously respond to several complex demands, such as maneuvering, reading road signs, planning their route and keeping a safe distance from other maneuvering vehicles, all of which require a higher level of cognitive response to ensure safety. Therefore, any additional discomfort caused by traffic dynamics may induce driving error resulting in a crash. This manuscript presents a methodology for identifying these dynamically forming hazardous traffic conditions near ramp vicinities with high-resolution real-time traffic flow data. It separates the ramp vicinities into four zones – upstream and downstream of entrance and exit ramps – and builds four separate real-time crash prediction models. Around two years (December 2007 to October 2009) of crash data, as well as the matching traffic sensor data, from the Shibuya 3 and Shinjuku 4 expressways under the jurisdiction of Tokyo Metropolitan Expressway Company Limited have been utilized for this research. Random multinomial logit, a forest of multinomial logit models, has been used to identify the most important variables. Finally, a real-time modeling method, the Bayesian belief net (BBN), has been employed to build the four models, using ramp flow, upstream flow and congestion index, and downstream flow and speed as variables. The newly proposed models could predict 50%, 42%, 43% and 55% of future crashes with around 10% false alarms for the downstream of entrance, downstream of exit, upstream of entrance and upstream of exit ramps, respectively. The models can be utilized in combination with various traffic smoothing measures, such as ramp metering, variable speed limits and warning messages through variable message signs, to enhance safety near ramp vicinities.
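
A toy illustration of how a Bayesian belief net turns observable traffic variables into a crash probability; the network structure and the conditional probability tables below are invented for illustration and are not the models learned from the Tokyo expressway data.

```python
# Hypothetical conditional probability tables for a toy ramp-vicinity network.
# Structure: Congestion -> Speed, and (Congestion, RampFlow) -> Crash.
P_congestion = {"low": 0.8, "high": 0.2}
P_rampflow = {"low": 0.6, "high": 0.4}
P_speed = {  # P(speed | congestion)
    "low":  {"normal": 0.85, "slow": 0.15},
    "high": {"normal": 0.25, "slow": 0.75},
}
P_crash = {  # P(crash = yes | congestion, rampflow)
    ("low", "low"): 0.001, ("low", "high"): 0.004,
    ("high", "low"): 0.010, ("high", "high"): 0.030,
}

def p_crash_given(rampflow, speed):
    """Inference by enumeration: marginalise the hidden congestion state."""
    num = den = 0.0
    for c, pc in P_congestion.items():
        joint = pc * P_rampflow[rampflow] * P_speed[c][speed]
        num += joint * P_crash[(c, rampflow)]
        den += joint
    return num / den

for rf, sp in [("low", "normal"), ("high", "slow")]:
    print(f"P(crash | ramp flow={rf}, upstream speed={sp}) = {p_crash_given(rf, sp):.4f}")
```

Even in this toy setting, conditioning on high ramp flow and slow upstream speed raises the crash probability by roughly an order of magnitude, which is the kind of real-time signal such models exploit.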

  9. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios, E-mail: junhankim@email.arizona.edu [Department of Astronomy and Steward Observatory, University of Arizona, 933 N. Cherry Avenue, Tucson, AZ 85721 (United States)

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
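
The statistical core, averaging the likelihood over model snapshots instead of comparing the data to a single time-averaged image, can be sketched with scalar mock data; the "snapshot libraries" and noise level below are synthetic stand-ins for GRMHD frames and EHT visibilities, not the actual analysis.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
sigma_obs = 1.0                      # measurement noise of each mock observation

# Mock snapshot libraries for two competing time-dependent models: each entry is a
# scalar summary (e.g. a visibility amplitude) that fluctuates from frame to frame.
model_a = rng.normal(10.0, 2.0, size=500)     # mean 10, strong intrinsic variability
model_b = rng.normal(12.0, 0.5, size=500)     # offset mean, little variability

# Mock observations drawn from model A's variable emission.
data = rng.choice(model_a, size=20) + rng.normal(0.0, sigma_obs, 20)

def log_evidence(snapshots, data):
    """Average the likelihood over snapshots, i.e. marginalise the unknown epoch."""
    total = 0.0
    for d in data:
        like = norm.pdf(d, loc=snapshots, scale=sigma_obs)   # likelihood per snapshot
        total += np.log(np.mean(like))
    return total

def log_like_time_averaged(snapshots, data):
    """Naive alternative: compare against the time-averaged image only."""
    return np.sum(norm.logpdf(data, loc=np.mean(snapshots), scale=sigma_obs))

print("variability-aware log-evidence:",
      log_evidence(model_a, data), log_evidence(model_b, data))
print("time-averaged log-likelihood:  ",
      log_like_time_averaged(model_a, data), log_like_time_averaged(model_b, data))
```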

  10. Experiences and recommendations in deploying a real-time, water quality monitoring system

    Science.gov (United States)

    O'Flynn, B.; Regan, F.; Lawlor, A.; Wallace, J.; Torres, J.; O'Mathuna, C.

    2010-12-01

    Monitoring of water quality at a river basin level to meet the requirements of the Water Framework Directive (WFD) using conventional sampling and laboratory-based techniques poses a significant financial burden. Wireless sensing systems offer the potential to reduce these costs considerably, as well as provide more useful, continuous monitoring capabilities by giving an accurate idea of the changing environmental and water quality in real time. It is unlikely that the traditional spot/grab sampling will provide a reasonable estimate of the true maximum and/or mean concentration for a particular physicochemical variable in a water body with marked temporal variability. When persistent fluctuations occur, it is likely only to be detected through continuous measurements, which have the capability of detecting sporadic peaks of concentration. Thus, in situ sensors capable of continuous sampling of parameters required under the WFD would therefore provide more up-to-date information, cut monitoring costs and provide better coverage representing long-term trends in fluctuations of pollutant concentrations. DEPLOY is a technology demonstration project, which began planning and station selection and design in August 2008 aiming to show how state-of-the-art technology could be implemented for cost-effective, continuous and real-time monitoring of a river catchment. The DEPLOY project is seen as an important building block in the realization of a wide area autonomous network of sensors capable of monitoring the spatial and temporal distribution of important water quality and environmental target parameters. The demonstration sites chosen are based in the River Lee, which flows through Ireland's second largest city, Cork, and were designed to include monitoring stations in five zones considered typical of significant river systems--these monitor water quality parameters such as pH, temperature, depth, conductivity, turbidity and dissolved oxygen. Over one million data points

  11. Experiences and recommendations in deploying a real-time, water quality monitoring system

    International Nuclear Information System (INIS)

    O'Flynn, B; O'Mathuna, C; Regan, F; Lawlor, A; Wallace, J; Torres, J

    2010-01-01

    Monitoring of water quality at a river basin level to meet the requirements of the Water Framework Directive (WFD) using conventional sampling and laboratory-based techniques poses a significant financial burden. Wireless sensing systems offer the potential to reduce these costs considerably, as well as provide more useful, continuous monitoring capabilities by giving an accurate idea of the changing environmental and water quality in real time. It is unlikely that the traditional spot/grab sampling will provide a reasonable estimate of the true maximum and/or mean concentration for a particular physicochemical variable in a water body with marked temporal variability. When persistent fluctuations occur, it is likely only to be detected through continuous measurements, which have the capability of detecting sporadic peaks of concentration. Thus, in situ sensors capable of continuous sampling of parameters required under the WFD would therefore provide more up-to-date information, cut monitoring costs and provide better coverage representing long-term trends in fluctuations of pollutant concentrations. DEPLOY is a technology demonstration project, which began planning and station selection and design in August 2008 aiming to show how state-of-the-art technology could be implemented for cost-effective, continuous and real-time monitoring of a river catchment. The DEPLOY project is seen as an important building block in the realization of a wide area autonomous network of sensors capable of monitoring the spatial and temporal distribution of important water quality and environmental target parameters. The demonstration sites chosen are based in the River Lee, which flows through Ireland's second largest city, Cork, and were designed to include monitoring stations in five zones considered typical of significant river systems – these monitor water quality parameters such as pH, temperature, depth, conductivity, turbidity and dissolved oxygen. Over one million data

  12. Mixed-mode Operating System for Real-time Performance

    OpenAIRE

    Hasan M. M.; Sultana S.; Foo C.K.

    2017-01-01

    The purpose of the mixed-mode system research is to handle devices with the accuracy of real-time systems while at the same time having all the benefits and facilities of a mature Graphic User Interface (GUI) operating system, which is typically non-real-time. This mixed-mode operating system, comprising a real-time portion and a non-real-time portion, was studied and implemented to identify the feasibilities and performances in practical applications (in the context of scheduling the real-time e...

  13. Bayesian Algorithm Implementation in a Real Time Exposure Assessment Model on Benzene with Calculation of Associated Cancer Risks

    OpenAIRE

    Sarigiannis, Dimosthenis A.; Karakitsios, Spyros P.; Gotti, Alberto; Papaloukas, Costas L.; Kassomenos, Pavlos A.; Pilidis, Georgios A.

    2009-01-01

    The objective of the current study was the development of a reliable modeling platform to calculate in real time the personal exposure and the associated health risk for filling station employees evaluating current environmental parameters (traffic, meteorological and amount of fuel traded) determined by the appropriate sensor network. A set of Artificial Neural Networks (ANNs) was developed to predict benzene exposure pattern for the filling station employees. Furthermore, a Physiology Based...

  14. Graded Alternating-Time Temporal Logic

    Science.gov (United States)

    Faella, Marco; Napoli, Margherita; Parente, Mimmo

    Graded modalities enrich the universal and existential quantifiers with the capability to express the concept of at least k or all but k, for a non-negative integer k. Recently, temporal logics such as μ-calculus and Computational Tree Logic, Ctl, augmented with graded modalities have received attention from the scientific community, both from a theoretical side and from an applicative perspective. Both μ-calculus and Ctl naturally apply as specification languages for closed systems: in this paper, we add graded modalities to the Alternating-time Temporal Logic (Atl) introduced by Alur et al., to study how these modalities may affect specification languages for open systems.

  15. Mixed-mode Operating System for Real-time Performance

    Directory of Open Access Journals (Sweden)

    M.M. Hasan

    2017-11-01

    The purpose of the mixed-mode system research is to handle devices with the accuracy of real-time systems while at the same time having all the benefits and facilities of a mature Graphic User Interface (GUI) operating system, which is typically non-real-time. This mixed-mode operating system, comprising a real-time portion and a non-real-time portion, was studied and implemented to identify the feasibilities and performances in practical applications (in the context of scheduling the real-time events). In this research, i8751 microcontroller-based hardware was used to measure the performance of the system in real-time-only as well as non-real-time-only configurations. The real-time portion is a 486DX-40 IBM PC system running under a DOS-based real-time kernel and the non-real-time portion is a Pentium III-based system running under Windows NT. It was found that the mixed-mode system performed as well as a typical real-time system and, in fact, gave many additional benefits such as simplified/modular programming and load tolerance.

  16. Optimal task mapping in safety-critical real-time parallel systems

    International Nuclear Information System (INIS)

    Aussagues, Ch.

    1998-01-01

    This PhD thesis deals with the correct design of safety-critical real-time parallel systems. Such systems constitute a fundamental part of high-performance command and control systems found in the nuclear domain and, more generally, in parallel embedded systems. The verification of their temporal correctness is the core of this thesis. Our contribution consists mainly of the following three points: the analysis and extension of a programming model for such real-time parallel systems; the proposal of an original method based on a new operator of synchronized product of state-machine task graphs; and the validation of the approach by its implementation and evaluation. The work particularly addresses the main problem of optimal task mapping on a parallel architecture, such that the temporal constraints are globally guaranteed, i.e. the timeliness property holds. The results also incorporate optimality criteria for the sizing and correct dimensioning of a parallel system, for instance in the number of processing elements. These criteria are connected with operational constraints of the application domain. Our approach is based on the off-line analysis of the feasibility of the deadline-driven dynamic scheduling used to schedule tasks inside one processor. This leads us to define the synchronized product, from which a system of linear constraints is automatically generated, allowing the maximum load of a group of tasks to be calculated and their timeliness constraints to be verified. The communications, their timeliness verification and their incorporation into the mapping problem are the second main contribution of this thesis. Finally, the global solving technique dealing with both task and communication aspects has been implemented and evaluated in the framework of the OASIS project at the LETI research center at CEA/Saclay. (author)
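
The off-line feasibility idea can be illustrated with a far simpler test than the synchronized-product formulation developed in the thesis: for independent periodic tasks with implicit deadlines, a candidate mapping is feasible under deadline-driven (EDF) scheduling if the utilisation on each processing element does not exceed one. The task set and mapping below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    wcet: float      # worst-case execution time
    period: float    # period (taken as the implicit deadline)

# Hypothetical task set and candidate mapping onto two processing elements.
tasks = [Task("sensor", 2, 10), Task("control", 5, 20), Task("log", 1, 100),
         Task("comm", 4, 25), Task("display", 8, 50)]
mapping = {"P0": ["sensor", "control", "log"], "P1": ["comm", "display"]}

def edf_feasible(mapping, tasks):
    """Utilisation-based EDF test: each processor must satisfy sum(C_i / T_i) <= 1."""
    by_name = {t.name: t for t in tasks}
    feasible = True
    for proc, names in mapping.items():
        u = sum(by_name[n].wcet / by_name[n].period for n in names)
        print(f"{proc}: utilisation = {u:.2f} -> {'ok' if u <= 1.0 else 'overloaded'}")
        feasible = feasible and u <= 1.0
    return feasible

print("mapping globally feasible:", edf_feasible(mapping, tasks))
```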

  17. Evaluation of Real-Time and Off-Line Performance of the Virtual Seismologist Earthquake Early Warning Algorithm in Switzerland

    Science.gov (United States)

    Behr, Yannik; Clinton, John; Cua, Georgia; Cauzzi, Carlo; Heimers, Stefan; Kästli, Philipp; Becker, Jan; Heaton, Thomas

    2013-04-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS installations in southern Italy, western Greece, Istanbul, Romania, and Iceland are planned or underway. In Switzerland, VS has been running in real-time on stations monitored by the Swiss Seismological Service (including stations from Austria, France, Germany, and Italy) since 2010. While originally based on the Earthworm system, it has recently been ported to the SeisComp3 system. Besides taking advantage of SeisComp3's picking and phase association capabilities, this greatly simplifies the potential installation of VS at other networks, in particular those already running SeisComp3. We present the architecture of the new SeisComp3-based version and compare its results from off-line tests with the real-time performance of VS in Switzerland over the past two years. We further show that the empirical relationships used by VS to estimate magnitudes and ground motion, originally derived from southern California data, perform well in Switzerland.

  18. Release the BEESTS: Bayesian Estimation of Ex-Gaussian STop-Signal Reaction Time Distributions

    Directory of Open Access Journals (Sweden)

    Dora eMatzke

    2013-12-01

    The stop-signal paradigm is frequently used to study response inhibition. In this paradigm, participants perform a two-choice response time task where the primary task is occasionally interrupted by a stop-signal that prompts participants to withhold their response. The primary goal is to estimate the latency of the unobservable stop response (stop-signal reaction time or SSRT). Recently, Matzke, Dolan, Logan, Brown, and Wagenmakers (in press) have developed a Bayesian parametric approach that allows for the estimation of the entire distribution of SSRTs. The Bayesian parametric approach assumes that SSRTs are ex-Gaussian distributed and uses Markov chain Monte Carlo sampling to estimate the parameters of the SSRT distribution. Here we present an efficient and user-friendly software implementation of the Bayesian parametric approach —BEESTS— that can be applied to individual as well as hierarchical stop-signal data. BEESTS comes with an easy-to-use graphical user interface and provides users with summary statistics of the posterior distribution of the parameters as well as various diagnostic tools to assess the quality of the parameter estimates. The software is open source and runs on Windows and OS X operating systems. In sum, BEESTS allows experimental and clinical psychologists to estimate entire distributions of SSRTs and hence facilitates the more rigorous analysis of stop-signal data.
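
BEESTS itself performs hierarchical Bayesian estimation via MCMC; as a smaller, hedged illustration of the distributional assumption, the sketch below writes down the ex-Gaussian log-density (a Normal plus an independent Exponential) and fits synthetic SSRT-like data by maximum likelihood. Parameter values and starting points are illustrative.

```python
import numpy as np
from scipy import optimize, stats

def ex_gaussian_logpdf(x, mu, sigma, tau):
    """Log-density of the ex-Gaussian: Normal(mu, sigma) + Exponential(tau)."""
    return (-np.log(tau) + (mu - x) / tau + sigma**2 / (2 * tau**2)
            + stats.norm.logcdf((x - mu) / sigma - sigma / tau))

# Synthetic SSRT-like sample with known parameters (in milliseconds, illustrative).
rng = np.random.default_rng(4)
mu, sigma, tau = 200.0, 30.0, 60.0
data = rng.normal(mu, sigma, 500) + rng.exponential(tau, 500)

def neg_log_like(theta):
    m, s, t = theta
    if s <= 0 or t <= 0:
        return np.inf
    return -np.sum(ex_gaussian_logpdf(data, m, s, t))

fit = optimize.minimize(neg_log_like, x0=[150.0, 20.0, 40.0], method="Nelder-Mead")
print("ML estimates (mu, sigma, tau):", np.round(fit.x, 1))
```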

  19. Temporality and the torments of time.

    Science.gov (United States)

    Hinton, Ladson

    2015-06-01

    Immersion in time gives birth to consciousness, as well as conflict and torment. When human beings developed a sense of future, they also gained the ability to anticipate threats from nature or their fellow beings. They thereby created cultures that are bastions of survival, as well as places of poetry, art and religion where they could band together and reflect upon their common plight. The practice of psychoanalysis is in many ways a temporal process, a process of remembering, for owning and elaborating a past that gives us substance, thereby providing a basis for reflective consciousness. Stimulated by Freud's early writings, Lacan, Laplanche and their successors in particular have focussed extensively on time and psychoanalysis, and their views are a central point of this discussion. A substantial case study is offered that provides concrete examples of these perspectives. A multi-faceted view of temporality emerges, one that is more syncopated than linear or teleological. In conclusion, I will briefly discuss recent findings in the neuroscience of memory and 'time travel' that underpin contemporary psychoanalytic ideas in surprising ways. It is important to remember that acceptance of the contradictory nature of temporal experience can open space for increased freedom and playfulness. © 2015, The Society of Analytical Psychology.

  20. Testing of real-time-software

    International Nuclear Information System (INIS)

    Friesland, G.; Ovenhausen, H.

    1975-05-01

    The situation in the area of testing real-time software is unsatisfactory. During the first phase of the PROMOTE project (prozessorientiertes Modul- und Gesamttestsystem), an analysis of the current situation was carried out; its results are summarized in the following study, which comprises user interviews and an analysis of the relevant literature. Twenty-two users (industry, software houses, hardware manufacturers, and institutes) were interviewed. Discussions were held about the reliability of real-time software, with special attention to error avoidance, testing, and debugging. The main aims of the literature analysis were the elaboration of standard terms, the comparison of existing test methods and systems, and the definition of boundaries to related areas. In the further steps of this project, means and techniques will be worked out to systematically test real-time software. (orig.) [de

  1. Validation and Assessment of Multi-GNSS Real-Time Precise Point Positioning in Simulated Kinematic Mode Using IGS Real-Time Service

    Directory of Open Access Journals (Sweden)

    Liang Wang

    2018-02-01

    Precise Point Positioning (PPP) is a popular technology for precise applications based on the Global Navigation Satellite System (GNSS). Multi-GNSS combined PPP has become a hot topic in recent years with the development of multiple GNSSs. Meanwhile, with the operation of the real-time service (RTS) of the International GNSS Service (IGS), which provides satellite orbit and clock corrections to the broadcast ephemeris, it is possible to obtain real-time precise products of satellite orbits and clocks and to conduct real-time PPP. In this contribution, the real-time multi-GNSS orbit and clock corrections of the CLK93 product are applied for real-time multi-GNSS PPP processing, and its orbit and clock qualities are investigated, first with a seven-day experiment by comparing them with the final multi-GNSS precise product ‘GBM’ from GFZ. Then, an experiment involving real-time PPP processing for three stations in the Multi-GNSS Experiment (MGEX) network with a testing period of two weeks is conducted in order to evaluate the convergence performance of real-time PPP in a simulated kinematic mode. The experimental result shows that real-time PPP can achieve a convergence performance of less than 15 min for an accuracy level of 20 cm. Finally, the real-time data streams from 12 globally distributed IGS/MGEX stations for one month are used to assess and validate the positioning accuracy of real-time multi-GNSS PPP. The results show that the simulated kinematic positioning accuracy achieved by real-time PPP on different stations is about 3.0 to 4.0 cm for the horizontal direction and 5.0 to 7.0 cm for the three-dimensional (3D) direction.

  2. Efficient fuzzy Bayesian inference algorithms for incorporating expert knowledge in parameter estimation

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad

    2016-05-01

    Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in the parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision inherently embedded in expert-provided information. To solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', which results from integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features that make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it allows uncertainty and imprecision to be modelled distinguishably, and (3) it provides a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. An important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications, however, is the computational burden, as the required number of numerical model simulations often becomes extremely large and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert

  3. The FERMI-Elettra distributed real-time framework

    International Nuclear Information System (INIS)

    Pivetta, L.; Gaio, G.; Passuello, R.; Scalamera, G.

    2012-01-01

    FERMI-Elettra is a Free Electron Laser (FEL) based on a 1.5 GeV linac. The pulsed operation of the accelerator and the necessity to characterize and control each electron bunch requires synchronous acquisition of the beam diagnostics together with the ability to drive actuators in real-time at the linac repetition rate. The Adeos/Xenomai real-time extensions have been adopted in order to add real-time capabilities to the Linux based control system computers running the Tango software. A software communication protocol based on Gigabit Ethernet and known as Network Reflective Memory (NRM) has been developed to implement a shared memory across the whole control system, allowing computers to communicate in real-time. The NRM architecture, the real-time performance and the integration in the control system are described. (authors)

  4. Real-time video quality monitoring

    Science.gov (United States)

    Liu, Tao; Narvekar, Niranjan; Wang, Beibei; Ding, Ran; Zou, Dekun; Cash, Glenn; Bhagavathy, Sitaram; Bloom, Jeffrey

    2011-12-01

    The ITU-T Recommendation G.1070 is a standardized opinion model for video telephony applications that uses video bitrate, frame rate, and packet-loss rate to measure the video quality. However, this model was originally designed as an offline quality planning tool. It cannot be directly used for quality monitoring since the above three input parameters are not readily available within a network or at the decoder. Moreover, there is considerable room for improving the performance of this quality metric. In this article, we present a real-time video quality monitoring solution based on this Recommendation. We first propose a scheme to efficiently estimate the three parameters from video bitstreams, so that it can be used as a real-time video quality monitoring tool. Furthermore, an enhanced algorithm based on the G.1070 model that provides more accurate quality prediction is proposed. Finally, to use this metric in real-world applications, we present an example emerging application of real-time quality measurement to the management of transmitted videos, especially those delivered to mobile devices.

  5. MODELING INFORMATION SYSTEM AVAILABILITY BY USING BAYESIAN BELIEF NETWORK APPROACH

    Directory of Open Access Journals (Sweden)

    Semir Ibrahimović

    2016-03-01

    Modern information systems are expected to be always-on, providing services to end-users regardless of time and location. This is particularly important for organizations and industries where information systems support real-time operations and mission-critical applications that need to be available on a 24 × 7 × 365 basis. Examples of such entities include process industries, telecommunications, healthcare, energy, banking, electronic commerce and a variety of cloud services. This article presents a modified Bayesian Belief Network model for predicting information system availability, initially introduced by Franke, U. and Johnson, P. ("Availability of enterprise IT systems – an expert based Bayesian model", Software Quality Journal 20(2), 369-394, 2012). Based on a thorough review of several dimensions of information system availability, we propose a modified set of determinants. The model is parameterized by using a probability elicitation process with the participation of experts from the financial sector of Bosnia and Herzegovina. The model validation was performed using Monte Carlo simulation.
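
A toy forward-sampling sketch of how such a network can be exercised with Monte Carlo simulation; the two determinants and all probabilities below are invented for illustration and are not the expert-elicited values from the article.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 50_000   # Monte Carlo samples

# Hypothetical probabilities for a toy availability network: change-control quality
# and redundancy influence the chance of meeting the availability target.
p_good_change_control = 0.7
p_redundancy = 0.6
p_high_availability = {   # P(availability >= target | change control, redundancy)
    (True, True): 0.99, (True, False): 0.95,
    (False, True): 0.92, (False, False): 0.80,
}

cc = rng.random(N) < p_good_change_control
rd = rng.random(N) < p_redundancy
avail = np.array([rng.random() < p_high_availability[(bool(c), bool(r))]
                  for c, r in zip(cc, rd)])

print("P(high availability) ≈", round(avail.mean(), 3))
# Diagnostic query by conditioning on the samples: how much does redundancy matter?
print("P(high availability | no redundancy) ≈", round(avail[~rd].mean(), 3))
```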

  6. High throughput web inspection system using time-stretch real-time imaging

    Science.gov (United States)

    Kim, Chanju

    Photonic time-stretch is a novel technology that enables the capture of fast, rare and non-repetitive events. It operates in real time, with the ability to record over long periods of time while maintaining fine temporal resolution. This powerful property of photonic time-stretch has already been employed in various fields of application such as analog-to-digital conversion, spectroscopy, laser scanning and microscopy. Further expanding the scope, we fully exploit the time-stretch technology to demonstrate a high-throughput web inspection system. Web inspection, namely surface inspection, is a nondestructive evaluation method which is crucial for semiconductor wafer and thin-film production. We report a dark-field web inspection system with a line scan rate of 90.9 MHz, which is up to 1000 times faster than conventional inspection instruments. The manufacturing of high-quality semiconductor wafers and thin films may directly benefit from this technology, as it can easily locate defects with an area of less than 10 µm × 10 µm while allowing a maximum web flow speed of 1.8 km/s. The thesis provides an overview of our web inspection technique, followed by a description of the photonic time-stretch technique, which is the keystone of our system. A detailed explanation of each component is given to provide a quantitative understanding of the system. Finally, imaging results from a hard-disk sample and flexible films are presented along with a performance analysis of the system. This project was the first application of time-stretch to industrial inspection, and was conducted with financial support from and close involvement of Hitachi, Ltd.

  7. Bayesian models and meta analysis for multiple tissue gene expression data following corticosteroid administration

    Directory of Open Access Journals (Sweden)

    Kelemen Arpad

    2008-08-01

    Background: This paper addresses key biological problems and statistical issues in the analysis of large gene expression data sets that describe systemic temporal response cascades to therapeutic doses in multiple tissues, such as liver, skeletal muscle, and kidney, from the same animals. Affymetrix U34A time course gene expression data are obtained from three different tissues: kidney, liver and muscle. Our goal is not only to find the concordance of genes across tissues and identify the common differentially expressed genes over time, but also to examine the reproducibility of the findings by integrating the results from the multiple tissues through meta-analysis, in order to gain a significant increase in the power of detecting differentially expressed genes over time and to characterize how the three tissues differ in their response to the drug. Results and conclusion: A Bayesian categorical model for estimating the proportion of the 'call' is used for pre-screening genes. A hierarchical Bayesian mixture model is further developed for the identification of differentially expressed genes across time and of dynamic clusters. The deviance information criterion is applied to determine the number of components for model comparison and selection. The Bayesian mixture model produces the gene-specific posterior probability of differential/non-differential expression and the 95% credible interval, which is the basis for our further Bayesian meta-inference. Meta-analysis is performed in order to identify commonly expressed genes from multiple tissues that may serve as ideal targets for novel treatment strategies and to integrate the results across separate studies. We have found common expressed genes in the three tissues. However, the up/down/no regulation of these common genes differs at different time points. Moreover, the most differentially expressed genes were found in the liver, then in the kidney, and then in muscle.

  8. Heterogeneous Embedded Real-Time Systems Environment

    Science.gov (United States)

    2003-12-01

    AFRL-IF-RS-TR-2003-290, Final Technical Report, December 2003: Heterogeneous Embedded Real-Time Systems Environment. Authors: Cosmo Castellano and James Graham. Contract number: F30602-97-C-0259.

  9. Incremental temporal pattern mining using efficient batch-free stream clustering

    NARCIS (Netherlands)

    Lu, Y.; Hassani, M.; Seidl, T.

    2017-01-01

    This paper addresses the problem of temporal pattern mining from multiple data streams containing temporal events. Temporal events are considered as real-world events with comprehensive starting and ending timing information rather than simple integer timestamps. Predefined relations, such as

  10. Bayesian techniques for fatigue life prediction and for inference in linear time dependent PDEs

    KAUST Repository

    Scavino, Marco

    2016-01-08

    In this talk we introduce first the main characteristics of a systematic statistical approach to model calibration, model selection and model ranking when stress-life data are drawn from a collection of records of fatigue experiments. Focusing on Bayesian prediction assessment, we consider fatigue-limit models and random fatigue-limit models under different a priori assumptions. In the second part of the talk, we present a hierarchical Bayesian technique for the inference of the coefficients of time dependent linear PDEs, under the assumption that noisy measurements are available in both the interior of a domain of interest and from boundary conditions. We present a computational technique based on the marginalization of the contribution of the boundary parameters and apply it to inverse heat conduction problems.

  11. Real-time communication protocols: an overview

    NARCIS (Netherlands)

    Hanssen, F.T.Y.; Jansen, P.G.

    2003-01-01

    This paper describes several existing data link layer protocols that provide real-time capabilities on wired networks, focusing on token-ring and Carrier Sense Multiple Access based networks. Existing modifications to provide better real-time capabilities and performance are also described. Finally

  12. Self-Organization in Embedded Real-Time Systems

    CERN Document Server

    Brinkschulte, Uwe; Rettberg, Achim

    2013-01-01

    This book describes the emerging field of self-organizing, multicore, distributed and real-time embedded systems.  Self-organization of both hardware and software can be a key technique to handle the growing complexity of modern computing systems. Distributed systems running hundreds of tasks on dozens of processors, each equipped with multiple cores, requires self-organization principles to ensure efficient and reliable operation. This book addresses various, so-called Self-X features such as self-configuration, self-optimization, self-adaptation, self-healing and self-protection. Presents open components for embedded real-time adaptive and self-organizing applications; Describes innovative techniques in: scheduling, memory management, quality of service, communications supporting organic real-time applications; Covers multi-/many-core embedded systems supporting real-time adaptive systems and power-aware, adaptive hardware and software systems; Includes case studies of open embedded real-time self-organizi...

  13. Real-time systems scheduling fundamentals

    CERN Document Server

    Chetto, Maryline

    2014-01-01

    Real-time systems are used in a wide range of applications, including control, sensing, and multimedia. Scheduling is a central problem for these computing/communication systems, since it is responsible for executing software in a timely manner. This book provides the state of knowledge in this domain, with special emphasis on the key results obtained within the last decade. It addresses foundations as well as the latest advances and findings in real-time scheduling, giving references to the important papers. Nevertheless, the chapters are kept short and are not overloaded with confusing details.

  14. Adaption of the temporal correlation coefficient calculation for temporal networks (applied to a real-world pig trade network).

    Science.gov (United States)

    Büttner, Kathrin; Salau, Jennifer; Krieter, Joachim

    2016-01-01

    The average topological overlap of two graphs of two consecutive time steps measures the amount of changes in the edge configuration between the two snapshots. This value has to be zero if the edge configuration changes completely and one if the two consecutive graphs are identical. Current methods depend on the number of nodes in the network or on the maximal number of connected nodes in the consecutive time steps. In the first case, this methodology breaks down if there are nodes with no edges. In the second case, it fails if the maximal number of active nodes is larger than the maximal number of connected nodes. In the following, an adaption of the calculation of the temporal correlation coefficient and of the topological overlap of the graph between two consecutive time steps is presented, which shows the expected behaviour mentioned above. The newly proposed adaption uses the maximal number of active nodes, i.e. the number of nodes with at least one edge, for the calculation of the topological overlap. The three methods were compared with the help of vivid example networks to reveal the differences between the proposed notations. Furthermore, these three calculation methods were applied to a real-world network of animal movements in order to detect influences of the network structure on the outcome of the different methods.
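
A small sketch of the calculation: the per-node topological overlap between consecutive snapshots follows the standard definition, and the average is normalised either by the total number of nodes or, as proposed in the adaption, by the maximal number of active nodes (nodes with at least one edge). On two identical snapshots that contain isolated nodes, only the latter normalisation returns one.

```python
import numpy as np

def overlap_per_node(A1, A2):
    """Topological overlap C_i between two consecutive snapshots (adjacency matrices)."""
    num = (A1 * A2).sum(axis=1)
    den = np.sqrt(A1.sum(axis=1) * A2.sum(axis=1))
    C = np.zeros(len(num))
    np.divide(num, den, out=C, where=den > 0)
    return C

def temporal_correlation(snapshots, normalisation="active"):
    """Average topological overlap over consecutive snapshots.

    normalisation="nodes" divides by the total number of nodes (breaks down when some
    nodes have no edges); "active" divides by the maximal number of nodes with at
    least one edge in the two snapshots (the proposed adaption)."""
    total = 0.0
    for A1, A2 in zip(snapshots[:-1], snapshots[1:]):
        C = overlap_per_node(A1, A2)
        if normalisation == "nodes":
            total += C.sum() / len(C)
        else:
            active = max((A1.sum(axis=1) > 0).sum(), (A2.sum(axis=1) > 0).sum())
            total += C.sum() / active
    return total / (len(snapshots) - 1)

# Toy trade-like network: 5 holdings, two time steps with identical edges among the
# first three holdings and two isolated holdings throughout.
A = np.zeros((5, 5)); A[0, 1] = A[1, 0] = A[1, 2] = A[2, 1] = 1
snapshots = [A, A.copy()]
print("normalised by all nodes:   ", temporal_correlation(snapshots, "nodes"))    # < 1
print("normalised by active nodes:", temporal_correlation(snapshots, "active"))   # = 1
```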

  15. Probabilistic safety assessment model in consideration of human factors based on object-oriented bayesian networks

    International Nuclear Information System (INIS)

    Zhou Zhongbao; Zhou Jinglun; Sun Quan

    2007-01-01

    The effect of human factors on system safety is increasingly serious, yet it is often ignored in traditional probabilistic safety assessment methods. A new probabilistic safety assessment model based on object-oriented Bayesian networks is proposed in this paper. Human factors are integrated into the existing event sequence diagrams. Then the classes of the object-oriented Bayesian networks are constructed and converted to latent Bayesian networks for inference. Finally, the inference results are integrated into the event sequence diagrams for probabilistic safety assessment. The new method is applied to a loss-of-coolant accident in a nuclear power plant. The results show that the model is applicable not only to real-time situation assessment but also to situation assessment based on a certain amount of information. The modeling complexity is kept down, and the object-oriented approach makes the new method appropriate for large complex systems. (authors)

  16. Dynamical Bayesian inference of time-evolving interactions: From a pair of coupled oscillators to networks of oscillators

    Science.gov (United States)

    Duggento, Andrea; Stankovski, Tomislav; McClintock, Peter V. E.; Stefanovska, Aneta

    2012-12-01

    Living systems have time-evolving interactions that, until recently, could not be identified accurately from recorded time series in the presence of noise. Stankovski [Phys. Rev. Lett. 109, 024101 (2012)] introduced a method based on dynamical Bayesian inference that facilitates the simultaneous detection of time-varying synchronization, directionality of influence, and coupling functions. It can distinguish unsynchronized dynamics from noise-induced phase slips. The method is based on phase dynamics, with Bayesian inference of the time-evolving parameters being achieved by shaping the prior densities to incorporate knowledge of previous samples. We now present the method in detail using numerically generated data, data from an analog electronic circuit, and cardiorespiratory data. We also generalize the method to encompass networks of interacting oscillators and thus demonstrate its applicability to small-scale networks.

  17. Real-time specifications

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Legay, A.

    2015-01-01

    A specification theory combines notions of specifications and implementations with a satisfaction relation, a refinement relation, and a set of operators supporting stepwise design. We develop a specification framework for real-time systems using Timed I/O Automata as the specification formalism, with the semantics expressed in terms of Timed I/O Transition Systems. We provide constructs for refinement, consistency checking, logical and structural composition, and quotient of specifications - all indispensable ingredients of a compositional design methodology. The theory is implemented in the new tool Ecdar...

  18. Time-translation noninvariance of temporal gauge propagator

    International Nuclear Information System (INIS)

    Lim, S.C.

    1992-07-01

    We show that within the framework of stochastic mechanics, the quantization of a free electromagnetic or Yang-Mills field in the temporal gauge can be consistently carried out. The resulting longitudinal component of the photon or gluon propagator is time-translation noninvariant. The exact form of the propagator depends on the additional boundary condition which fully fixes the temporal gauge. (author). 11 refs

  19. Breaking Computational Barriers: Real-time Analysis and Optimization with Large-scale Nonlinear Models via Model Reduction

    Energy Technology Data Exchange (ETDEWEB)

    Carlberg, Kevin Thomas [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Quantitative Modeling and Analysis; Drohmann, Martin [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Quantitative Modeling and Analysis; Tuminaro, Raymond S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Computational Mathematics; Boggs, Paul T. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Quantitative Modeling and Analysis; Ray, Jaideep [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Quantitative Modeling and Analysis; van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Optimization and Uncertainty Estimation

    2014-10-01

    Model reduction for dynamical systems is a promising approach for reducing the computational cost of large-scale physics-based simulations to enable high-fidelity models to be used in many-query (e.g., Bayesian inference) and near-real-time (e.g., fast-turnaround simulation) contexts. While model reduction works well for specialized problems such as linear time-invariant systems, it is much more difficult to obtain accurate, stable, and efficient reduced-order models (ROMs) for systems with general nonlinearities. This report describes several advances that enable nonlinear reduced-order models (ROMs) to be deployed in a variety of time-critical settings. First, we present an error bound for the Gauss-Newton with Approximated Tensors (GNAT) nonlinear model reduction technique. This bound allows the state-space error for the GNAT method to be quantified when applied with the backward Euler time-integration scheme. Second, we present a methodology for preserving classical Lagrangian structure in nonlinear model reduction. This technique guarantees that important properties - such as energy conservation and symplectic time-evolution maps - are preserved when performing model reduction for models described by a Lagrangian formalism (e.g., molecular dynamics, structural dynamics). Third, we present a novel technique for decreasing the temporal complexity - defined as the number of Newton-like iterations performed over the course of the simulation - by exploiting time-domain data. Fourth, we describe a novel method for refining projection-based reduced-order models a posteriori using a goal-oriented framework similar to mesh-adaptive h-refinement in finite elements. The technique allows the ROM to generate arbitrarily accurate solutions, thereby providing the ROM with a 'failsafe' mechanism in the event of insufficient training data. Finally, we present the reduced-order model error surrogate (ROMES) method for statistically quantifying reduced-order...

  20. Inferring the most probable maps of underground utilities using Bayesian mapping model

    Science.gov (United States)

    Bilal, Muhammad; Khan, Wasiq; Muggleton, Jennifer; Rustighi, Emiliano; Jenks, Hugo; Pennock, Steve R.; Atkins, Phil R.; Cohn, Anthony

    2018-03-01

    Mapping the Underworld (MTU), a major initiative in the UK, is focused on addressing the social, environmental and economic consequences of the inability to locate buried underground utilities (such as pipes and cables) by developing a multi-sensor mobile device. The aim of the MTU device is to locate different types of buried assets in real time with the use of automated data processing techniques and statutory records. The statutory records, even though typically inaccurate and incomplete, provide useful prior information on what is buried under the ground and where. However, the integration of information from multiple sensors (raw data) with these qualitative maps and their visualization is challenging and requires the implementation of robust machine learning/data fusion approaches. In this paper, an approach for the automated creation of revised maps was developed as a Bayesian mapping model that integrates the knowledge extracted from the sensors' raw data with the available statutory records. The statutory records were combined with the hypotheses from the sensors to form an initial estimate of what might be found underground and roughly where. The maps were (re)constructed using automated image segmentation techniques for hypothesis extraction and Bayesian classification techniques for segment-manhole connections. The model, consisting of an image segmentation algorithm and various Bayesian classification techniques (segment recognition and the expectation-maximization (EM) algorithm), provided robust performance on various simulated as well as real sites in terms of predicting linear/non-linear segments and constructing refined 2D/3D maps.

  1. Rationalizing method of replacement intervals by using Bayesian statistics

    International Nuclear Information System (INIS)

    Kasai, Masao; Notoya, Junichi; Kusakari, Yoshiyuki

    2007-01-01

    This study presents formulations for rationalizing the replacement intervals of equipment and/or parts that take into account the probability density functions (PDFs) of the parameters of failure distribution functions (FDFs), and compares the intervals optimized by our formulations with those obtained by conventional formulations, which use only representative values of the FDF parameters instead of their full PDFs. The failure data are generated by Monte Carlo simulation, since real failure data are not available to us. The PDFs of the FDF parameters are obtained by the Bayesian method, and the representative values are obtained by likelihood estimation and the Bayesian method. We found that the method using the full PDFs obtained by the Bayesian method yields longer replacement intervals than the one using only representative values of the parameters. (author)
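
    As a rough illustration of the comparison described above, the sketch below simulates Weibull failure data, forms a grid posterior over the Weibull parameters, and contrasts an age-replacement interval chosen from a point estimate with one chosen by averaging the cost rate over the full parameter PDF. The distributions, costs and grid ranges are hypothetical stand-ins, not the study's actual formulation:

      import numpy as np

      rng = np.random.default_rng(0)

      # Monte Carlo "failure data" (real failure data are unavailable, as noted above)
      true_shape, true_scale = 2.0, 100.0            # hypothetical Weibull parameters
      failures = true_scale * rng.weibull(true_shape, size=30)

      # Grid posterior over the Weibull FDF parameters (flat prior, for simplicity)
      shapes = np.linspace(0.5, 5.0, 40)
      scales = np.linspace(30.0, 300.0, 40)
      K, L = np.meshgrid(shapes, scales, indexing="ij")
      loglik = np.sum(np.log(K / L) + (K - 1) * np.log(failures[:, None, None] / L)
                      - (failures[:, None, None] / L) ** K, axis=0)
      post = np.exp(loglik - loglik.max())
      post /= post.sum()

      # Expected cost per unit time of an age-replacement policy with interval T
      c_preventive, c_failure = 1.0, 5.0             # made-up replacement costs
      def cost_rate(T, k, lam):
          t = np.linspace(0.0, T, 200)
          R = np.exp(-(t / lam) ** k)                # survival function
          mean_cycle = np.sum((R[1:] + R[:-1]) / 2 * np.diff(t))
          F_T = 1.0 - R[-1]
          return (c_preventive * (1.0 - F_T) + c_failure * F_T) / mean_cycle

      grid_T = np.linspace(10.0, 200.0, 40)
      i, j = np.unravel_index(post.argmax(), post.shape)
      cost_point = [cost_rate(T, K[i, j], L[i, j]) for T in grid_T]        # point estimate
      cost_bayes = [np.sum(post * np.vectorize(cost_rate)(T, K, L))        # full parameter PDF
                    for T in grid_T]
      print("interval from point estimate :", grid_T[int(np.argmin(cost_point))])
      print("interval from full posterior :", grid_T[int(np.argmin(cost_bayes))])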

  2. Probabilistic daily ILI syndromic surveillance with a spatio-temporal Bayesian hierarchical model.

    Directory of Open Access Journals (Sweden)

    Ta-Chien Chan

    BACKGROUND: For daily syndromic surveillance to be effective, an efficient and sensible algorithm would be expected to detect aberrations in influenza illness and alert public health workers prior to any impending epidemic. This detection or alert surely contains uncertainty, and thus should be evaluated with a proper probabilistic measure. However, traditional monitoring mechanisms simply provide a binary alert, failing to adequately address this uncertainty. METHODS AND FINDINGS: Based on the Bayesian posterior probability of influenza-like illness (ILI) visits, the intensity of an outbreak can be directly assessed. The numbers of daily emergency room ILI visits at five community hospitals in Taipei City during 2006-2007 were collected and fitted with a Bayesian hierarchical model containing meteorological factors such as temperature and vapor pressure, spatial interaction with a conditional autoregressive structure, weekend and holiday effects, seasonality factors, and previous ILI visits. The proposed algorithm recommends an alert for action if the posterior probability is larger than 70%. External data from January to February of 2008 were retained for validation. The decision rule successfully detects the peak in the validation period. When comparing the posterior probability evaluation with the modified Cusum method, results show that the proposed method is able to detect the signals 1-2 days prior to the rise of ILI visits. CONCLUSIONS: This Bayesian hierarchical model not only constitutes a dynamic surveillance system but also constructs a stochastic evaluation of the need to call for an alert. The monitoring mechanism provides earlier detection as well as a complementary tool for current surveillance programs.
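
    A minimal sketch of the alert rule described above: given posterior draws of the expected ILI visit count for the current day (e.g. from the fitted hierarchical model), alert when the posterior probability of exceeding an epidemic threshold passes 70%. The threshold, draws and names here are illustrative, not the study's fitted quantities:

      import numpy as np

      def outbreak_alert(posterior_draws, epidemic_threshold, prob_cutoff=0.70):
          """Alert recommendation from the posterior probability of exceedance.
          posterior_draws    : samples of today's expected ILI visits from the
                               fitted hierarchical model (e.g. MCMC output)
          epidemic_threshold : ILI visit count regarded as the epidemic level
          Returns the posterior probability and a binary alert recommendation."""
          prob = float(np.mean(posterior_draws > epidemic_threshold))
          return prob, prob > prob_cutoff

      # Toy usage with made-up numbers: 4000 posterior draws around 55 visits/day
      draws = np.random.default_rng(1).normal(loc=55.0, scale=6.0, size=4000)
      prob, alert = outbreak_alert(draws, epidemic_threshold=60.0)
      print(f"P(exceeding threshold) = {prob:.2f}; alert = {alert}")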

  3. Bayesian inversion analysis of nonlinear dynamics in surface heterogeneous reactions.

    Science.gov (United States)

    Omori, Toshiaki; Kuwatani, Tatsu; Okamoto, Atsushi; Hukushima, Koji

    2016-09-01

    It is essential to extract nonlinear dynamics from time-series data as an inverse problem in the natural sciences. We propose a Bayesian statistical framework for extracting the nonlinear dynamics of surface heterogeneous reactions from sparse and noisy observable data. Surface heterogeneous reactions are chemical reactions involving the conjugation of multiple phases, and their dynamics are intrinsically nonlinear owing to the effect of the surface area between the different phases. We adapt a belief propagation method and an expectation-maximization (EM) algorithm to the partial observation problem, in order to simultaneously estimate the time course of the hidden variables and the kinetic parameters underlying the dynamics. The proposed belief propagation method is performed using a sequential Monte Carlo algorithm in order to estimate the nonlinear dynamical system. Using the proposed method, we show that the rate constants of dissolution and precipitation reactions, which are typical examples of surface heterogeneous reactions, as well as the temporal changes of solid reactants and products, can be estimated successfully from the observable temporal changes in the concentration of the dissolved intermediate product alone.
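
    The sequential Monte Carlo step can be pictured with a generic bootstrap particle filter for a noisily observed, decaying hidden state. This is only a schematic stand-in for the authors' reaction model: the dynamics, noise levels and names are invented, and the EM update of the kinetic parameters is omitted:

      import numpy as np

      rng = np.random.default_rng(2)

      def bootstrap_particle_filter(obs, n_particles, step, obs_std, x0_sampler):
          """Generic bootstrap particle filter for a scalar hidden state.
          obs        : noisy partial observations of the hidden state
          step       : function propagating a particle array one time step
          obs_std    : observation noise standard deviation (assumed known)
          x0_sampler : function drawing the initial particle cloud
          Returns the filtered posterior mean of the hidden state at each time."""
          x = x0_sampler(n_particles)
          means = []
          for y in obs:
              x = step(x)                                     # propagate particles
              logw = -0.5 * ((y - x) / obs_std) ** 2          # Gaussian observation likelihood
              w = np.exp(logw - logw.max())
              w /= w.sum()
              means.append(float(np.sum(w * x)))
              x = x[rng.choice(n_particles, size=n_particles, p=w)]   # resample
          return np.array(means)

      # Illustrative hidden dynamics: exponential decay of a dissolving solid, plus noise
      k_rate = 0.15
      def step(x):
          return x * np.exp(-k_rate) + rng.normal(0.0, 0.02, size=x.shape)

      truth = np.exp(-k_rate * np.arange(1, 31))
      data = truth + rng.normal(0.0, 0.05, size=truth.shape)
      estimate = bootstrap_particle_filter(data, 2000, step, obs_std=0.05,
                                           x0_sampler=lambda n: rng.uniform(0.5, 1.5, n))
      print(estimate[:5])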

  4. On Real-Time Systems Using Local Area Networks.

    Science.gov (United States)

    1987-07-01

    CS-TR-1892, July 1987. Shem-Tov Levi and Satish K. Tripathi, Department of Computer Science. ... constraints and the clock systems that feed the time to real-time systems. A model for real-time systems based on LAN communication is presented in ...

  5. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    Science.gov (United States)

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption kinetics reaction rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling micro-liter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model of the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Mixed-mode Operating System for Real-time Performance

    OpenAIRE

    M.M. Hasan; S. Sultana; C.K. Foo

    2017-01-01

    The purpose of the mixed-mode system research is to handle devices with the accuracy of real-time systems while at the same time having all the benefits and facilities of a mature Graphical User Interface (GUI) operating system, which is typically non-real-time. This mixed-mode operating system, comprising a real-time portion and a non-real-time portion, was studied and implemented to identify its feasibility and performance in practical applications (in the context of scheduling the real-time...

  7. Linux real-time framework for fusion devices

    Energy Technology Data Exchange (ETDEWEB)

    Neto, Andre [Associacao Euratom-IST, Instituto de Plasmas e Fusao Nuclear, Av. Rovisco Pais, 1049-001 Lisboa (Portugal)], E-mail: andre.neto@cfn.ist.utl.pt; Sartori, Filippo; Piccolo, Fabio [Euratom-UKAEA, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Barbalace, Antonio [Euratom-ENEA Association, Consorzio RFX, 35127 Padova (Italy); Vitelli, Riccardo [Dipartimento di Informatica, Sistemi e Produzione, Universita di Roma, Tor Vergata, Via del Politecnico 1-00133, Roma (Italy); Fernandes, Horacio [Associacao Euratom-IST, Instituto de Plasmas e Fusao Nuclear, Av. Rovisco Pais, 1049-001 Lisboa (Portugal)

    2009-06-15

    A new framework for the development and execution of real-time codes is currently being developed and commissioned at JET. The foundations of the system are Linux, the Real Time Application Interface (RTAI) and a wise exploitation of the new i386 multi-core processors technology. The driving motivation was the need to find a real-time operating system for the i386 platform able to satisfy JET Vertical Stabilisation Enhancement project requirements: 50 {mu}s cycle time. Even if the initial choice was the VxWorks operating system, it was decided to explore an open source alternative, mostly because of the costs involved in the commercial product. The work started with the definition of a precise set of requirements and milestones to achieve: Linux distribution and kernel versions to be used for the real-time operating system; complete characterization of the Linux/RTAI real-time capabilities; exploitation of the multi-core technology; implementation of all the required and missing features; commissioning of the system. Latency and jitter measurements were compared for Linux and RTAI in both user and kernel-space. The best results were attained using the RTAI kernel solution where the time to reschedule a real-time task after an external interrupt is of 2.35 {+-} 0.35 {mu}s. In order to run the real-time codes in the kernel-space, a solution to provide user-space functionalities to the kernel modules had to be designed. This novel work provided the most common functions from the standard C library and transparent interaction with files and sockets to the kernel real-time modules. Kernel C++ support was also tested, further developed and integrated in the framework. The work has produced very convincing results so far: complete isolation of the processors assigned to real-time from the Linux non real-time activities, high level of stability over several days of benchmarking operations and values well below 3 {mu}s for task rescheduling after external interrupt. From

  8. Linux real-time framework for fusion devices

    International Nuclear Information System (INIS)

    Neto, Andre; Sartori, Filippo; Piccolo, Fabio; Barbalace, Antonio; Vitelli, Riccardo; Fernandes, Horacio

    2009-01-01

    A new framework for the development and execution of real-time codes is currently being developed and commissioned at JET. The foundations of the system are Linux, the Real Time Application Interface (RTAI) and a wise exploitation of the new i386 multi-core processors technology. The driving motivation was the need to find a real-time operating system for the i386 platform able to satisfy JET Vertical Stabilisation Enhancement project requirements: 50 μs cycle time. Even if the initial choice was the VxWorks operating system, it was decided to explore an open source alternative, mostly because of the costs involved in the commercial product. The work started with the definition of a precise set of requirements and milestones to achieve: Linux distribution and kernel versions to be used for the real-time operating system; complete characterization of the Linux/RTAI real-time capabilities; exploitation of the multi-core technology; implementation of all the required and missing features; commissioning of the system. Latency and jitter measurements were compared for Linux and RTAI in both user and kernel-space. The best results were attained using the RTAI kernel solution where the time to reschedule a real-time task after an external interrupt is of 2.35 ± 0.35 μs. In order to run the real-time codes in the kernel-space, a solution to provide user-space functionalities to the kernel modules had to be designed. This novel work provided the most common functions from the standard C library and transparent interaction with files and sockets to the kernel real-time modules. Kernel C++ support was also tested, further developed and integrated in the framework. The work has produced very convincing results so far: complete isolation of the processors assigned to real-time from the Linux non real-time activities, high level of stability over several days of benchmarking operations and values well below 3 μs for task rescheduling after external interrupt. From being the

  9. Static Schedulers for Embedded Real-Time Systems

    Science.gov (United States)

    1989-12-01

    Because of the need for efficient scheduling algorithms in large-scale real-time systems, software engineers put a lot of effort into developing ... provide static schedulers for the Embedded Real-Time Systems with a single processor using the Ada programming language. The independent nonpreemptable ... support the Computer Aided Rapid Prototyping for Embedded Real-Time Systems so that we determine whether the system, as designed, meets the required ...

  10. Real-time motional Stark effect in jet

    International Nuclear Information System (INIS)

    Alves, D.; Stephen, A.; Hawkes, N.; Dalley, S.; Goodyear, A.; Felton, R.; Joffrin, E.; Fernandes, H.

    2004-01-01

    The increasing importance of real-time measurements and control systems in JET experiments, regarding e.g. Internal Transport Barrier (ITB) and q-profile control, has motivated the development of a real-time motional Stark effect (MSE) system. The MSE diagnostic allows the measurement of local magnetic fields at different locations along the neutral beam path, therefore providing local measurements of the current and q-profiles. Recently in JET, an upgrade of the MSE diagnostic has been implemented, incorporating a totally new system which allows the use of this diagnostic as a real-time control tool as well as an extended data source for off-line analysis. This paper briefly describes the technical features of the real-time diagnostic with a main focus on the system architecture, which consists of a VME crate hosting three PowerPC processor boards and a fast ADC, all connected via the Front Panel Data Port (FPDP). The DSP algorithm implements the lock-in amplifier required to demodulate the JET MSE signals. Some applications of the system will be covered, such as feeding the real-time equilibrium reconstruction code (EQUINOX) and allowing full-coverage analysis of the Neutral Beam time window. A brief comparison between the real-time MSE analysis and the off-line analysis will also be presented

  11. An Algebraic Framework for Temporal Attribute Characteristics

    DEFF Research Database (Denmark)

    Böhlen, M. H.; Gamper, J.; Jensen, Christian Søndergaard

    2006-01-01

    Most real-world database applications manage temporal data, i.e., data with associated time references that capture a temporal aspect of the data, typically either when the data is valid or when the data is known. Such applications abound in, e.g., the financial, medical, and scientific domains...

  12. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    Science.gov (United States)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict the inflows, not only through their marginal distributions but also through their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of forecast scenarios that fail, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up that quantifies the entire flood risk as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and the scenario-optimized, are evaluated for flood risk and hydropower profit analysis. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks come from the forecast lead time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias small for reservoir operational purposes.
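
    A minimal sketch of the lead-time stage of the risk calculation described above: the risk is the fraction of ensemble scenarios whose routed water level exceeds the critical level at some point within the lead time. The synthetic levels below are placeholders for the output of a real routing model:

      import numpy as np

      def lead_time_risk(scenario_levels, critical_level):
          """Stage-one risk within the forecast lead time.
          scenario_levels : array (n_scenarios, n_steps) of reservoir water levels,
                            one row per ensemble forecast scenario routed through
                            the operating rule (placeholder for real routing output)
          critical_level  : water level whose exceedance counts as a failure
          Returns the fraction of scenarios that fail at least once."""
          failed = (scenario_levels > critical_level).any(axis=1)
          return float(failed.mean())

      # Toy usage: 500 synthetic scenarios over a 72-step lead time
      rng = np.random.default_rng(3)
      levels = 170.0 + 0.05 * rng.normal(0.0, 2.0, size=(500, 72)).cumsum(axis=1)
      print("lead-time risk:", lead_time_risk(levels, critical_level=171.0))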

  13. Scala for Real-Time Systems?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2015-01-01

    Java served well as a general-purpose language. However, during its two decades of constant change it has gotten some weight and legacy in the language syntax and the libraries. Furthermore, Java's success for real-time systems is mediocre. Scala is a modern object-oriented and functional language with interesting new features. Although a new language, it executes on a Java virtual machine, reusing that technology. This paper explores Scala as a language for future real-time systems.

  14. Towards exascale real-time RFI mitigation

    NARCIS (Netherlands)

    van Nieuwpoort, R.V.

    2016-01-01

    We describe the design and implementation of an extremely scalable real-time RFI mitigation method, based on the offline AOFlagger. All algorithms scale linearly in the number of samples. We describe how we implemented the flagger in the LOFAR real-time pipeline, on both CPUs and GPUs. Additionally,

  15. Emergent Auditory Feature Tuning in a Real-Time Neuromorphic VLSI System.

    Science.gov (United States)

    Sheik, Sadique; Coath, Martin; Indiveri, Giacomo; Denham, Susan L; Wennekers, Thomas; Chicca, Elisabetta

    2012-01-01

    Many sounds of ecological importance, such as communication calls, are characterized by time-varying spectra. However, most neuromorphic auditory models to date have focused on distinguishing mainly static patterns, under the assumption that dynamic patterns can be learned as sequences of static ones. In contrast, the emergence of dynamic feature sensitivity through exposure to formative stimuli has been recently modeled in a network of spiking neurons based on the thalamo-cortical architecture. The proposed network models the effect of lateral and recurrent connections between cortical layers, distance-dependent axonal transmission delays, and learning in the form of Spike Timing Dependent Plasticity (STDP), which effects stimulus-driven changes in the pattern of network connectivity. In this paper we demonstrate how these principles can be efficiently implemented in neuromorphic hardware. In doing so we address two principal problems in the design of neuromorphic systems: real-time event-based asynchronous communication in multi-chip systems, and the realization in hybrid analog/digital VLSI technology of the neural computational principles that we propose underlie plasticity in neural processing of dynamic stimuli. The result is a hardware neural network that learns in real time and shows preferential responses, after exposure, to stimuli exhibiting particular spectro-temporal patterns. The availability of hardware on which the model can be implemented makes this a significant step toward the development of adaptive, neurobiologically plausible, spike-based, artificial sensory systems.

  16. Time-Optimal Real-Time Test Case Generation using UPPAAL

    DEFF Research Database (Denmark)

    Hessel, Anders; Larsen, Kim Guldstrand; Nielsen, Brian

    2004-01-01

    Testing is the primary software validation technique used by industry today, but remains ad hoc, error prone, and very expensive. A promising improvement is to automatically generate test cases from formal models of the system under test. We demonstrate how to automatically generate real-time conformance test cases from timed automata specifications. Specifically, we demonstrate how to efficiently generate real-time test cases with optimal execution time, i.e. test cases that are the fastest possible to execute. Our technique allows time-optimal test cases to be generated using manually formulated test purposes or generated automatically from various coverage criteria of the model.

  17. Performance evaluation of near-real-time accounting systems

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Examples are given illustrating the application of near-real-time accounting concepts and principles to actual nuclear facilities. Experience with prototypical systems at the AGNS reprocessing plant and the Los Alamos plutonium facility is described using examples of actual data to illustrate the performance and effectiveness of near-real-time systems. The purpose of the session is to enable participants to: (1) identify the major components of near-real-time accounting systems; (2) describe qualitatively the advantages, limitations, and performance of such systems in real nuclear facilities; (3) identify process and facility design characteristics that affect the performance of near-real-time systems; and (4) describe qualitatively the steps necessary to implement a near-real-time accounting and control system in a nuclear facility

  18. Cure modeling in real-time prediction: How much does it help?

    Science.gov (United States)

    Ying, Gui-Shuang; Zhang, Qiang; Lan, Yu; Li, Yimei; Heitjan, Daniel F

    2017-08-01

    Various parametric and nonparametric modeling approaches exist for real-time prediction in time-to-event clinical trials. Recently, Chen (2016 BMC Biomedical Research Methodology 16) proposed a prediction method based on parametric cure-mixture modeling, intending to cover those situations where it appears that a non-negligible fraction of subjects is cured. In this article we apply a Weibull cure-mixture model to create predictions, demonstrating the approach in RTOG 0129, a randomized trial in head-and-neck cancer. We compare the ultimate realized data in RTOG 0129 to interim predictions from a Weibull cure-mixture model, a standard Weibull model without a cure component, and a nonparametric model based on the Bayesian bootstrap. The standard Weibull model predicted that events would occur earlier than the Weibull cure-mixture model, but the difference was unremarkable until late in the trial when evidence for a cure became clear. Nonparametric predictions often gave undefined predictions or infinite prediction intervals, particularly at early stages of the trial. Simulations suggest that cure modeling can yield better-calibrated prediction intervals when there is a cured component, or the appearance of a cured component, but at a substantial cost in the average width of the intervals. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. ANUBIS: artificial neuromodulation using a Bayesian inference system.

    Science.gov (United States)

    Smith, Benjamin J H; Saaj, Chakravarthini M; Allouis, Elie

    2013-01-01

    Gain tuning is a crucial part of controller design and depends not only on an accurate understanding of the system in question, but also on the designer's ability to predict what disturbances and other perturbations the system will encounter throughout its operation. This letter presents ANUBIS (artificial neuromodulation using a Bayesian inference system), a novel biologically inspired technique for automatically tuning controller parameters in real time. ANUBIS is based on the Bayesian brain concept and modifies it by incorporating a model of the neuromodulatory system comprising four artificial neuromodulators. It has been applied to the controller of EchinoBot, a prototype walking rover for Martian exploration. ANUBIS has been implemented at three levels of the controller; gait generation, foot trajectory planning using Bézier curves, and foot trajectory tracking using a terminal sliding mode controller. We compare the results to a similar system that has been tuned using a multilayer perceptron. The use of Bayesian inference means that the system retains mathematical interpretability, unlike other intelligent tuning techniques, which use neural networks, fuzzy logic, or evolutionary algorithms. The simulation results show that ANUBIS provides significant improvements in efficiency and adaptability of the three controller components; it allows the robot to react to obstacles and uncertainties faster than the system tuned with the MLP, while maintaining stability and accuracy. As well as advancing rover autonomy, ANUBIS could also be applied to other situations where operating conditions are likely to change or cannot be accurately modeled in advance, such as process control. In addition, it demonstrates one way in which neuromodulation could fit into the Bayesian brain framework.

  20. From Territorial to Temporal Ambitions: The Politics of Time and Imagination in Massive Multiplayer Online Forecasting Games

    Directory of Open Access Journals (Sweden)

    Lonny J. Avi Brooks

    2015-09-01

    In 2010, the online forecasting game Urgent Evoke, produced by Jane McGonigal, former Director of Gaming at the Institute for the Future and the World Bank Institute, elicited praise from gaming critics as a model for "serious gaming." The game promised to show how players could think about long-term solutions to urgent social problems like hunger, poverty, conflict, and climate change, using the African continent as its test bed. Players imagined future temporal outcomes through remixing media. In a qualitative analysis of actual game play during the real-time introduction of the game for teaching organizational communication, Evoke became a platform for student inquiry, inviting questions about its underlying design as an expansion of territorial and temporal conquest. Evoke served as a springboard for building a literacy of critical time among students in accessing stakeholder power to determine the future. Students challenged, created, and followed the cultural capital of promissory visions in circulation. The curriculum design for using serious and forecasting games like Evoke must account for the conceptual development of what Sarah Sharma calls critical time or chronopolitics, a hidden politics of time that shapes our approaches to cultures, organizations, and innovation. By placing spatial and temporal dynamics center stage, we investigated how serious games produce a chronopolitics of time differentiating among people by class and ethnicity. Alternative Reality Games offer the potential for building a literacy of critical forecasting time to understand the practices for anticipating the future as temporal networks of power, different and uneven.

  1. Distributed Issues for Ada Real-Time Systems

    Science.gov (United States)

    1990-07-23

    Contract MDA 903-87-C-0056; author Thomas E. Griest. ... considerations. Adding to the problem of distributed real-time systems is the issue of maintaining a common sense of time among all of the processors ... because someone is waiting for the final output of a very large set of computations. However, in real-time systems, consistent meeting of short-term ...

  2. Design Specifications for Adaptive Real-Time Systems

    Science.gov (United States)

    1991-12-01

    Technical Report CMU/SEI-91-TR-20, ESD-91-TR-20, December 1991. Randall W. Lichota (Hughes), Alice H. Muntz. ... Abstract: The design specification method described in this report treats a software ...

  3. Design Recovery Technology for Real-Time Systems.

    Science.gov (United States)

    1995-10-01

    RL-TR-95-208, Final Technical Report, October 1995. Design Recovery Technology for Real-Time Systems. The MITRE Corporation; Lester J. Holtzblatt, Richard Piazza, and Susan ... behavior of real-time systems in general, our initial efforts have centered on recovering this information from one system in particular, the Modular ...

  4. [Real time 3D echocardiography

    Science.gov (United States)

    Bauer, F.; Shiota, T.; Thomas, J. D.

    2001-01-01

    Three-dimensional representation of the heart is a long-standing concern. Usually, 3D reconstruction of the cardiac mass is made by successive acquisition of 2D sections, whose spatial localisation and orientation require complex guiding systems. More recently, the concept of volumetric acquisition has been introduced. A matricial emitter-receiver probe complex with parallel data processing provides instantaneous acquisition of a pyramidal 64 degrees x 64 degrees volume. The image is rendered in real time and is composed of 3 planes (planes B and C) which can be displaced in all spatial directions at any time during acquisition. The flexibility of this acquisition system allows volume and mass measurement with greater accuracy and reproducibility, limiting inter-observer variability. Free navigation of the planes of investigation allows reconstruction for qualitative and quantitative analysis of valvular heart disease and other pathologies. Although real-time 3D echocardiography is ready for clinical use, some improvements are still necessary to make it more user-friendly. Real-time 3D echocardiography could then become an essential tool for the understanding, diagnosis and management of patients.

  5. Real-time communication for distributed plasma control systems

    Energy Technology Data Exchange (ETDEWEB)

    Luchetta, A. [Consorzio RFX, Associazione Euratom-ENEA sulla Fusione, Corso Stati Uniti 4, Padova 35127 (Italy)], E-mail: adriano.luchetta@igi.cnr.it; Barbalace, A.; Manduchi, G.; Soppelsa, A.; Taliercio, C. [Consorzio RFX, Associazione Euratom-ENEA sulla Fusione, Corso Stati Uniti 4, Padova 35127 (Italy)

    2008-04-15

    Real-time control applications will benefit in the near future from the enhanced performance provided by multi-core processor architectures. Nevertheless real-time communication will continue to be critical in distributed plasma control systems where the plant under control typically is distributed over a wide area. At RFX-mod real-time communication is crucial for hard real-time plasma control, due to the distributed architecture of the system, which consists of several VMEbus stations. The system runs under VxWorks and uses Gigabit Ethernet for sub-millisecond real-time communication. To optimize communication in the system, a set of detailed measurements has been carried out on the target platforms (Motorola MVME5100 and MVME5500) using either the VxWorks User Datagram Protocol (UDP) stack or raw communication based on the data link layer. Measurements have been carried out also under Linux, using its UDP stack or, in alternative, RTnet, an open source hard real-time network protocol stack. RTnet runs under Xenomai or RTAI, two popular real-time extensions based on the Linux kernel. The paper reports on the measurements carried out and compares the results, showing that the performance obtained by using open source code is suitable for sub-millisecond real-time communication in plasma control.

  6. An adaptive Bayesian inference algorithm to estimate the parameters of a hazardous atmospheric release

    Science.gov (United States)

    Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques

    2015-12-01

    In the event of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action by first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically by using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by means of an advanced iterative Monte Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the obtained results is comparable to that obtained with the Markov Chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of AMIS provides better sampling efficiency by reusing all the generated samples.
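
    The flavour of the importance-sampling step can be sketched as follows: candidate source locations are drawn from a proposal, weighted by the likelihood of the sensor measurements under a toy forward dispersion model, and the weighted samples summarise the posterior over the location. The forward model, noise level and proposal are invented placeholders, and the adaptive re-fitting and sample recycling that define AMIS are left out:

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical forward model: concentration seen by each sensor from a unit
      # point source at (x, y).  A stand-in for the real dispersion model.
      def forward(sources, sensors):
          d2 = ((sensors[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
          return 1.0 / (1.0 + d2)                    # shape (n_sensors, n_sources)

      sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
      true_source = np.array([[2.5, 1.0]])
      noise_std = 0.02
      obs = forward(true_source, sensors)[:, 0] + rng.normal(0.0, noise_std, len(sensors))

      def log_likelihood(candidates):
          pred = forward(candidates, sensors)
          return -0.5 * (((obs[:, None] - pred) / noise_std) ** 2).sum(axis=0)

      # One plain importance-sampling pass; AMIS would additionally re-fit the
      # proposal from the weighted samples at every iteration and recycle all
      # previously generated samples.
      prop_mean, prop_std = np.array([2.0, 2.0]), 2.0
      samples = prop_mean + prop_std * rng.standard_normal((20000, 2))
      log_prop = -0.5 * (((samples - prop_mean) / prop_std) ** 2).sum(axis=1)
      logw = log_likelihood(samples) - log_prop      # flat prior on the location
      w = np.exp(logw - logw.max())
      w /= w.sum()
      print("posterior mean source location:", (w[:, None] * samples).sum(axis=0))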

  7. Analyzing bioassay data using Bayesian methods -- A primer

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Schillaci, M.E.; Martz, H.F.; Little, T.T.

    2000-06-01

    The classical statistics approach used in health physics for the interpretation of measurements is deficient in that it does not take into account 'needle in a haystack' effects, that is, correct identification of events that are rare in a population. This is often the case in health physics measurements, and the false positive fraction (the fraction of results measuring positive that are actually zero) is often very large using the prescriptions of classical statistics. Bayesian statistics provides a methodology to minimize the number of incorrect decisions (wrong calls): false positives and false negatives. The authors present the basic method and a heuristic discussion. Examples are given using numerically generated and real bioassay data for tritium. Various analytical models are used to fit the prior probability distribution in order to test the sensitivity to the choice of model. Parametric studies show that for typical situations involving rare events the normalized Bayesian decision level k_α = L_c/σ_0, where σ_0 is the measurement uncertainty for zero true amount, is in the range of 3 to 5 depending on the true positive rate. Four times σ_0, rather than approximately two times σ_0 as in classical statistics, would therefore seem a better choice for the decision level in these situations.
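
    The mechanics behind such a decision level can be illustrated with a deliberately simple two-component prior (a point mass at zero plus an exponential for true positives). The prior shape, its parameters and the posterior cutoff below are invented for illustration and are not the analytical models fitted in the paper:

      import numpy as np
      from scipy import stats, optimize

      def bayesian_decision_level(true_positive_rate, sigma0=1.0,
                                  mean_true_amount=3.0, posterior_cutoff=0.5):
          """Toy normalized decision level k_alpha = L_c / sigma_0.
          Prior: the true amount is 0 with probability (1 - true_positive_rate),
          otherwise exponential with the given mean.  The measurement is the true
          amount plus N(0, sigma0) noise.  L_c is the measurement value at which
          the posterior probability of a nonzero amount reaches the cutoff."""
          p1 = true_positive_rate
          lam = 1.0 / mean_true_amount
          a = np.linspace(0.0, 20.0 * mean_true_amount, 4000)   # integration grid
          da = a[1] - a[0]

          def posterior_positive(m):
              like0 = stats.norm.pdf(m, 0.0, sigma0)            # amount exactly zero
              like1 = np.sum(stats.norm.pdf(m, a, sigma0) * lam * np.exp(-lam * a)) * da
              return p1 * like1 / (p1 * like1 + (1.0 - p1) * like0)

          lc = optimize.brentq(lambda m: posterior_positive(m) - posterior_cutoff,
                               0.0, 20.0 * sigma0)
          return lc / sigma0

      for rate in (0.1, 0.01, 0.001):                            # rarer events -> higher level
          print(rate, round(bayesian_decision_level(rate), 2))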

  8. Hierarchical Bayesian modeling of the space - time diffusion patterns of cholera epidemic in Kumasi, Ghana

    NARCIS (Netherlands)

    Osei, Frank B.; Osei, F.B.; Duker, Alfred A.; Stein, A.

    2011-01-01

    This study analyses the joint effects of the two transmission routes of cholera on the space-time diffusion dynamics. Statistical models are developed and presented to investigate the transmission network routes of cholera diffusion. A hierarchical Bayesian modelling approach is employed for a joint

  9. Uses and misuses of Bayes' rule and Bayesian classifiers in cybersecurity

    Science.gov (United States)

    Bard, Gregory V.

    2017-12-01

    This paper will discuss the applications of Bayes' rule and Bayesian classifiers in cybersecurity. While the most elementary form of Bayes' rule occurs in undergraduate coursework, there are more complicated forms as well. As an extended example, Bayesian spam filtering is explored; it is in many ways the most triumphant accomplishment of Bayesian reasoning in computer science, as nearly everyone with an email address has a spam folder. Bayesian classifiers have also been responsible for significant cybersecurity research results; yet, because they are not part of the standard curriculum, few in the mathematics or information-technology communities have seen the exact definitions, requirements, and proofs that comprise the subject. Moreover, numerous errors have been made by researchers (described in this paper), due to mathematical misunderstandings dealing with conditional independence, or other badly chosen assumptions. Finally, to provide instructors and researchers with real-world examples, 25 published cybersecurity papers that use Bayesian reasoning are given, with 2-4 sentence summaries of the focus and contributions of each paper.
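
    Since Bayesian spam filtering is cited above as the canonical application, here is a toy word-level naive Bayes filter over a made-up six-message corpus; real filters differ in tokenisation, smoothing and training data, so this only illustrates the underlying use of Bayes' rule:

      from collections import Counter
      import math

      # Tiny made-up corpus; real filters are trained on large mail collections.
      spam = ["win money now", "free money offer", "claim your free prize now"]
      ham = ["project meeting at noon", "please review the attached report",
             "lunch tomorrow with the project team"]

      def word_counts(messages):
          return Counter(w for m in messages for w in m.split())

      spam_counts, ham_counts = word_counts(spam), word_counts(ham)
      spam_total, ham_total = sum(spam_counts.values()), sum(ham_counts.values())
      vocab = set(spam_counts) | set(ham_counts)
      p_spam = len(spam) / (len(spam) + len(ham))

      def spam_probability(message):
          """Posterior P(spam | words) via Bayes' rule, Laplace smoothing and the
          naive conditional-independence assumption on the words."""
          log_odds = math.log(p_spam / (1.0 - p_spam))
          for w in message.split():
              p_w_spam = (spam_counts[w] + 1) / (spam_total + len(vocab))
              p_w_ham = (ham_counts[w] + 1) / (ham_total + len(vocab))
              log_odds += math.log(p_w_spam / p_w_ham)
          return 1.0 / (1.0 + math.exp(-log_odds))

      print(round(spam_probability("free prize money"), 3))           # high
      print(round(spam_probability("meeting about the report"), 3))   # low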

  10. Real-time 3D vectorcardiography: an application for didactic use

    International Nuclear Information System (INIS)

    Daniel, G; Lissa, G; Redondo, D Medina; Vasquez, L; Zapata, D

    2007-01-01

    The traditional approach to teaching the physiological basis of electrocardiography, based only on textbooks, turns out to be insufficient or confusing for students of the biomedical sciences. The addition of laboratory practice to the curriculum enables students to approach theoretical aspects from hands-on experience, resulting in a more efficient and deeper knowledge of the phenomena of interest. Here, we present the development of a PC-based application meant to facilitate the understanding of cardiac bioelectrical phenomena by visualizing the instantaneous 3D cardiac vector in real time. The system uses 8 standard leads from a 12-channel electrocardiograph. The application interface has pedagogic objectives, and facilitates the observation of cardiac depolarization and repolarization and their temporal relationship with the ECG, making it simpler to interpret

  11. Time and temporality: linguistic distribution in human life-games

    DEFF Research Database (Denmark)

    Cowley, Stephen

    2014-01-01

    While clock-time can be used to clarify facts, all living systems construct their own temporalities. Having illustrated the claim for foxtail grasses, it is argued that, with motility and brains, organisms came to use temporalities that build flexibility into behavior. With the rise of human culture, individuals developed a knack of using linguistic distribution to link metabolism with collective ways of assessing and managing experience. Of human temporal management, the best known case is the mental time travel enabled by, among other functions, autobiographical memory. One contribution...

  12. Real-time quasi-3D tomographic reconstruction

    Science.gov (United States)

    Buurlage, Jan-Willem; Kohr, Holger; Palenstijn, Willem Jan; Joost Batenburg, K.

    2018-06-01

    Developments in acquisition technology and a growing need for time-resolved experiments pose great computational challenges in tomography. In addition, access to reconstructions in real time is a highly demanded feature but has so far been out of reach. We show that by exploiting the mathematical properties of filtered backprojection-type methods, having access to real-time reconstructions of arbitrarily oriented slices becomes feasible. Furthermore, we present software for visualization and on-demand reconstruction of slices. A user can interactively shift and rotate slices in a GUI, while the software updates the slice in real time. For certain use cases, the possibility to study arbitrarily oriented slices in real time directly from the measured data provides sufficient visual and quantitative insight. Two such applications are discussed in this article.

  13. Integrated survival analysis using an event-time approach in a Bayesian framework.

    Science.gov (United States)

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
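
    The building block of such an analysis, a piece-wise constant hazard, can be sketched as follows for exact and right-censored times; the interval-censoring and unknown-fate detection components of the integrated model, as well as the priors and posterior sampling of the rates, are omitted, and all numbers are illustrative:

      import numpy as np

      # Piece-wise constant hazard: rate lambdas[j] applies on [cuts[j], cuts[j+1])
      cuts = np.array([0.0, 7.0, 14.0, 28.0])        # illustrative age intervals (days)
      lambdas = np.array([0.06, 0.03, 0.01])         # hypothetical daily hazard rates

      def cumulative_hazard(t, cuts, lambdas):
          """H(t): each interval contributes lambda_j times the time spent in it."""
          exposure = np.clip(t[:, None], cuts[:-1], cuts[1:]) - cuts[:-1]
          return exposure @ lambdas

      def log_likelihood(times, events, cuts, lambdas):
          """events = 1 for an observed death at `times`, 0 for right-censoring."""
          H = cumulative_hazard(times, cuts, lambdas)
          idx = np.clip(np.searchsorted(cuts, times, side="right") - 1, 0, len(lambdas) - 1)
          return float(np.sum(events * np.log(lambdas[idx]) - H))

      times = np.array([3.0, 9.0, 20.0, 28.0])       # made-up chick fates
      events = np.array([1, 1, 0, 0])
      print(log_likelihood(times, events, cuts, lambdas))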

  14. Integrated survival analysis using an event-time approach in a Bayesian framework

    Science.gov (United States)

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.

  15. De toekomst van Real Time Intelligence

    NARCIS (Netherlands)

    Broek, J. van den; Berg, C.H. van den

    2013-01-01

    Right from the start of the Nationale Politie, work has been under way on setting up ten real-time intelligence centres in the Netherlands. From these centres, officers on the street are actively supported 24 hours a day, seven days a week, with real-time information about the incident they are responding to. In the vision

  16. Real-Time Parameter Identification

    Data.gov (United States)

    National Aeronautics and Space Administration — Armstrong researchers have implemented in the control room a technique for estimating in real time the aerodynamic parameters that describe the stability and control...

  17. Real time process algebra with time-dependent conditions

    NARCIS (Netherlands)

    Baeten, J.C.M.; Middelburg, C.A.

    We extend the main real time version of ACP presented in [6] with conditionals in which the condition depends on time. This extension facilitates flexible dependence of process behaviour on initialization time. We show that the conditions concerned generalize the conditions introduced earlier

  18. Paleotempestological chronology developed from gas ion source AMS analysis of carbonates determined through real-time Bayesian statistical approach

    Science.gov (United States)

    Wallace, D. J.; Rosenheim, B. E.; Roberts, M. L.; Burton, J. R.; Donnelly, J. P.; Woodruff, J. D.

    2014-12-01

    Is a small quantity of high-precision ages more robust than a larger quantity of lower-precision ages for sediment core chronologies? AMS radiocarbon ages have been available to researchers for several decades now, and the precision of the technique has continued to improve. Analysis time and cost are high, though, and projects are often limited in terms of the number of dates that can be used to develop a chronology. The gas ion source at the National Ocean Sciences Accelerator Mass Spectrometry Facility (NOSAMS), while providing lower precision (uncertainty of order 100 14C y for a sample), is significantly less expensive and far less time consuming than conventional age dating and offers the unique opportunity for large numbers of ages. Here we couple two approaches, one analytical and one statistical, to investigate the utility of an age model comprised of these lower-precision ages for paleotempestology. We use a gas ion source interfaced to a gas-bench type device to generate radiocarbon dates approximately every 5 minutes, while determining the order of sample analysis using the published Bayesian accumulation histories for deposits (Bacon). During two day-long sessions, several dates were obtained from carbonate shells in living position in a sediment core composed of sapropel gel from Mangrove Lake, Bermuda. Samples were prepared where large shells were available, and the order of analysis was determined by the depth with the highest uncertainty according to Bacon. We present the results of these analyses as well as a prognosis for a future where such age models can be constructed from many dates that are quickly obtained relative to conventional radiocarbon dates. This technique is currently limited to carbonates, but development of a system for organic material dating is underway. We will demonstrate the extent to which sacrificing some analytical precision in favor of more dates improves age models.
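
    The sample-ordering step described above can be sketched as follows: given posterior age draws per depth from the accumulation model, the next shell to date is taken from the depth whose age distribution is currently widest. The ensembles and depths below are made-up placeholders for real Bacon output:

      import numpy as np

      def next_depth_to_date(age_ensembles, depths, already_dated):
          """Pick the next sample to run on the gas ion source: the depth whose
          current age-model ensemble (one row of posterior age draws per depth,
          e.g. from Bacon) has the widest 95% interval."""
          lo, hi = np.percentile(age_ensembles, [2.5, 97.5], axis=1)
          width = hi - lo
          for d in already_dated:                    # never re-select a dated depth
              width[depths.index(d)] = -np.inf
          return depths[int(np.argmax(width))]

      # Toy usage: 5 depths with 1000 made-up posterior age draws each
      rng = np.random.default_rng(5)
      depths = [10, 20, 30, 40, 50]
      ensembles = np.vstack([rng.normal(200 + 30 * i, 15 + 40 * (i == 2), size=1000)
                             for i in range(5)])
      print(next_depth_to_date(ensembles, depths, already_dated=[10]))   # -> 30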

  19. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  1. Real Time Linux - The RTOS for Astronomy?

    Science.gov (United States)

    Daly, P. N.

    The BoF was attended by about 30 participants, and a free CD of real-time Linux (based upon RedHat 5.2) was available. There was a detailed presentation on the nature of real-time Linux and the variants for hard real time: New Mexico Tech's RTL and DIAPM's RTAI. Comparison tables of standard Linux and real-time Linux responses to time-interval generation and interrupt response latency were presented (see elsewhere in these proceedings). The present recommendations are to use RTL for UP machines running the 2.0.x kernels and RTAI for SMP machines running the 2.2.x kernel. Support, both academic and commercial, is available. Some known limitations were presented and the solutions reported, e.g., debugging and hardware support. The features of RTAI (scheduler, fifos, shared memory, semaphores, message queues and RPCs) were described. Typical performance statistics were presented: Pentium-based oneshot tasks running at > 30 kHz, 486-based oneshot tasks running at ~ 10 kHz, and periodic timer tasks running in excess of 90 kHz with zero average jitter, peaking to ~ 13 μs (UP) and ~ 30 μs (SMP). Some detail on kernel module programming, including coding examples, was presented, showing a typical data acquisition system generating simulated (random) data and writing to a shared memory buffer and a fifo buffer to communicate between real-time Linux and user space. All coding examples were complete and tested under RTAI v0.6 and the 2.2.12 kernel. Finally, arguments were raised in support of real-time Linux: it is open source, free under the GPL, enables rapid prototyping, has good support, and allows a fully functioning workstation to coexist with hard real-time performance. The counterweights (the negatives) of a lack of platforms (x86 and PowerPC only at present), lack of board support, promiscuous root access and the danger of ignorance of real-time programming issues were also discussed. See ftp://orion.tuc.noao.edu/pub/pnd/rtlbof.tgz for the StarOffice overheads.

  2. Reasoning with data an introduction to traditional and Bayesian statistics using R

    CERN Document Server

    Stanton, Jeffrey M

    2017-01-01

    Engaging and accessible, this book teaches readers how to use inferential statistical thinking to check their assumptions, assess evidence about their beliefs, and avoid overinterpreting results that may look more promising than they really are. It provides step-by-step guidance for using both classical (frequentist) and Bayesian approaches to inference. Statistical techniques covered side by side from both frequentist and Bayesian approaches include hypothesis testing, replication, analysis of variance, calculation of effect sizes, regression, time series analysis, and more. Students also get a complete introduction to the open-source R programming language and its key packages. Throughout the text, simple commands in R demonstrate essential data analysis skills using real-data examples. The companion website provides annotated R code for the book's examples, in-class exercises, supplemental reading lists, and links to online videos, interactive materials, and other resources.

  3. The real-time price elasticity of electricity

    International Nuclear Information System (INIS)

    Lijesen, Mark G.

    2007-01-01

    The real-time price elasticity of electricity contains important information on the demand response of consumers to the volatility of peak prices. Despite the importance, empirical estimates of the real-time elasticity are hardly available. This paper provides a quantification of the real-time relationship between total peak demand and spot market prices. We find a low value for the real-time price elasticity, which may partly be explained by the fact that not all users observe the spot market price. If we correct for this phenomenon, we find the elasticity to be fairly low for consumers currently active in the spot market. If this conclusion applies to all users, this would imply a limited scope for government intervention in supply security issues. (Author)

  4. A Programmable Microkernel for Real-Time Systems

    Science.gov (United States)

    2003-06-01

    A Programmable Microkernel for Real-Time Systems. Christoph M. Kirsch, Thomas A. Henzinger, Marco A.A. Sanvido. Report No. UCB/CSD-3-1250, June 2003. [Report documentation page (Standard Form 298) omitted.]

  5. Temporal logic motion planning

    CSIR Research Space (South Africa)

    Seotsanyana, M

    2010-01-01

    Full Text Available In this paper, a critical review on temporal logic motion planning is presented. The review paper aims to address the following problems: (a) In a realistic situation, the motion planning problem is carried out in real-time, in a dynamic, uncertain...

  6. Real-time object-oriented programming: studies and proposals

    International Nuclear Information System (INIS)

    Fouquier, Gilles

    1996-01-01

    This thesis contributes to the introduction of real-time features into object-oriented programming. Object-oriented programming favours modularity and reusability; however, its application to real time introduces many theoretical and conceptual problems. To deal with these problems, a new real-time object-oriented programming model is presented. This model is based on the active object model, which allows concurrency and preserves encapsulation. The real-time aspect is treated by replacing the concept of task with the concept of method processing and by associating a real-time constraint (priority or deadline) with each message. The set of all running methods is scheduled. This model, called ATOME, contains several sub-models to deal with the usual concurrency control while integrating priority and deadline processing. The classical HPF (highest priority first) and EDF (earliest deadline first) scheduling policies avoid priority or deadline inversion. This model and its variants are new proposals for programming real-time applications in the object-oriented way, thereby easing reusability and code writing. The feasibility of this approach is demonstrated by extending an existing active-object-based language to real time, using the rules defined in the ATOME model. (author) [fr
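
    As a rough illustration of the scheduling idea in ATOME (each message carries a real-time constraint and the set of pending method executions is scheduled), the toy sketch below orders an active object's mailbox by message deadline, i.e., EDF. It is an interpretation for illustration only, not the ATOME language or runtime.

    import heapq
    import itertools

    class ActiveObject:
        """Toy active object: each message carries a deadline, and pending
        method invocations are served earliest-deadline-first (EDF)."""

        def __init__(self):
            self._mailbox = []               # heap ordered by (deadline, seq)
            self._seq = itertools.count()    # tie-breaker for equal deadlines

        def send(self, deadline, method, *args):
            heapq.heappush(self._mailbox, (deadline, next(self._seq), method, args))

        def run(self):
            while self._mailbox:
                deadline, _, method, args = heapq.heappop(self._mailbox)
                method(*args)                # one "method processing" at a time

    class Sensor(ActiveObject):
        def sample(self, channel):
            print("sampling channel", channel)

    s = Sensor()
    s.send(20, s.sample, 2)    # deadline 20
    s.send(5, s.sample, 1)     # deadline 5: served first under EDF
    s.run()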

  7. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water situations, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared with those of MIKE21 show the strong performance of the proposed model.
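
    One common efficiency device in explicit shallow-water solvers of this kind is adapting the time step each iteration to the CFL stability limit rather than fixing a conservative constant. The authors' adaptive method may differ; the snippet below only illustrates the generic idea, with illustrative parameter values.

    import numpy as np

    G = 9.81  # gravitational acceleration, m/s^2

    def cfl_time_step(h, u, v, dx, cfl=0.9, dt_max=10.0):
        """Largest stable explicit time step for 2-D shallow-water flow.

        h: water depths (m); u, v: velocity components (m/s); dx: cell size (m).
        The wave speed per cell is |velocity| + sqrt(g*h); the CFL condition
        limits dt to cfl * dx / (maximum wave speed over all wet cells).
        """
        wet = h > 1e-6
        if not np.any(wet):
            return dt_max
        speed = np.sqrt(u[wet] ** 2 + v[wet] ** 2) + np.sqrt(G * h[wet])
        return min(dt_max, cfl * dx / speed.max())

    h = np.array([[0.0, 0.5], [2.0, 1.0]])
    u = np.array([[0.0, 0.1], [1.5, 0.2]])
    v = np.zeros_like(u)
    print(cfl_time_step(h, u, v, dx=10.0))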

  8. Analysis of hourly crash likelihood using unbalanced panel data mixed logit model and real-time driving environmental big data.

    Science.gov (United States)

    Chen, Feng; Chen, Suren; Ma, Xiaoxiang

    2018-06-01

    Driving environment, including road surface conditions and traffic states, often changes over time and influences crash probability considerably. Traditional crash frequency models, developed at large temporal scales, are hard pressed to capture the time-varying characteristics of these factors, which may cause substantial loss of critical driving-environment information for crash prediction. Crash prediction models with refined temporal data (hourly records) are developed to characterize the time-varying nature of these contributing factors. Unbalanced panel data mixed logit models are developed to analyze hourly crash likelihood of highway segments. The refined temporal driving environmental data, including road surface and traffic conditions, obtained from the Road Weather Information System (RWIS), are incorporated into the models. Model estimation results indicate that traffic speed, traffic volume, curvature and a chemically wet road surface indicator are better modeled as random parameters. The estimation results of the mixed logit models based on unbalanced panel data show that a number of factors are related to crash likelihood on I-25. Specifically, a weekend indicator, a November indicator, a low speed limit and a long remaining service life of rutting indicator are found to increase crash likelihood, while a 5-am indicator and the number of merging ramps per lane per mile are found to decrease crash likelihood. The study underscores and confirms the unique and significant impacts on crashes imposed by real-time weather, road surface, and traffic conditions. With the unbalanced panel data structure, the rich information from real-time driving environmental big data can be well incorporated. Copyright © 2018 National Safety Council and Elsevier Ltd. All rights reserved.

  9. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
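
    For readers unfamiliar with the probabilistic-programming style the book teaches, a minimal PyMC model looks roughly as follows. This assumes a recent PyMC release (the module is named pymc3 in older versions) and uses synthetic data; it is a generic example, not one taken from the book.

    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(1)
    data = rng.normal(loc=2.0, scale=1.0, size=100)       # synthetic observations

    with pm.Model():
        mu = pm.Normal("mu", mu=0.0, sigma=10.0)          # prior on the unknown mean
        sigma = pm.HalfNormal("sigma", sigma=5.0)         # prior on the noise scale
        pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
        idata = pm.sample(1000, tune=1000, chains=2)      # draw posterior samples

    print(float(idata.posterior["mu"].mean()))            # posterior mean of mu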

  10. Real-time construction and visualisation of drift-free video mosaics from unconstrained camera motion

    Directory of Open Access Journals (Sweden)

    Mateusz Brzeszcz

    2015-08-01

    Full Text Available This work proposes a novel approach for real-time video mosaicking facilitating drift-free mosaic construction and visualisation, with integrated frame blending and redundancy management, that is shown to be flexible to a range of varying mosaic scenarios. The approach supports unconstrained camera motion with in-sequence loop closing, variation in camera focal distance (zoom) and recovery from video sequence breaks. Real-time performance, over extended duration sequences, is realised via novel aspects of frame management within the mosaic representation, thus avoiding the high data redundancy associated with temporally dense, spatially overlapping video frame inputs. This managed set of image frames is visualised in real time using a dynamic mosaic representation of overlapping textured graphics primitives in place of the traditional globally constructed, and hence frequently reconstructed, mosaic image. Within this formulation, subsequent optimisation occurring during online construction can thus efficiently adjust relative frame positions via simple primitive position transforms. Effective visualisation is similarly facilitated by online inter-frame blending to overcome the illumination and colour variance associated with modern camera hardware. The evaluation illustrates overall robustness in video mosaic construction under a diverse range of conditions including indoor and outdoor environments, varying illumination and presence of in-scene motion on varying computational platforms.
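
    The paper's managed, primitive-based mosaic representation is more involved than can be shown here, but the underlying registration step (estimating a homography between overlapping frames from matched features) can be sketched with standard OpenCV calls. The frame file names are placeholders and the feature/matching parameters are arbitrary; this is not the authors' pipeline.

    import cv2
    import numpy as np

    def register_pair(frame_a, frame_b):
        """Estimate the homography mapping frame_b onto frame_a using ORB features."""
        orb = cv2.ORB_create(2000)
        kp_a, des_a = orb.detectAndCompute(frame_a, None)
        kp_b, des_b = orb.detectAndCompute(frame_b, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_b, des_a), key=lambda m: m.distance)[:200]
        src = np.float32([kp_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        return H

    # Placeholder input frames; any two overlapping video frames would do.
    a = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    b = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    H = register_pair(a, b)
    warped = cv2.warpPerspective(b, H, (a.shape[1] * 2, a.shape[0]))  # b in a's frame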

  11. Real-time full-field characterization of transient dissipative soliton dynamics in a mode-locked laser

    Science.gov (United States)

    Ryczkowski, P.; Närhi, M.; Billet, C.; Merolla, J.-M.; Genty, G.; Dudley, J. M.

    2018-04-01

    Dissipative solitons are remarkably localized states of a physical system that arise from the dynamical balance between nonlinearity, dispersion and environmental energy exchange. They are the most universal form of soliton that can exist, and are seen in far-from-equilibrium systems in many fields, including chemistry, biology and physics. There has been particular interest in studying their properties in mode-locked lasers, but experiments have been limited by the inability to track the dynamical soliton evolution in real time. Here, we use simultaneous dispersive Fourier transform and time-lens measurements to completely characterize the spectral and temporal evolution of ultrashort dissipative solitons as their dynamics pass through a transient unstable regime with complex break-up and collisions before stabilization. Further insight is obtained from reconstruction of the soliton amplitude and phase and calculation of the corresponding complex-valued eigenvalue spectrum. These findings show how real-time measurements provide new insights into ultrafast transient dynamics in optics.

  12. Mechatronic modeling of real-time wheel-rail contact

    CERN Document Server

    Bosso, Nicola; Gugliotta, Antonio; Somà, Aurelio

    2013-01-01

    Real-time simulations of the behaviour of a rail vehicle require realistic solutions of the wheel-rail contact problem which can work in a real-time mode. Examples of such solutions for the online mode have been well known and are implemented within standard and commercial tools for the simulation codes for rail vehicle dynamics. This book is the result of the research activities carried out by the Railway Technology Lab of the Department of Mechanical and Aerospace Engineering at Politecnico di Torino. This book presents work on the project for the development of a real-time wheel-rail contact model and provides the simulation results obtained with dSpace real-time hardware. Besides this, the implementation of the contact model for the development of a real-time model for the complex mechatronic system of a scaled test rig is presented in this book and may be useful for the further validation of the real-time contact model with experiments on a full scale test rig.

  13. Internet-accessible real-time weather information system

    Digital Repository Service at National Institute of Oceanography (India)

    Desai, R.G.P.; Joseph, A.; Desa, E.; Mehra, P.; Desa, E.; Gouveia, A.D.

    An internet-accessible real-time weather information system has been developed. This system provides real-time accessibility to weather information from a multitude of spatially distributed weather stations. The Internet connectivity also offers...

  14. Automated real-time software development

    Science.gov (United States)

    Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.

    1993-01-01

    A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.

  15. Bayesian Group Bridge for Bi-level Variable Selection.

    Science.gov (United States)

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  16. Geomagnetic Observatory Data for Real-Time Applications

    Science.gov (United States)

    Love, J. J.; Finn, C. A.; Rigler, E. J.; Kelbert, A.; Bedrosian, P.

    2015-12-01

    The global network of magnetic observatories represents a unique collective asset for the scientific community. Historically, magnetic observatories have supported global magnetic-field mapping projects and fundamental research of the Earth's interior and surrounding space environment. More recently, real-time data streams from magnetic observatories have become an important contributor to multi-sensor, operational monitoring of evolving space weather conditions, especially during magnetic storms. In this context, the U.S. Geological Survey (1) provides real-time observatory data to allied space weather monitoring projects, including those of NOAA, the U.S. Air Force, NASA, several international agencies, and private industry, (2) collaborates with Schlumberger to provide real-time geomagnetic data needed for directional drilling for oil and gas in Alaska, (3) develops products for real-time evaluation of hazards for the electric-power grid industry that are associated with the storm-time induction of geoelectric fields in the Earth's conducting lithosphere. In order to implement strategic priorities established by the USGS Natural Hazards Mission Area and the National Science and Technology Council, and with a focus on developing new real-time products, the USGS is (1) leveraging data management protocols already developed by the USGS Earthquake Program, (2) developing algorithms for mapping geomagnetic activity, a collaboration with NASA and NOAA, (3) supporting magnetotelluric surveys and developing Earth conductivity models, a collaboration with Oregon State University and the NSF's EarthScope Program, (4) studying the use of geomagnetic activity maps and Earth conductivity models for real-time estimation of geoelectric fields, (5) initiating geoelectric monitoring at several observatories, (6) validating real-time estimation algorithms against historical geomagnetic and geoelectric data. The success of these long-term projects is subject to funding constraints

  17. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity?Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Science's Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  18. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  19. Verifying real-time systems against scenario-based requirements

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Li, Shuhao; Nielsen, Brian

    2009-01-01

    We propose an approach to automatic verification of real-time systems against scenario-based requirements. A real-time system is modeled as a network of Timed Automata (TA), and a scenario-based requirement is specified as a Live Sequence Chart (LSC). We define a trace-based semantics for a kernel subset of the LSC language. By equivalently translating an LSC chart into an observer TA and then non-intrusively composing this observer with the original system model, the problem of verifying a real-time system against a scenario-based requirement reduces to a classical real-time model checking...

  20. Real-time UNIX in HEP data acquisition

    International Nuclear Information System (INIS)

    Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P.Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Aguer, M.; Huet, M.

    1994-01-01

    Today's experimentation in high energy physics is characterized by an increasing need for sensitivity to rare phenomena and complex physics signatures, which require the use of huge and sophisticated detectors and consequently a high performance readout and data acquisition. Multi-level triggering, hierarchical data collection and an always increasing amount of processing power, distributed throughout the data acquisition layers, will impose a number of features on the software environment, especially the need for a high level of standardization. Real-time UNIX seems, today, the best solution for the platform independence, operating system interface standards and real-time features necessary for data acquisition in HEP experiments. We present the results of the evaluation, in a realistic application environment, of a Real-Time UNIX operating system: the EP/LX real-time UNIX system. ((orig.))

  1. ClockWork: a Real-Time Feasibility Analysis Tool

    NARCIS (Netherlands)

    Jansen, P.G.; Hanssen, F.T.Y.; Mullender, Sape J.

    ClockWork shows that we can improve the flexibility and efficiency of real-time kernels. We do this by proposing methods for scheduling based on so-called Real-Time Transactions. ClockWork uses Real-Time Transactions which allow scheduling decisions to be taken by the system. A programmer does not

  2. Failure analysis of real-time systems

    International Nuclear Information System (INIS)

    Jalashgar, A.; Stoelen, K.

    1998-01-01

    This paper highlights essential aspects of real-time software systems that are strongly related to failures and their course of propagation. The significant influence of means-oriented and goal-oriented system views on the description, understanding and analysis of those aspects is elaborated. The importance of performing failure analysis prior to reliability analysis of real-time systems is equally addressed. Problems of software reliability growth models that take the properties of such systems into account are discussed. Finally, the paper presents a preliminary study of a goal-oriented approach to model the static and dynamic characteristics of real-time systems, so that the corresponding analysis can be based on a more descriptive and informative picture of failures, their effects and the possibility of their occurrence. (author)

  3. Can Real-Time Data Also Be Climate Quality?

    Science.gov (United States)

    Brewer, M.; Wentz, F. J.

    2015-12-01

    GMI, AMSR-2 and WindSat herald a new era of highly accurate and timely microwave data products. Traditionally, there has been a large divide between real-time and re-analysis data products. What if these completely separate processing systems could be merged? Through advanced modeling and physically based algorithms, Remote Sensing Systems (RSS) has narrowed the gap between real-time and research quality. Satellite microwave ocean products have proven useful for a wide array of timely Earth science applications. Through-cloud SST capabilities have enormously benefited tropical cyclone forecasting and day-to-day fisheries management, to name a few. Oceanic wind vectors enhance the operational safety of shipping and recreational boating. Atmospheric rivers are of import to many human endeavors, as are cloud cover and knowledge of precipitation events. Some activities benefit from both climate and real-time operational data used in conjunction. RSS has been consistently improving microwave Earth Science Data Records (ESDRs) for several decades, while making near real-time data publicly available for semi-operational use. These data streams have often been produced in two stages: near real-time files, followed by research-quality final files. Over the years, we have seen this time delay shrink from months or weeks to mere hours. As well, we have seen the quality of near real-time data improve to the point where the distinction starts to blur. We continue to work towards better and faster RFI filtering, adaptive algorithms and improved real-time validation statistics for earlier detection of problems. Is it possible to produce climate-quality data in real time, and what would the advantages be? We will try to answer these questions…

  4. Academic Training: Real Time Process Control - Lecture series

    CERN Multimedia

    Françoise Benz

    2004-01-01

    ACADEMIC TRAINING LECTURE REGULAR PROGRAMME. 7, 8 and 9 June, from 11:00 to 12:00, Main Auditorium, bldg. 500. Real Time Process Control, T. Riesco / CERN-TS. What exactly is meant by real-time? There are several definitions of real-time, most of them contradictory. Unfortunately the topic is controversial, and there does not seem to be 100% agreement over the terminology. Real-time applications are becoming increasingly important in our daily lives and can be found in diverse environments such as the automatic braking system on an automobile, a lottery ticket system, or robotic environmental samplers on a space station. These lectures will introduce basic concepts and theory: timing constraints, task scheduling, periodic server mechanisms, and hard and soft real-time.

  5. Detection of Histoplasma capsulatum from clinical specimens by cycling probe-based real-time PCR and nested real-time PCR.

    Science.gov (United States)

    Muraosa, Yasunori; Toyotome, Takahito; Yahiro, Maki; Watanabe, Akira; Shikanai-Yasuda, Maria Aparecida; Kamei, Katsuhiko

    2016-05-01

    We developed new cycling probe-based real-time PCR and nested real-time PCR assays for the detection of Histoplasma capsulatum that were designed to detect the gene encoding N-acetylated α-linked acidic dipeptidase (NAALADase), which we previously identified as an H. capsulatum antigen reacting with sera from patients with histoplasmosis. Both assays specifically detected the DNAs of all H. capsulatum strains but not those of other fungi or human DNA. The limit of detection (LOD) of the real-time PCR assay was 10 DNA copies when using 10-fold serial dilutions of the standard plasmid DNA and 50 DNA copies when using human serum spiked with standard plasmid DNA. The nested real-time PCR improved the LOD to 5 DNA copies when using human serum spiked with standard plasmid DNA, which represents a 10-fold improvement over the real-time PCR assay. To assess the ability of the two assays to diagnose histoplasmosis, we analyzed a small number of clinical specimens collected from five patients with histoplasmosis, namely sera (n = 4), formalin-fixed paraffin-embedded (FFPE) tissue (n = 4), and bronchoalveolar lavage fluid (BALF) (n = 1). Although the clinical sensitivity of the real-time PCR assay was insufficient (33%), the nested real-time PCR assay increased the clinical sensitivity (77%), suggesting that it has the potential to be a useful method for detecting H. capsulatum DNA in clinical specimens. © The Author 2015. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Reviewing real-time performance of nuclear reactor safety systems

    International Nuclear Information System (INIS)

    Preckshot, G.G.

    1993-08-01

    The purpose of this paper is to recommend regulatory guidance for reviewers examining real-time performance of computer-based safety systems used in nuclear power plants. Three areas of guidance are covered in this report. The first area covers how to determine if, when, and what prototypes should be required of developers to make a convincing demonstration that specific problems have been solved or that performance goals have been met. The second area gives recommendations for timing analyses that will prove that the real-time system will meet its safety-imposed deadlines. The third area describes means for assessing expected or actual real-time performance before, during, and after development is completed. To ensure that the delivered real-time software product meets performance goals, the paper recommends certain types of code-execution and communications scheduling. Technical background is provided in the appendix on methods of timing analysis, scheduling real-time computations, prototyping, real-time software development approaches, modeling and measurement, and real-time operating systems.
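
    A concrete example of the kind of timing analysis the report calls for is a utilization-based schedulability check for periodic tasks. The sketch below applies the classical Liu and Layland bound for rate-monotonic scheduling and the utilization test for EDF; it is a textbook illustration, not material from the report, and the task set is invented.

    def rm_schedulable(tasks):
        """Sufficient test for rate-monotonic scheduling.

        tasks: list of (worst_case_execution_time, period) pairs.
        Liu & Layland: the task set is schedulable if total utilization
        does not exceed n * (2**(1/n) - 1).
        """
        n = len(tasks)
        u = sum(c / t for c, t in tasks)
        return u <= n * (2 ** (1.0 / n) - 1)

    def edf_schedulable(tasks):
        """Exact test for EDF with deadlines equal to periods: utilization <= 1."""
        return sum(c / t for c, t in tasks) <= 1.0

    tasks = [(1, 4), (2, 6), (1, 10)]   # (execution time, period) in ms
    print(rm_schedulable(tasks), edf_schedulable(tasks))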

  7. Reviewing real-time performance of nuclear reactor safety systems

    Energy Technology Data Exchange (ETDEWEB)

    Preckshot, G.G. [Lawrence Livermore National Lab., CA (United States)

    1993-08-01

    The purpose of this paper is to recommend regulatory guidance for reviewers examining real-time performance of computer-based safety systems used in nuclear power plants. Three areas of guidance are covered in this report. The first area covers how to determine if, when, and what prototypes should be required of developers to make a convincing demonstration that specific problems have been solved or that performance goals have been met. The second area gives recommendations for timing analyses that will prove that the real-time system will meet its safety-imposed deadlines. The third area describes means for assessing expected or actual real-time performance before, during, and after development is completed. To ensure that the delivered real-time software product meets performance goals, the paper recommends certain types of code-execution and communications scheduling. Technical background is provided in the appendix on methods of timing analysis, scheduling real-time computations, prototyping, real-time software development approaches, modeling and measurement, and real-time operating systems.

  8. Connecting real-time data to algorithms and databases: EarthCube's Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS)

    Science.gov (United States)

    Daniels, M. D.; Graves, S. J.; Kerkez, B.; Chandrasekar, V.; Vernon, F.; Martin, C. L.; Maskey, M.; Keiser, K.; Dye, M. J.

    2015-12-01

    The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) project was funded under the National Science Foundation's EarthCube initiative. CHORDS addresses the ever-increasing importance of real-time scientific data in the geosciences, particularly in mission-critical scenarios where informed decisions must be made rapidly. Access to constant streams of real-time data also allows many new transient phenomena in space-time to be observed; however, many of these streaming data are either completely inaccessible or available only to proprietary in-house tools or displays. Small research teams do not have the resources to develop tools for the broad dissemination of their unique real-time data and require an easy-to-use, scalable, cloud-based solution to facilitate this access. CHORDS will make these diverse streams of real-time data available to the broader geosciences community. This talk will highlight recently developed CHORDS portal tools and processing systems which address some of the gaps in handling real-time data, particularly in the provisioning of data from the "long-tail" scientific community, through a simple interface that is deployed in the cloud, is scalable and can be customized by research teams. A running portal, with operational data feeds from across the nation, will be presented. The processing within the CHORDS system will expose these real-time streams via standard services from the Open Geospatial Consortium (OGC) in a way that is simple and transparent to the data provider, while maximizing the usage of these investments. The ingestion of high-velocity, high-volume and diverse data has allowed the project to explore a NoSQL database implementation. Broad use of the CHORDS framework by geoscientists will help to facilitate adaptive experimentation, model assimilation and real-time hypothesis testing.

  9. Virtual timers in hierarchical real-time systems

    NARCIS (Netherlands)

    Heuvel, van den M.M.H.P.; Holenderski, M.J.; Cools, W.A.; Bril, R.J.; Lukkien, J.J.; Zhu, D.

    2009-01-01

    Hierarchical scheduling frameworks (HSFs) provide means for composing complex real-time systems from welldefined subsystems. This paper describes an approach to provide hierarchically scheduled real-time applications with virtual event timers, motivated by the need for integrating priority

  10. Temporal and spectral manipulations of correlated photons using a time-lens

    OpenAIRE

    Mittal, Sunil; Orre, Venkata Vikram; Restelli, Alessandro; Salem, Reza; Goldschmidt, Elizabeth A.; Hafezi, Mohammad

    2017-01-01

    A common challenge in quantum information processing with photons is the limited ability to manipulate and measure correlated states. An example is the inability to measure picosecond scale temporal correlations of a multi-photon state, given state-of-the-art detectors have a temporal resolution of about 100 ps. Here, we demonstrate temporal magnification of time-bin entangled two-photon states using a time-lens, and measure their temporal correlation function which is otherwise not accessibl...

  11. Bayesian analysis of magnetic island dynamics

    International Nuclear Information System (INIS)

    Preuss, R.; Maraschek, M.; Zohm, H.; Dose, V.

    2003-01-01

    We examine a first-order differential equation with respect to time used to describe magnetic islands in magnetically confined plasmas. The free parameters of this equation are obtained by employing Bayesian probability theory. Additionally, a typical Bayesian change-point problem is solved in the process of obtaining the data
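
    A change-point problem of the generic kind mentioned here can be treated with a discrete Bayesian posterior over the change time. The sketch below uses synthetic data, a Gaussian likelihood with known means and noise, and a uniform prior; it illustrates the technique only and is not the authors' analysis of the magnetic island data.

    import numpy as np

    def changepoint_posterior(y, mu1, mu2, sigma):
        """Posterior over the index at which the mean switches from mu1 to mu2."""
        n = len(y)
        log_post = np.empty(n - 1)
        for k in range(1, n):                      # change occurs after sample k
            ll = (-0.5 * np.sum((y[:k] - mu1) ** 2) / sigma ** 2
                  - 0.5 * np.sum((y[k:] - mu2) ** 2) / sigma ** 2)
            log_post[k - 1] = ll                   # uniform prior over k
        log_post -= log_post.max()
        post = np.exp(log_post)
        return post / post.sum()

    rng = np.random.default_rng(2)
    y = np.concatenate([rng.normal(0.0, 1.0, 40), rng.normal(1.5, 1.0, 60)])
    post = changepoint_posterior(y, mu1=0.0, mu2=1.5, sigma=1.0)
    print("most probable change index:", np.argmax(post) + 1)   # close to 40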

  12. Real-time video compressing under DSP/BIOS

    Science.gov (United States)

    Chen, Qiu-ping; Li, Gui-ju

    2009-10-01

    This paper presents real-time MPEG-4 Simple Profile video compression based on a DSP processor. The video compression framework is constructed using a TMS320C6416 microprocessor, a TDS510 simulator and a PC. It uses the embedded real-time operating system DSP/BIOS and its API functions to build periodic functions, tasks and interrupts, realizing real-time video compression. To address data transfer within the system, the design exploits the C64x DSP architecture, using switched double buffering and the EDMA data transfer controller to move data from external to internal memory so that data transfer and processing proceed concurrently; architecture-level optimizations are used to improve software pipelining. The system uses DSP/BIOS for multi-thread scheduling and achieves high-speed transfer of large amounts of data. Experimental results show the encoder achieves real-time encoding of 768x576, 25 frame/s video images.
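
    The switched double-buffer (ping-pong) pattern described above, filling one buffer while the previous one is being processed, is independent of the C6416 hardware. The following schematic uses a background thread in place of the EDMA controller and random data in place of video frames; it is an illustration of the pattern, not the authors' code.

    from concurrent.futures import ThreadPoolExecutor
    import numpy as np

    def acquire_block(i):
        """Stand-in for an EDMA transfer: produce the next block of input data."""
        rng = np.random.default_rng(i)
        return rng.integers(0, 255, size=4096, dtype=np.uint8)

    def process(block):
        """Stand-in for the encoder working on one block."""
        return int(block.sum())

    results = []
    with ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(acquire_block, 0)        # start filling buffer A
        for i in range(1, 6):
            current = pending.result()                 # wait for the filled buffer
            pending = pool.submit(acquire_block, i)    # fill the other buffer...
            results.append(process(current))           # ...while processing this one
        results.append(process(pending.result()))      # last buffer

    print(results)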

  13. A class of kernel based real-time elastography algorithms.

    Science.gov (United States)

    Kibria, Md Golam; Hasan, Md Kamrul

    2015-08-01

    In this paper, a novel real-time kernel-based and gradient-based Phase Root Seeking (PRS) algorithm for ultrasound elastography is proposed. The signal-to-noise ratio of the strain image resulting from this method is improved by minimizing the cross-correlation discrepancy between the pre- and post-compression radio frequency signals with an adaptive temporal stretching method and employing built-in smoothing through an exponentially weighted neighborhood kernel in the displacement calculation. Unlike conventional PRS algorithms, the displacement due to tissue compression is estimated from the root of the weighted average of the zero-lag cross-correlation phases of the pairs of corresponding analytic pre- and post-compression windows in the neighborhood kernel. In addition to the proposed one, the other time- and frequency-domain elastography algorithms (Ara et al., 2013; Hussain et al., 2012; Hasan et al., 2012) proposed by our group are also implemented in real time using Java, where the computations are executed serially or in parallel on multiple processors with efficient memory management. Simulation results using a finite element modeling simulation phantom show that the proposed method significantly improves strain image quality in terms of elastographic signal-to-noise ratio (SNRe), elastographic contrast-to-noise ratio (CNRe) and mean structural similarity (MSSIM) for strains as high as 4% compared to other reported techniques in the literature. Strain images obtained for the experimental phantom as well as in vivo breast data of malignant or benign masses also show the efficacy of our proposed method over the other reported techniques in the literature. Copyright © 2015 Elsevier B.V. All rights reserved.
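
    The core displacement estimator described above, taking the phase of the zero-lag complex cross-correlation of corresponding analytic pre- and post-compression windows and converting it to a delay, can be sketched as follows. The kernel weighting, adaptive stretching and strain calculation of the full algorithm are omitted, and the parameter values and test signal are illustrative only.

    import numpy as np
    from scipy.signal import hilbert

    def window_displacement(pre_win, post_win, f0, fs, c=1540.0):
        """Displacement (m) of one window from the zero-lag cross-correlation phase.

        pre_win, post_win: matched RF windows before/after compression.
        f0: transducer center frequency (Hz); fs: sampling rate (Hz);
        c: speed of sound (m/s). Pulse-echo: delay tau maps to displacement c*tau/2.
        """
        a_pre = hilbert(pre_win)
        a_post = hilbert(post_win)
        phase = np.angle(np.sum(a_pre * np.conj(a_post)))   # zero-lag phase
        tau = phase / (2.0 * np.pi * f0)                    # time delay (s)
        return c * tau / 2.0

    fs, f0 = 40e6, 5e6
    t = np.arange(256) / fs
    pre = np.cos(2 * np.pi * f0 * t) * np.hanning(t.size)
    post = np.roll(pre, 3)                                  # post signal delayed by 3 samples
    print(window_displacement(pre, post, f0, fs))           # ~ c*3/(2*fs) = 5.8e-5 m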

  14. Real-time prediction of respiratory motion based on local regression methods

    International Nuclear Information System (INIS)

    Ruan, D; Fessler, J A; Balter, J M

    2007-01-01

    Recent developments in modulation techniques enable conformal delivery of radiation doses to small, localized target volumes. One of the challenges in using these techniques is real-time tracking and predicting target motion, which is necessary to accommodate system latencies. For image-guided-radiotherapy systems, it is also desirable to minimize sampling rates to reduce imaging dose. This study focuses on predicting respiratory motion, which can significantly affect lung tumours. Predicting respiratory motion in real time is challenging, due to the complexity of breathing patterns and the many sources of variability. We propose a prediction method based on local regression. There are three major ingredients of this approach: (1) forming an augmented state space to capture system dynamics, (2) local regression in the augmented space to train the predictor from previous observation data using the semi-periodicity of respiratory motion, (3) local weighting adjustment to incorporate fading temporal correlations. To evaluate prediction accuracy, we computed the root mean square error between predicted tumor motion and its observed location for ten patients. For comparison, we also applied commonly used predictive methods, namely linear prediction, neural networks and Kalman filtering, to the same data. The proposed method reduced the prediction error for all imaging rates and latency lengths, particularly for long prediction lengths.
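
    The three ingredients listed above can be mocked up compactly: delay-embed the observed trace into an augmented state, find similar past states, and predict from a weighted average of what followed them, with weights that fade for older history. The sketch below is a simplified stand-in, not the authors' estimator; the embedding length, horizon, neighbour count and fading factor are arbitrary choices.

    import numpy as np

    def predict_local(signal, embed_dim=5, horizon=8, k=10, fade=0.99):
        """Predict signal[t + horizon] from the last embed_dim samples.

        Builds an augmented state by delay embedding, finds the k most similar
        past states, and averages their futures with weights that decay for
        older (less relevant) history.
        """
        x = np.asarray(signal, dtype=float)
        n = len(x)
        query = x[n - embed_dim:]                         # current augmented state
        states, futures, ages = [], [], []
        for t in range(embed_dim, n - horizon):
            states.append(x[t - embed_dim:t])
            futures.append(x[t + horizon - 1])
            ages.append(n - t)                            # how old this example is
        states, futures, ages = map(np.array, (states, futures, ages))
        dist = np.linalg.norm(states - query, axis=1)
        idx = np.argsort(dist)[:k]                        # k nearest past states
        w = np.exp(-dist[idx]) * fade ** ages[idx]        # similarity x fading memory
        return float(np.sum(w * futures[idx]) / np.sum(w))

    t = np.arange(0, 60, 0.2)
    breath = np.sin(2 * np.pi * t / 4.0) + 0.05 * np.random.default_rng(3).normal(size=t.size)
    print(predict_local(breath))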

  15. Real-time ISEE data system

    Science.gov (United States)

    Tsurutani, B. T.; Baker, D. N.

    1979-01-01

    A real-time ISEE data system directed toward predicting geomagnetic substorms and storms is discussed. Such a system may allow up to 60+ minutes advance warning of magnetospheric substorms and up to 30 minute warnings of geomagnetic storms (and other disturbances) induced by high-speed streams and solar flares. The proposed system utilizes existing capabilities of several agencies (NASA, NOAA, USAF), and thereby minimizes costs. This same concept may be applicable to data from other spacecraft, and other NASA centers; thus, each individual experimenter can receive quick-look data in real time at his or her base institution.

  16. Real-Time Speciation of Uranium During Active Bioremediation and U(IV) Reoxidation

    International Nuclear Information System (INIS)

    Komlos, J.; Mishra, B.; Lanzirotti, A.; Myneni, S.; Jaffe, P.

    2008-01-01

    The biological reduction of uranium from soluble U(VI) to insoluble U(IV) has shown potential to prevent uranium migration in groundwater. To gain insight into the extent of uranium reduction that can occur during biostimulation and to what degree U(IV) reoxidation will occur under field relevant conditions after biostimulation is terminated, X-ray absorption near edge structure (XANES) spectroscopy was used to monitor: (1) uranium speciation in situ in a flowing column while active reduction was occurring; and (2) in situ postbiostimulation uranium stability and speciation when exposed to incoming oxic water. Results show that after 70 days of bioreduction in a high (30 mM) bicarbonate solution, the majority (>90%) of the uranium in the column was immobilized as U(IV). After acetate addition was terminated and oxic water entered the column, in situ real-time XANES analysis showed that U(IV) reoxidation to U(VI) (and subsequent remobilization) occurred rapidly (on the order of minutes) within the reach of the oxygen front and the spatial and temporal XANES spectra captured during reoxidation allowed for real-time uranium reoxidation rates to be calculated.

  17. Data assimilation of citizen collected information for real-time flood hazard mapping

    Science.gov (United States)

    Sayama, T.; Takara, K. T.

    2017-12-01

    Many studies of data assimilation in hydrology have focused on the integration of satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood prediction as well, recent studies have demonstrated the assimilation of remotely sensed inundation information into flood inundation models. In actual flood disaster situations, citizen-collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to effectively use such citizen-collected information for real-time flood hazard mapping. Here we propose a new data assimilation technique based on pre-computed ensemble inundation simulations that updates inundation depth distributions sequentially as local data become available. The proposed method consists of two steps. The first step is a weighted average of the preliminary ensemble simulations, whose weights are updated with a Bayesian approach. The second step is an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: a more idealized one, which assumes that continuous flood inundation depth information is available at multiple locations, and a more realistic one for such a severe flood disaster, which assumes that only uncertain and non-continuous information is available for assimilation. The results show that, in the idealized situation, the large-scale inundation during the flooding was estimated reasonably well in terms of RMSE. The applications of the proposed data assimilation method demonstrate its high potential for assimilating citizen-collected information for real-time flood hazard mapping in the future.
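
    The two-step scheme described above can be illustrated with a toy ensemble: Bayesian reweighting of pre-computed inundation simulations when a point observation arrives, followed by an optimal-interpolation style update using the ensemble covariance. The grid size, error variances and variable names below are illustrative, not the paper's configuration.

    import numpy as np

    rng = np.random.default_rng(4)
    n_ens, n_cells = 50, 100
    ens = rng.gamma(2.0, 0.5, size=(n_ens, n_cells))      # pre-run inundation depths (m)
    obs_cell, obs_depth, obs_var = 42, 1.2, 0.05 ** 2     # one citizen report

    # Step 1: Bayesian update of ensemble weights from the local report.
    log_w = -0.5 * (ens[:, obs_cell] - obs_depth) ** 2 / obs_var
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    weighted_mean = w @ ens                               # reweighted depth map

    # Step 2: optimal interpolation with the ensemble covariance.
    mean = ens.mean(axis=0)
    anom = ens - mean
    cov_xy = anom.T @ anom[:, obs_cell] / (n_ens - 1)     # cov(all cells, obs cell)
    var_y = anom[:, obs_cell].var(ddof=1)
    gain = cov_xy / (var_y + obs_var)                     # Kalman-style gain vector
    oi_analysis = mean + gain * (obs_depth - mean[obs_cell])

    print(weighted_mean[obs_cell], oi_analysis[obs_cell])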

  18. A real-time brain-machine interface combining motor target and trajectory intent using an optimal feedback control design.

    Directory of Open Access Journals (Sweden)

    Maryam M Shanechi

    Full Text Available Real-time brain-machine interfaces (BMI have focused on either estimating the continuous movement trajectory or target intent. However, natural movement often incorporates both. Additionally, BMIs can be modeled as a feedback control system in which the subject modulates the neural activity to move the prosthetic device towards a desired target while receiving real-time sensory feedback of the state of the movement. We develop a novel real-time BMI using an optimal feedback control design that jointly estimates the movement target and trajectory of monkeys in two stages. First, the target is decoded from neural spiking activity before movement initiation. Second, the trajectory is decoded by combining the decoded target with the peri-movement spiking activity using an optimal feedback control design. This design exploits a recursive Bayesian decoder that uses an optimal feedback control model of the sensorimotor system to take into account the intended target location and the sensory feedback in its trajectory estimation from spiking activity. The real-time BMI processes the spiking activity directly using point process modeling. We implement the BMI in experiments consisting of an instructed-delay center-out task in which monkeys are presented with a target location on the screen during a delay period and then have to move a cursor to it without touching the incorrect targets. We show that the two-stage BMI performs more accurately than either stage alone. Correct target prediction can compensate for inaccurate trajectory estimation and vice versa. The optimal feedback control design also results in trajectories that are smoother and have lower estimation error. The two-stage decoder also performs better than linear regression approaches in offline cross-validation analyses. Our results demonstrate the advantage of a BMI design that jointly estimates the target and trajectory of movement and more closely mimics the sensorimotor control system.
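
    A much-simplified cartoon of the two-stage idea can be written with synthetic data: stage one decodes the target from pre-movement spike counts with a Poisson likelihood, and stage two estimates the trajectory while blending a noisy velocity signal with a pull toward the decoded target, a crude stand-in for the feedback-control prior. This is not the authors' point-process decoder; all tuning curves and parameters are invented.

    import numpy as np

    rng = np.random.default_rng(5)
    targets = np.array([[1, 0], [0, 1], [-1, 0], [0, -1]], dtype=float)  # 4 screen targets
    n_units = 30
    tuning = rng.uniform(2.0, 10.0, size=(n_units, len(targets)))  # mean delay-period counts

    # Stage 1: MAP target from pre-movement spike counts (Poisson likelihood, flat prior).
    true_target = 2
    counts = rng.poisson(tuning[:, true_target])
    log_like = (counts[:, None] * np.log(tuning) - tuning).sum(axis=0)
    decoded = targets[np.argmax(log_like)]

    # Stage 2: trajectory estimate blending a noisy "decoded velocity" with a pull
    # toward the decoded target (a crude stand-in for the feedback-control prior).
    dt, alpha, pos = 0.05, 0.25, np.zeros(2)
    for _ in range(40):
        noisy_vel = (targets[true_target] - pos) + rng.normal(0.0, 0.4, size=2)
        vel = (1.0 - alpha) * noisy_vel + alpha * (decoded - pos)
        pos = pos + dt * vel

    print("decoded target:", decoded, "final position:", np.round(pos, 2))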

  19. Introduction to applied Bayesian statistics and estimation for social scientists

    CERN Document Server

    Lynch, Scott M

    2007-01-01

    ""Introduction to Applied Bayesian Statistics and Estimation for Social Scientists"" covers the complete process of Bayesian statistical analysis in great detail from the development of a model through the process of making statistical inference. The key feature of this book is that it covers models that are most commonly used in social science research - including the linear regression model, generalized linear models, hierarchical models, and multivariate regression models - and it thoroughly develops each real-data example in painstaking detail.The first part of the book provides a detailed

  20. Nonlinear Bayesian filtering and learning: a neuronal dynamics for perception.

    Science.gov (United States)

    Kutschireiter, Anna; Surace, Simone Carlo; Sprekeler, Henning; Pfister, Jean-Pascal

    2017-08-18

    The robust estimation of dynamical hidden features, such as the position of prey, based on sensory inputs is one of the hallmarks of perception. This dynamical estimation can be rigorously formulated by nonlinear Bayesian filtering theory. Recent experimental and behavioral studies have shown that animals' performance in many tasks is consistent with such a Bayesian statistical interpretation. However, it is presently unclear how a nonlinear Bayesian filter can be efficiently implemented in a network of neurons that satisfies some minimum constraints of biological plausibility. Here, we propose the Neural Particle Filter (NPF), a sampling-based nonlinear Bayesian filter, which does not rely on importance weights. We show that this filter can be interpreted as the neuronal dynamics of a recurrently connected rate-based neural network receiving feed-forward input from sensory neurons. Further, it captures properties of temporal and multi-sensory integration that are crucial for perception, and it allows for online parameter learning with a maximum likelihood approach. The NPF holds the promise to avoid the 'curse of dimensionality', and we demonstrate numerically its capability to outperform weighted particle filters in higher dimensions and when the number of particles is limited.
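
    The defining feature of the filter described above is that particles are corrected by a feedback term driven by the prediction error rather than by importance weights. The sketch below captures that flavour in a one-dimensional tracking problem using an ensemble-derived (EnKF-style) gain as a stand-in for the filter's gain; it is an illustration of the unweighted, feedback-driven idea, not the Neural Particle Filter equations themselves.

    import numpy as np

    rng = np.random.default_rng(6)
    n_steps, n_particles = 200, 100
    proc_std, obs_std = 0.1, 0.5

    # Hidden state: a slow random walk observed through a static nonlinearity.
    x_true = np.cumsum(proc_std * rng.normal(size=n_steps))
    y_obs = np.tanh(x_true) + obs_std * rng.normal(size=n_steps)

    particles = rng.normal(0.0, 1.0, size=n_particles)
    estimates = []
    for y in y_obs:
        # Prediction: every particle follows the prior dynamics (no weights anywhere).
        particles = particles + proc_std * rng.normal(size=n_particles)
        # Correction: feedback proportional to each particle's own prediction error,
        # with a gain computed from the particle ensemble (EnKF-style stand-in).
        h = np.tanh(particles)
        gain = np.cov(particles, h)[0, 1] / (h.var() + obs_std ** 2)
        particles = particles + gain * (y - h)
        estimates.append(particles.mean())

    print("RMSE:", np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))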